Dec 05 11:03:41 localhost kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec 05 11:03:41 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 05 11:03:41 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 05 11:03:41 localhost kernel: BIOS-provided physical RAM map:
Dec 05 11:03:41 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 05 11:03:41 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 05 11:03:41 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 05 11:03:41 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 05 11:03:41 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 05 11:03:41 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 05 11:03:41 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 05 11:03:41 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 05 11:03:41 localhost kernel: NX (Execute Disable) protection: active
Dec 05 11:03:41 localhost kernel: APIC: Static calls initialized
Dec 05 11:03:41 localhost kernel: SMBIOS 2.8 present.
Dec 05 11:03:41 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 05 11:03:41 localhost kernel: Hypervisor detected: KVM
Dec 05 11:03:41 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 05 11:03:41 localhost kernel: kvm-clock: using sched offset of 3156555422 cycles
Dec 05 11:03:41 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 05 11:03:41 localhost kernel: tsc: Detected 2800.000 MHz processor
Dec 05 11:03:41 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 05 11:03:41 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 05 11:03:41 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 05 11:03:41 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 05 11:03:41 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 05 11:03:41 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 05 11:03:41 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 05 11:03:41 localhost kernel: Using GB pages for direct mapping
Dec 05 11:03:41 localhost kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec 05 11:03:41 localhost kernel: ACPI: Early table checksum verification disabled
Dec 05 11:03:41 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 05 11:03:41 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 11:03:41 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 11:03:41 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 11:03:41 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 05 11:03:41 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 11:03:41 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 11:03:41 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 05 11:03:41 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 05 11:03:41 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 05 11:03:41 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 05 11:03:41 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 05 11:03:41 localhost kernel: No NUMA configuration found
Dec 05 11:03:41 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 05 11:03:41 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec 05 11:03:41 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 05 11:03:41 localhost kernel: Zone ranges:
Dec 05 11:03:41 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 05 11:03:41 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 05 11:03:41 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 05 11:03:41 localhost kernel:   Device   empty
Dec 05 11:03:41 localhost kernel: Movable zone start for each node
Dec 05 11:03:41 localhost kernel: Early memory node ranges
Dec 05 11:03:41 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 05 11:03:41 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 05 11:03:41 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 05 11:03:41 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 05 11:03:41 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 05 11:03:41 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 05 11:03:41 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 05 11:03:41 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 05 11:03:41 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 05 11:03:41 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 05 11:03:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 05 11:03:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 05 11:03:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 05 11:03:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 05 11:03:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 05 11:03:41 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 05 11:03:41 localhost kernel: TSC deadline timer available
Dec 05 11:03:41 localhost kernel: CPU topo: Max. logical packages:   8
Dec 05 11:03:41 localhost kernel: CPU topo: Max. logical dies:       8
Dec 05 11:03:41 localhost kernel: CPU topo: Max. dies per package:   1
Dec 05 11:03:41 localhost kernel: CPU topo: Max. threads per core:   1
Dec 05 11:03:41 localhost kernel: CPU topo: Num. cores per package:     1
Dec 05 11:03:41 localhost kernel: CPU topo: Num. threads per package:   1
Dec 05 11:03:41 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 05 11:03:41 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 05 11:03:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 05 11:03:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 05 11:03:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 05 11:03:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 05 11:03:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 05 11:03:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 05 11:03:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 05 11:03:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 05 11:03:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 05 11:03:41 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 05 11:03:41 localhost kernel: Booting paravirtualized kernel on KVM
Dec 05 11:03:41 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 05 11:03:41 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 05 11:03:41 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 05 11:03:41 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 05 11:03:41 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 05 11:03:41 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 05 11:03:41 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 05 11:03:41 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec 05 11:03:41 localhost kernel: random: crng init done
Dec 05 11:03:41 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 05 11:03:41 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 05 11:03:41 localhost kernel: Fallback order for Node 0: 0 
Dec 05 11:03:41 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 05 11:03:41 localhost kernel: Policy zone: Normal
Dec 05 11:03:41 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 05 11:03:41 localhost kernel: software IO TLB: area num 8.
Dec 05 11:03:41 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 05 11:03:41 localhost kernel: ftrace: allocating 49335 entries in 193 pages
Dec 05 11:03:41 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 05 11:03:41 localhost kernel: Dynamic Preempt: voluntary
Dec 05 11:03:41 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 05 11:03:41 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 05 11:03:41 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 05 11:03:41 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 05 11:03:41 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 05 11:03:41 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 05 11:03:41 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 05 11:03:41 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 05 11:03:41 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 05 11:03:41 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 05 11:03:41 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 05 11:03:41 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 05 11:03:41 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 05 11:03:41 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 05 11:03:41 localhost kernel: Console: colour VGA+ 80x25
Dec 05 11:03:41 localhost kernel: printk: console [ttyS0] enabled
Dec 05 11:03:41 localhost kernel: ACPI: Core revision 20230331
Dec 05 11:03:41 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 05 11:03:41 localhost kernel: x2apic enabled
Dec 05 11:03:41 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 05 11:03:41 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 05 11:03:41 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec 05 11:03:41 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 05 11:03:41 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 05 11:03:41 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 05 11:03:41 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 05 11:03:41 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 05 11:03:41 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 05 11:03:41 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 05 11:03:41 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 05 11:03:41 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 05 11:03:41 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 05 11:03:41 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 05 11:03:41 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 05 11:03:41 localhost kernel: x86/bugs: return thunk changed
Dec 05 11:03:41 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 05 11:03:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 05 11:03:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 05 11:03:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 05 11:03:41 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 05 11:03:41 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 05 11:03:41 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 05 11:03:41 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 05 11:03:41 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 05 11:03:41 localhost kernel: landlock: Up and running.
Dec 05 11:03:41 localhost kernel: Yama: becoming mindful.
Dec 05 11:03:41 localhost kernel: SELinux:  Initializing.
Dec 05 11:03:41 localhost kernel: LSM support for eBPF active
Dec 05 11:03:41 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 05 11:03:41 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 05 11:03:41 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 05 11:03:41 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 05 11:03:41 localhost kernel: ... version:                0
Dec 05 11:03:41 localhost kernel: ... bit width:              48
Dec 05 11:03:41 localhost kernel: ... generic registers:      6
Dec 05 11:03:41 localhost kernel: ... value mask:             0000ffffffffffff
Dec 05 11:03:41 localhost kernel: ... max period:             00007fffffffffff
Dec 05 11:03:41 localhost kernel: ... fixed-purpose events:   0
Dec 05 11:03:41 localhost kernel: ... event mask:             000000000000003f
Dec 05 11:03:41 localhost kernel: signal: max sigframe size: 1776
Dec 05 11:03:41 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 05 11:03:41 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 05 11:03:41 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 05 11:03:41 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 05 11:03:41 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 05 11:03:41 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 05 11:03:41 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec 05 11:03:41 localhost kernel: node 0 deferred pages initialised in 10ms
Dec 05 11:03:41 localhost kernel: Memory: 7763740K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618208K reserved, 0K cma-reserved)
Dec 05 11:03:41 localhost kernel: devtmpfs: initialized
Dec 05 11:03:41 localhost kernel: x86/mm: Memory block size: 128MB
Dec 05 11:03:41 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 05 11:03:41 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 05 11:03:41 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 05 11:03:41 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 05 11:03:41 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 05 11:03:41 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 05 11:03:41 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 05 11:03:41 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 05 11:03:41 localhost kernel: audit: type=2000 audit(1764932619.865:1): state=initialized audit_enabled=0 res=1
Dec 05 11:03:41 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 05 11:03:41 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 05 11:03:41 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 05 11:03:41 localhost kernel: cpuidle: using governor menu
Dec 05 11:03:41 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 05 11:03:41 localhost kernel: PCI: Using configuration type 1 for base access
Dec 05 11:03:41 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 05 11:03:41 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 05 11:03:41 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 05 11:03:41 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 05 11:03:41 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 05 11:03:41 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 05 11:03:41 localhost kernel: Demotion targets for Node 0: null
Dec 05 11:03:41 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 05 11:03:41 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 05 11:03:41 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 05 11:03:41 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 05 11:03:41 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 05 11:03:41 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 05 11:03:41 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 05 11:03:41 localhost kernel: ACPI: Interpreter enabled
Dec 05 11:03:41 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 05 11:03:41 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 05 11:03:41 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 05 11:03:41 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 05 11:03:41 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 05 11:03:41 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 05 11:03:41 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [3] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [4] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [5] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [6] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [7] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [8] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [9] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [10] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [11] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [12] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [13] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [14] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [15] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [16] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [17] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [18] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [19] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [20] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [21] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [22] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [23] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [24] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [25] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [26] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [27] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [28] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [29] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [30] registered
Dec 05 11:03:41 localhost kernel: acpiphp: Slot [31] registered
Dec 05 11:03:41 localhost kernel: PCI host bridge to bus 0000:00
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 05 11:03:41 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 05 11:03:41 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 05 11:03:41 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 05 11:03:41 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 05 11:03:41 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 05 11:03:41 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 05 11:03:41 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 05 11:03:41 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 05 11:03:41 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 05 11:03:41 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 05 11:03:41 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 05 11:03:41 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 05 11:03:41 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 05 11:03:41 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 05 11:03:41 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 05 11:03:41 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 05 11:03:41 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 05 11:03:41 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 05 11:03:41 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 05 11:03:41 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 05 11:03:41 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 05 11:03:41 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 05 11:03:41 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 05 11:03:41 localhost kernel: iommu: Default domain type: Translated
Dec 05 11:03:41 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 05 11:03:41 localhost kernel: SCSI subsystem initialized
Dec 05 11:03:41 localhost kernel: ACPI: bus type USB registered
Dec 05 11:03:41 localhost kernel: usbcore: registered new interface driver usbfs
Dec 05 11:03:41 localhost kernel: usbcore: registered new interface driver hub
Dec 05 11:03:41 localhost kernel: usbcore: registered new device driver usb
Dec 05 11:03:41 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 05 11:03:41 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 05 11:03:41 localhost kernel: PTP clock support registered
Dec 05 11:03:41 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 05 11:03:41 localhost kernel: NetLabel: Initializing
Dec 05 11:03:41 localhost kernel: NetLabel:  domain hash size = 128
Dec 05 11:03:41 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 05 11:03:41 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 05 11:03:41 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 05 11:03:41 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 05 11:03:41 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 05 11:03:41 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 05 11:03:41 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 05 11:03:41 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 05 11:03:41 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 05 11:03:41 localhost kernel: vgaarb: loaded
Dec 05 11:03:41 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 05 11:03:41 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 05 11:03:41 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 05 11:03:41 localhost kernel: pnp: PnP ACPI init
Dec 05 11:03:41 localhost kernel: pnp 00:03: [dma 2]
Dec 05 11:03:41 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 05 11:03:41 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 05 11:03:41 localhost kernel: NET: Registered PF_INET protocol family
Dec 05 11:03:41 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 05 11:03:41 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 05 11:03:41 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 05 11:03:41 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 05 11:03:41 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 05 11:03:41 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 05 11:03:41 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 05 11:03:41 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 05 11:03:41 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 05 11:03:41 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 05 11:03:41 localhost kernel: NET: Registered PF_XDP protocol family
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 05 11:03:41 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 05 11:03:41 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 05 11:03:41 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 05 11:03:41 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 76426 usecs
Dec 05 11:03:41 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 05 11:03:41 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 05 11:03:41 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 05 11:03:41 localhost kernel: ACPI: bus type thunderbolt registered
Dec 05 11:03:41 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 05 11:03:41 localhost kernel: Initialise system trusted keyrings
Dec 05 11:03:41 localhost kernel: Key type blacklist registered
Dec 05 11:03:41 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 05 11:03:41 localhost kernel: zbud: loaded
Dec 05 11:03:41 localhost kernel: integrity: Platform Keyring initialized
Dec 05 11:03:41 localhost kernel: integrity: Machine keyring initialized
Dec 05 11:03:41 localhost kernel: Freeing initrd memory: 87804K
Dec 05 11:03:41 localhost kernel: NET: Registered PF_ALG protocol family
Dec 05 11:03:41 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 05 11:03:41 localhost kernel: Key type asymmetric registered
Dec 05 11:03:41 localhost kernel: Asymmetric key parser 'x509' registered
Dec 05 11:03:41 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 05 11:03:41 localhost kernel: io scheduler mq-deadline registered
Dec 05 11:03:41 localhost kernel: io scheduler kyber registered
Dec 05 11:03:41 localhost kernel: io scheduler bfq registered
Dec 05 11:03:41 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 05 11:03:41 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 05 11:03:41 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 05 11:03:41 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 05 11:03:41 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 05 11:03:41 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 05 11:03:41 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 05 11:03:41 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 05 11:03:41 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 05 11:03:41 localhost kernel: Non-volatile memory driver v1.3
Dec 05 11:03:41 localhost kernel: rdac: device handler registered
Dec 05 11:03:41 localhost kernel: hp_sw: device handler registered
Dec 05 11:03:41 localhost kernel: emc: device handler registered
Dec 05 11:03:41 localhost kernel: alua: device handler registered
Dec 05 11:03:41 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 05 11:03:41 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 05 11:03:41 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 05 11:03:41 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 05 11:03:41 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 05 11:03:41 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 05 11:03:41 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 05 11:03:41 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec 05 11:03:41 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 05 11:03:41 localhost kernel: hub 1-0:1.0: USB hub found
Dec 05 11:03:41 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 05 11:03:41 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 05 11:03:41 localhost kernel: usbserial: USB Serial support registered for generic
Dec 05 11:03:41 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 05 11:03:41 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 05 11:03:41 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 05 11:03:41 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 05 11:03:41 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 05 11:03:41 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 05 11:03:41 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 05 11:03:41 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-05T11:03:40 UTC (1764932620)
Dec 05 11:03:41 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 05 11:03:41 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 05 11:03:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 05 11:03:41 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 05 11:03:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 05 11:03:41 localhost kernel: usbcore: registered new interface driver usbhid
Dec 05 11:03:41 localhost kernel: usbhid: USB HID core driver
Dec 05 11:03:41 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 05 11:03:41 localhost kernel: Initializing XFRM netlink socket
Dec 05 11:03:41 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 05 11:03:41 localhost kernel: Segment Routing with IPv6
Dec 05 11:03:41 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 05 11:03:41 localhost kernel: mpls_gso: MPLS GSO support
Dec 05 11:03:41 localhost kernel: IPI shorthand broadcast: enabled
Dec 05 11:03:41 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 05 11:03:41 localhost kernel: AES CTR mode by8 optimization enabled
Dec 05 11:03:41 localhost kernel: sched_clock: Marking stable (1157026929, 143918630)->(1412857209, -111911650)
Dec 05 11:03:41 localhost kernel: registered taskstats version 1
Dec 05 11:03:41 localhost kernel: Loading compiled-in X.509 certificates
Dec 05 11:03:41 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 05 11:03:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 05 11:03:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 05 11:03:41 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 05 11:03:41 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 05 11:03:41 localhost kernel: Demotion targets for Node 0: null
Dec 05 11:03:41 localhost kernel: page_owner is disabled
Dec 05 11:03:41 localhost kernel: Key type .fscrypt registered
Dec 05 11:03:41 localhost kernel: Key type fscrypt-provisioning registered
Dec 05 11:03:41 localhost kernel: Key type big_key registered
Dec 05 11:03:41 localhost kernel: Key type encrypted registered
Dec 05 11:03:41 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 05 11:03:41 localhost kernel: Loading compiled-in module X.509 certificates
Dec 05 11:03:41 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 05 11:03:41 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 05 11:03:41 localhost kernel: ima: No architecture policies found
Dec 05 11:03:41 localhost kernel: evm: Initialising EVM extended attributes:
Dec 05 11:03:41 localhost kernel: evm: security.selinux
Dec 05 11:03:41 localhost kernel: evm: security.SMACK64 (disabled)
Dec 05 11:03:41 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 05 11:03:41 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 05 11:03:41 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 05 11:03:41 localhost kernel: evm: security.apparmor (disabled)
Dec 05 11:03:41 localhost kernel: evm: security.ima
Dec 05 11:03:41 localhost kernel: evm: security.capability
Dec 05 11:03:41 localhost kernel: evm: HMAC attrs: 0x1
Dec 05 11:03:41 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 05 11:03:41 localhost kernel: Running certificate verification RSA selftest
Dec 05 11:03:41 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 05 11:03:41 localhost kernel: Running certificate verification ECDSA selftest
Dec 05 11:03:41 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 05 11:03:41 localhost kernel: clk: Disabling unused clocks
Dec 05 11:03:41 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 05 11:03:41 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec 05 11:03:41 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 05 11:03:41 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec 05 11:03:41 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 05 11:03:41 localhost kernel: Run /init as init process
Dec 05 11:03:41 localhost kernel:   with arguments:
Dec 05 11:03:41 localhost kernel:     /init
Dec 05 11:03:41 localhost kernel:   with environment:
Dec 05 11:03:41 localhost kernel:     HOME=/
Dec 05 11:03:41 localhost kernel:     TERM=linux
Dec 05 11:03:41 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64
Dec 05 11:03:41 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 05 11:03:41 localhost systemd[1]: Detected virtualization kvm.
Dec 05 11:03:41 localhost systemd[1]: Detected architecture x86-64.
Dec 05 11:03:41 localhost systemd[1]: Running in initrd.
Dec 05 11:03:41 localhost systemd[1]: No hostname configured, using default hostname.
Dec 05 11:03:41 localhost systemd[1]: Hostname set to <localhost>.
Dec 05 11:03:41 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 05 11:03:41 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 05 11:03:41 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 05 11:03:41 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 05 11:03:41 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 05 11:03:41 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 05 11:03:41 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 05 11:03:41 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 05 11:03:41 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 05 11:03:41 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 05 11:03:41 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 05 11:03:41 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 05 11:03:41 localhost systemd[1]: Reached target Local File Systems.
Dec 05 11:03:41 localhost systemd[1]: Reached target Path Units.
Dec 05 11:03:41 localhost systemd[1]: Reached target Slice Units.
Dec 05 11:03:41 localhost systemd[1]: Reached target Swaps.
Dec 05 11:03:41 localhost systemd[1]: Reached target Timer Units.
Dec 05 11:03:41 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 05 11:03:41 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 05 11:03:41 localhost systemd[1]: Listening on Journal Socket.
Dec 05 11:03:41 localhost systemd[1]: Listening on udev Control Socket.
Dec 05 11:03:41 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 05 11:03:41 localhost systemd[1]: Reached target Socket Units.
Dec 05 11:03:41 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 05 11:03:41 localhost systemd[1]: Starting Journal Service...
Dec 05 11:03:41 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 05 11:03:41 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 05 11:03:41 localhost systemd[1]: Starting Create System Users...
Dec 05 11:03:41 localhost systemd[1]: Starting Setup Virtual Console...
Dec 05 11:03:41 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 05 11:03:41 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 05 11:03:41 localhost systemd-journald[307]: Journal started
Dec 05 11:03:41 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/60bd4df1481e4d2395858528ade5c2b1) is 8.0M, max 153.6M, 145.6M free.
Dec 05 11:03:41 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Dec 05 11:03:41 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Dec 05 11:03:41 localhost systemd[1]: Started Journal Service.
Dec 05 11:03:41 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 05 11:03:41 localhost systemd[1]: Finished Create System Users.
Dec 05 11:03:41 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 05 11:03:41 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 05 11:03:41 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 05 11:03:41 localhost systemd[1]: Finished Setup Virtual Console.
Dec 05 11:03:41 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 05 11:03:41 localhost systemd[1]: Starting dracut cmdline hook...
Dec 05 11:03:41 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 05 11:03:41 localhost dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Dec 05 11:03:41 localhost dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 05 11:03:41 localhost systemd[1]: Finished dracut cmdline hook.
Dec 05 11:03:41 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 05 11:03:41 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 05 11:03:41 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 05 11:03:41 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 05 11:03:41 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 05 11:03:41 localhost kernel: RPC: Registered udp transport module.
Dec 05 11:03:41 localhost kernel: RPC: Registered tcp transport module.
Dec 05 11:03:41 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 05 11:03:41 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 05 11:03:41 localhost rpc.statd[443]: Version 2.5.4 starting
Dec 05 11:03:41 localhost rpc.statd[443]: Initializing NSM state
Dec 05 11:03:41 localhost rpc.idmapd[448]: Setting log level to 0
Dec 05 11:03:41 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 05 11:03:41 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 05 11:03:41 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Dec 05 11:03:41 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 05 11:03:41 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 05 11:03:41 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 05 11:03:41 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 05 11:03:41 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 05 11:03:41 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 05 11:03:41 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 05 11:03:41 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 05 11:03:41 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 05 11:03:41 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 05 11:03:41 localhost systemd[1]: Reached target Network.
Dec 05 11:03:41 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 05 11:03:41 localhost systemd[1]: Starting dracut initqueue hook...
Dec 05 11:03:41 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 05 11:03:41 localhost kernel: libata version 3.00 loaded.
Dec 05 11:03:41 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 05 11:03:41 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 05 11:03:41 localhost kernel:  vda: vda1
Dec 05 11:03:41 localhost systemd-udevd[484]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:03:41 localhost kernel: scsi host0: ata_piix
Dec 05 11:03:41 localhost kernel: scsi host1: ata_piix
Dec 05 11:03:41 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 05 11:03:41 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 05 11:03:41 localhost systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 05 11:03:42 localhost systemd[1]: Reached target Initrd Root Device.
Dec 05 11:03:42 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 05 11:03:42 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 05 11:03:42 localhost systemd[1]: Reached target System Initialization.
Dec 05 11:03:42 localhost kernel: ata1: found unknown device (class 0)
Dec 05 11:03:42 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 05 11:03:42 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 05 11:03:42 localhost systemd[1]: Reached target Basic System.
Dec 05 11:03:42 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 05 11:03:42 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 05 11:03:42 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 05 11:03:42 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 05 11:03:42 localhost systemd[1]: Finished dracut initqueue hook.
Dec 05 11:03:42 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 05 11:03:42 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 05 11:03:42 localhost systemd[1]: Reached target Remote File Systems.
Dec 05 11:03:42 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 05 11:03:42 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 05 11:03:42 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec 05 11:03:42 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Dec 05 11:03:42 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 05 11:03:42 localhost systemd[1]: Mounting /sysroot...
Dec 05 11:03:42 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 05 11:03:42 localhost kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec 05 11:03:42 localhost kernel: XFS (vda1): Ending clean mount
Dec 05 11:03:42 localhost systemd[1]: Mounted /sysroot.
Dec 05 11:03:42 localhost systemd[1]: Reached target Initrd Root File System.
Dec 05 11:03:42 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 05 11:03:42 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 05 11:03:42 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 05 11:03:42 localhost systemd[1]: Reached target Initrd File Systems.
Dec 05 11:03:42 localhost systemd[1]: Reached target Initrd Default Target.
Dec 05 11:03:42 localhost systemd[1]: Starting dracut mount hook...
Dec 05 11:03:42 localhost systemd[1]: Finished dracut mount hook.
Dec 05 11:03:42 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 05 11:03:43 localhost rpc.idmapd[448]: exiting on signal 15
Dec 05 11:03:43 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 05 11:03:43 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 05 11:03:43 localhost systemd[1]: Stopped target Network.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Timer Units.
Dec 05 11:03:43 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 05 11:03:43 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Basic System.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Path Units.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Remote File Systems.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Slice Units.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Socket Units.
Dec 05 11:03:43 localhost systemd[1]: Stopped target System Initialization.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Local File Systems.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Swaps.
Dec 05 11:03:43 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped dracut mount hook.
Dec 05 11:03:43 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 05 11:03:43 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 05 11:03:43 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 05 11:03:43 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 05 11:03:43 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 05 11:03:43 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 05 11:03:43 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 05 11:03:43 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 05 11:03:43 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 05 11:03:43 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 05 11:03:43 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 05 11:03:43 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Closed udev Control Socket.
Dec 05 11:03:43 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Closed udev Kernel Socket.
Dec 05 11:03:43 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 05 11:03:43 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 05 11:03:43 localhost systemd[1]: Starting Cleanup udev Database...
Dec 05 11:03:43 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 05 11:03:43 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 05 11:03:43 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Create System Users.
Dec 05 11:03:43 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Finished Cleanup udev Database.
Dec 05 11:03:43 localhost systemd[1]: Reached target Switch Root.
Dec 05 11:03:43 localhost systemd[1]: Starting Switch Root...
Dec 05 11:03:43 localhost systemd[1]: Switching root.
Dec 05 11:03:43 localhost systemd-journald[307]: Journal stopped
Dec 05 11:03:43 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Dec 05 11:03:43 localhost kernel: audit: type=1404 audit(1764932623.289:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 05 11:03:43 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:03:43 localhost kernel: SELinux:  policy capability open_perms=1
Dec 05 11:03:43 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:03:43 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:03:43 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:03:43 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:03:43 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:03:43 localhost kernel: audit: type=1403 audit(1764932623.410:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 05 11:03:43 localhost systemd[1]: Successfully loaded SELinux policy in 124.917ms.
Dec 05 11:03:43 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.387ms.
Dec 05 11:03:43 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 05 11:03:43 localhost systemd[1]: Detected virtualization kvm.
Dec 05 11:03:43 localhost systemd[1]: Detected architecture x86-64.
Dec 05 11:03:43 localhost systemd-rc-local-generator[636]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:03:43 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped Switch Root.
Dec 05 11:03:43 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 05 11:03:43 localhost systemd[1]: Created slice Slice /system/getty.
Dec 05 11:03:43 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 05 11:03:43 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 05 11:03:43 localhost systemd[1]: Created slice User and Session Slice.
Dec 05 11:03:43 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 05 11:03:43 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 05 11:03:43 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 05 11:03:43 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Switch Root.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 05 11:03:43 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 05 11:03:43 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 05 11:03:43 localhost systemd[1]: Reached target Path Units.
Dec 05 11:03:43 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 05 11:03:43 localhost systemd[1]: Reached target Slice Units.
Dec 05 11:03:43 localhost systemd[1]: Reached target Swaps.
Dec 05 11:03:43 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 05 11:03:43 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 05 11:03:43 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 05 11:03:43 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 05 11:03:43 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 05 11:03:43 localhost systemd[1]: Listening on udev Control Socket.
Dec 05 11:03:43 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 05 11:03:43 localhost systemd[1]: Mounting Huge Pages File System...
Dec 05 11:03:43 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 05 11:03:43 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 05 11:03:43 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 05 11:03:43 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 05 11:03:43 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 05 11:03:43 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 05 11:03:43 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 05 11:03:43 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 05 11:03:43 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 05 11:03:43 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 05 11:03:43 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 05 11:03:43 localhost systemd[1]: Stopped Journal Service.
Dec 05 11:03:43 localhost systemd[1]: Starting Journal Service...
Dec 05 11:03:43 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 05 11:03:43 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 05 11:03:43 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 05 11:03:43 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 05 11:03:43 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 05 11:03:43 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 05 11:03:43 localhost kernel: fuse: init (API version 7.37)
Dec 05 11:03:43 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 05 11:03:43 localhost systemd[1]: Mounted Huge Pages File System.
Dec 05 11:03:43 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 05 11:03:43 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 05 11:03:43 localhost systemd-journald[677]: Journal started
Dec 05 11:03:43 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 05 11:03:43 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 05 11:03:43 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Started Journal Service.
Dec 05 11:03:43 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 05 11:03:43 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 05 11:03:43 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 05 11:03:43 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 05 11:03:43 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 05 11:03:43 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 05 11:03:43 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 05 11:03:43 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 05 11:03:43 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 05 11:03:43 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 05 11:03:43 localhost kernel: ACPI: bus type drm_connector registered
Dec 05 11:03:43 localhost systemd[1]: Mounting FUSE Control File System...
Dec 05 11:03:43 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 05 11:03:43 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 05 11:03:43 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 05 11:03:43 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 05 11:03:43 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 05 11:03:43 localhost systemd[1]: Starting Create System Users...
Dec 05 11:03:43 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 05 11:03:43 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 05 11:03:43 localhost systemd[1]: Mounted FUSE Control File System.
Dec 05 11:03:43 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 05 11:03:43 localhost systemd-journald[677]: Received client request to flush runtime journal.
Dec 05 11:03:43 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 05 11:03:43 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 05 11:03:43 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 05 11:03:43 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 05 11:03:44 localhost systemd[1]: Finished Create System Users.
Dec 05 11:03:44 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 05 11:03:44 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 05 11:03:44 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 05 11:03:44 localhost systemd[1]: Reached target Local File Systems.
Dec 05 11:03:44 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 05 11:03:44 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 05 11:03:44 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 05 11:03:44 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 05 11:03:44 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 05 11:03:44 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 05 11:03:44 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 05 11:03:44 localhost bootctl[693]: Couldn't find EFI system partition, skipping.
Dec 05 11:03:44 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 05 11:03:44 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 05 11:03:44 localhost systemd[1]: Starting Security Auditing Service...
Dec 05 11:03:44 localhost systemd[1]: Starting RPC Bind...
Dec 05 11:03:44 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 05 11:03:44 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 05 11:03:44 localhost auditd[699]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 05 11:03:44 localhost auditd[699]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 05 11:03:44 localhost systemd[1]: Started RPC Bind.
Dec 05 11:03:44 localhost augenrules[704]: /sbin/augenrules: No change
Dec 05 11:03:44 localhost augenrules[719]: No rules
Dec 05 11:03:44 localhost augenrules[719]: enabled 1
Dec 05 11:03:44 localhost augenrules[719]: failure 1
Dec 05 11:03:44 localhost augenrules[719]: pid 699
Dec 05 11:03:44 localhost augenrules[719]: rate_limit 0
Dec 05 11:03:44 localhost augenrules[719]: backlog_limit 8192
Dec 05 11:03:44 localhost augenrules[719]: lost 0
Dec 05 11:03:44 localhost augenrules[719]: backlog 3
Dec 05 11:03:44 localhost augenrules[719]: backlog_wait_time 60000
Dec 05 11:03:44 localhost augenrules[719]: backlog_wait_time_actual 0
Dec 05 11:03:44 localhost augenrules[719]: enabled 1
Dec 05 11:03:44 localhost augenrules[719]: failure 1
Dec 05 11:03:44 localhost augenrules[719]: pid 699
Dec 05 11:03:44 localhost augenrules[719]: rate_limit 0
Dec 05 11:03:44 localhost augenrules[719]: backlog_limit 8192
Dec 05 11:03:44 localhost augenrules[719]: lost 0
Dec 05 11:03:44 localhost augenrules[719]: backlog 4
Dec 05 11:03:44 localhost augenrules[719]: backlog_wait_time 60000
Dec 05 11:03:44 localhost augenrules[719]: backlog_wait_time_actual 0
Dec 05 11:03:44 localhost augenrules[719]: enabled 1
Dec 05 11:03:44 localhost augenrules[719]: failure 1
Dec 05 11:03:44 localhost augenrules[719]: pid 699
Dec 05 11:03:44 localhost augenrules[719]: rate_limit 0
Dec 05 11:03:44 localhost augenrules[719]: backlog_limit 8192
Dec 05 11:03:44 localhost augenrules[719]: lost 0
Dec 05 11:03:44 localhost augenrules[719]: backlog 4
Dec 05 11:03:44 localhost augenrules[719]: backlog_wait_time 60000
Dec 05 11:03:44 localhost augenrules[719]: backlog_wait_time_actual 0
Dec 05 11:03:44 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 05 11:03:44 localhost systemd[1]: Started Security Auditing Service.
Dec 05 11:03:44 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 05 11:03:44 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 05 11:03:44 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 05 11:03:44 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 05 11:03:44 localhost systemd[1]: Starting Update is Completed...
Dec 05 11:03:44 localhost systemd[1]: Finished Update is Completed.
Dec 05 11:03:44 localhost systemd-udevd[727]: Using default interface naming scheme 'rhel-9.0'.
Dec 05 11:03:44 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 05 11:03:44 localhost systemd[1]: Reached target System Initialization.
Dec 05 11:03:44 localhost systemd[1]: Started dnf makecache --timer.
Dec 05 11:03:44 localhost systemd[1]: Started Daily rotation of log files.
Dec 05 11:03:44 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 05 11:03:44 localhost systemd[1]: Reached target Timer Units.
Dec 05 11:03:44 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 05 11:03:44 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 05 11:03:44 localhost systemd[1]: Reached target Socket Units.
Dec 05 11:03:44 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 05 11:03:44 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 05 11:03:44 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 05 11:03:44 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 05 11:03:44 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 05 11:03:44 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 05 11:03:44 localhost systemd-udevd[744]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:03:44 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 05 11:03:44 localhost systemd[1]: Reached target Basic System.
Dec 05 11:03:44 localhost dbus-broker-lau[760]: Ready
Dec 05 11:03:44 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 05 11:03:44 localhost systemd[1]: Starting NTP client/server...
Dec 05 11:03:44 localhost chronyd[781]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 05 11:03:44 localhost chronyd[781]: Loaded 0 symmetric keys
Dec 05 11:03:44 localhost chronyd[781]: Using right/UTC timezone to obtain leap second data
Dec 05 11:03:44 localhost chronyd[781]: Loaded seccomp filter (level 2)
Dec 05 11:03:44 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 05 11:03:44 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 05 11:03:44 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 05 11:03:44 localhost systemd[1]: Started irqbalance daemon.
Dec 05 11:03:44 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 05 11:03:44 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 11:03:44 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 11:03:44 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 11:03:44 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 05 11:03:44 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 05 11:03:44 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 05 11:03:44 localhost systemd[1]: Starting User Login Management...
Dec 05 11:03:44 localhost systemd[1]: Started NTP client/server.
Dec 05 11:03:44 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 05 11:03:44 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 05 11:03:44 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 05 11:03:44 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 05 11:03:44 localhost kernel: kvm_amd: TSC scaling supported
Dec 05 11:03:44 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 05 11:03:44 localhost kernel: kvm_amd: Nested Paging enabled
Dec 05 11:03:44 localhost kernel: kvm_amd: LBR virtualization supported
Dec 05 11:03:44 localhost systemd-logind[792]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 05 11:03:44 localhost systemd-logind[792]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 05 11:03:44 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 05 11:03:44 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 05 11:03:44 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 05 11:03:44 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 05 11:03:44 localhost systemd-logind[792]: New seat seat0.
Dec 05 11:03:44 localhost systemd[1]: Started User Login Management.
Dec 05 11:03:44 localhost kernel: Console: switching to colour dummy device 80x25
Dec 05 11:03:44 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 05 11:03:44 localhost kernel: [drm] features: -context_init
Dec 05 11:03:44 localhost kernel: [drm] number of scanouts: 1
Dec 05 11:03:44 localhost kernel: [drm] number of cap sets: 0
Dec 05 11:03:44 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 05 11:03:44 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 05 11:03:44 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 05 11:03:44 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 05 11:03:45 localhost iptables.init[785]: iptables: Applying firewall rules: [  OK  ]
Dec 05 11:03:45 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 05 11:03:45 localhost cloud-init[837]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 05 Dec 2025 11:03:45 +0000. Up 5.91 seconds.
Dec 05 11:03:45 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 05 11:03:45 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 05 11:03:45 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp5q2ph1a4.mount: Deactivated successfully.
Dec 05 11:03:45 localhost systemd[1]: Starting Hostname Service...
Dec 05 11:03:45 localhost systemd[1]: Started Hostname Service.
Dec 05 11:03:45 np0005546909.novalocal systemd-hostnamed[851]: Hostname set to <np0005546909.novalocal> (static)
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Reached target Preparation for Network.
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Starting Network Manager...
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8452] NetworkManager (version 1.54.1-1.el9) is starting... (boot:f0f69436-bbfa-48e7-b73e-9b22f091bec6)
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8458] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8538] manager[0x5568d3ad5080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8575] hostname: hostname: using hostnamed
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8576] hostname: static hostname changed from (none) to "np0005546909.novalocal"
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8582] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8692] manager[0x5568d3ad5080]: rfkill: Wi-Fi hardware radio set enabled
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8693] manager[0x5568d3ad5080]: rfkill: WWAN hardware radio set enabled
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8742] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8744] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8745] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8745] manager: Networking is enabled by state file
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8748] settings: Loaded settings plugin: keyfile (internal)
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8766] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8789] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8804] dhcp: init: Using DHCP client 'internal'
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8806] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8820] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8828] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8836] device (lo): Activation: starting connection 'lo' (0f41f2b7-1648-4484-8c07-96c2526b8b00)
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8845] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8849] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8881] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8886] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8889] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8891] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8894] device (eth0): carrier: link connected
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8897] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8903] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8909] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8914] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8915] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8917] manager: NetworkManager state is now CONNECTING
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8918] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8925] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.8928] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Started Network Manager.
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Reached target Network.
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9124] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9134] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9159] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9262] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9264] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9272] device (lo): Activation: successful, device activated.
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9280] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9281] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9285] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9289] device (eth0): Activation: successful, device activated.
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9298] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 05 11:03:45 np0005546909.novalocal NetworkManager[855]: <info>  [1764932625.9301] manager: startup complete
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Reached target NFS client services.
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Reached target Remote File Systems.
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 05 11:03:45 np0005546909.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 05 Dec 2025 11:03:46 +0000. Up 6.87 seconds.
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: |  eth0  | True |        38.102.83.241         | 255.255.255.0 | global | fa:16:3e:00:6c:52 |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe00:6c52/64 |       .       |  link  | fa:16:3e:00:6c:52 |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 05 11:03:46 np0005546909.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 05 11:03:47 np0005546909.novalocal useradd[986]: new group: name=cloud-user, GID=1001
Dec 05 11:03:47 np0005546909.novalocal useradd[986]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 05 11:03:47 np0005546909.novalocal useradd[986]: add 'cloud-user' to group 'adm'
Dec 05 11:03:47 np0005546909.novalocal useradd[986]: add 'cloud-user' to group 'systemd-journal'
Dec 05 11:03:47 np0005546909.novalocal useradd[986]: add 'cloud-user' to shadow group 'adm'
Dec 05 11:03:47 np0005546909.novalocal useradd[986]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: Generating public/private rsa key pair.
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: The key fingerprint is:
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: SHA256:wZRtmTk4xsuEHAFXH1CEZ7A5hnT5jJOf/JEm3f1kVic root@np0005546909.novalocal
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: The key's randomart image is:
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: +---[RSA 3072]----+
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |    .o=**@++     |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |     oo=X+X.     |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |      .+B@..     |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |       .*oo   E o|
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |        S+ o o oo|
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |          = = . =|
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |           + . +.|
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |            .   .|
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |                 |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: +----[SHA256]-----+
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: Generating public/private ecdsa key pair.
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: The key fingerprint is:
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: SHA256:1IoFMFejFR73nAw9wcEDdArEGSpq08/uI1F9/3/Yz88 root@np0005546909.novalocal
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: The key's randomart image is:
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: +---[ECDSA 256]---+
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |    o.o+X===+o   |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |     o *o* B*.   |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |    . o.+ o =o   |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |   o ..+...      |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |  + ... S. .     |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: | . ..o      .    |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |     .o      . o |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |    ...       ooo|
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |     oo.       .E|
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: +----[SHA256]-----+
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: Generating public/private ed25519 key pair.
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: The key fingerprint is:
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: SHA256:g7DiXQvFLA7MQpifJ0DQGua+AsTYSz/Yk0wN3orkgv4 root@np0005546909.novalocal
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: The key's randomart image is:
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: +--[ED25519 256]--+
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |==               |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |*+. .o           |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |**+oo++          |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |o=Bo+=o.         |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |++.@+oo S        |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |+o=oOo . .       |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |o.o .o.          |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |.o               |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: |. .E             |
Dec 05 11:03:47 np0005546909.novalocal cloud-init[920]: +----[SHA256]-----+
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Reached target Network is Online.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Starting System Logging Service...
Dec 05 11:03:47 np0005546909.novalocal sm-notify[1003]: Version 2.5.4 starting
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Starting Permit User Sessions...
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 05 11:03:47 np0005546909.novalocal sshd[1005]: Server listening on 0.0.0.0 port 22.
Dec 05 11:03:47 np0005546909.novalocal sshd[1005]: Server listening on :: port 22.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Finished Permit User Sessions.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Started Command Scheduler.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Started Getty on tty1.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Reached target Login Prompts.
Dec 05 11:03:47 np0005546909.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Dec 05 11:03:47 np0005546909.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 05 11:03:47 np0005546909.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 72% if used.)
Dec 05 11:03:47 np0005546909.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Dec 05 11:03:47 np0005546909.novalocal rsyslogd[1004]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1004" x-info="https://www.rsyslog.com"] start
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Started System Logging Service.
Dec 05 11:03:47 np0005546909.novalocal rsyslogd[1004]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Reached target Multi-User System.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 05 11:03:47 np0005546909.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 05 11:03:47 np0005546909.novalocal rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 11:03:47 np0005546909.novalocal kdumpctl[1014]: kdump: No kdump initial ramdisk found.
Dec 05 11:03:47 np0005546909.novalocal kdumpctl[1014]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec 05 11:03:47 np0005546909.novalocal cloud-init[1159]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 05 Dec 2025 11:03:47 +0000. Up 8.53 seconds.
Dec 05 11:03:48 np0005546909.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 05 11:03:48 np0005546909.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 05 11:03:48 np0005546909.novalocal dracut[1265]: dracut-057-102.git20250818.el9
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec 05 11:03:48 np0005546909.novalocal cloud-init[1295]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 05 Dec 2025 11:03:48 +0000. Up 8.92 seconds.
Dec 05 11:03:48 np0005546909.novalocal cloud-init[1330]: #############################################################
Dec 05 11:03:48 np0005546909.novalocal cloud-init[1332]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 05 11:03:48 np0005546909.novalocal cloud-init[1340]: 256 SHA256:1IoFMFejFR73nAw9wcEDdArEGSpq08/uI1F9/3/Yz88 root@np0005546909.novalocal (ECDSA)
Dec 05 11:03:48 np0005546909.novalocal cloud-init[1342]: 256 SHA256:g7DiXQvFLA7MQpifJ0DQGua+AsTYSz/Yk0wN3orkgv4 root@np0005546909.novalocal (ED25519)
Dec 05 11:03:48 np0005546909.novalocal cloud-init[1344]: 3072 SHA256:wZRtmTk4xsuEHAFXH1CEZ7A5hnT5jJOf/JEm3f1kVic root@np0005546909.novalocal (RSA)
Dec 05 11:03:48 np0005546909.novalocal cloud-init[1345]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 05 11:03:48 np0005546909.novalocal cloud-init[1346]: #############################################################
Dec 05 11:03:48 np0005546909.novalocal cloud-init[1295]: Cloud-init v. 24.4-7.el9 finished at Fri, 05 Dec 2025 11:03:48 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.11 seconds
Dec 05 11:03:48 np0005546909.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 05 11:03:48 np0005546909.novalocal systemd[1]: Reached target Cloud-init target.
Dec 05 11:03:48 np0005546909.novalocal sshd-session[1368]: Connection closed by 38.102.83.114 port 54946 [preauth]
Dec 05 11:03:48 np0005546909.novalocal sshd-session[1382]: Unable to negotiate with 38.102.83.114 port 47832: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 05 11:03:48 np0005546909.novalocal sshd-session[1403]: Unable to negotiate with 38.102.83.114 port 47848: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 05 11:03:48 np0005546909.novalocal sshd-session[1409]: Unable to negotiate with 38.102.83.114 port 47862: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 05 11:03:48 np0005546909.novalocal sshd-session[1390]: Connection closed by 38.102.83.114 port 47846 [preauth]
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 05 11:03:48 np0005546909.novalocal sshd-session[1449]: Unable to negotiate with 38.102.83.114 port 47904: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 05 11:03:48 np0005546909.novalocal sshd-session[1460]: Unable to negotiate with 38.102.83.114 port 47908: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 05 11:03:48 np0005546909.novalocal sshd-session[1414]: Connection closed by 38.102.83.114 port 47872 [preauth]
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 05 11:03:48 np0005546909.novalocal sshd-session[1435]: Connection closed by 38.102.83.114 port 47888 [preauth]
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 05 11:03:48 np0005546909.novalocal dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: memstrack is not available
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: memstrack is not available
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 05 11:03:49 np0005546909.novalocal dracut[1267]: *** Including module: systemd ***
Dec 05 11:03:50 np0005546909.novalocal dracut[1267]: *** Including module: fips ***
Dec 05 11:03:50 np0005546909.novalocal dracut[1267]: *** Including module: systemd-initrd ***
Dec 05 11:03:50 np0005546909.novalocal dracut[1267]: *** Including module: i18n ***
Dec 05 11:03:50 np0005546909.novalocal dracut[1267]: *** Including module: drm ***
Dec 05 11:03:50 np0005546909.novalocal chronyd[781]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Dec 05 11:03:50 np0005546909.novalocal chronyd[781]: System clock TAI offset set to 37 seconds
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]: *** Including module: prefixdevname ***
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]: *** Including module: kernel-modules ***
Dec 05 11:03:51 np0005546909.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]: *** Including module: kernel-modules-extra ***
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]: *** Including module: qemu ***
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]: *** Including module: fstab-sys ***
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]: *** Including module: rootfs-block ***
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]: *** Including module: terminfo ***
Dec 05 11:03:51 np0005546909.novalocal dracut[1267]: *** Including module: udev-rules ***
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]: Skipping udev rule: 91-permissions.rules
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]: *** Including module: virtiofs ***
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]: *** Including module: dracut-systemd ***
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]: *** Including module: usrmount ***
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]: *** Including module: base ***
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]: *** Including module: fs-lib ***
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]: *** Including module: kdumpbase ***
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]:   microcode_ctl module: mangling fw_dir
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel" is ignored
Dec 05 11:03:52 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]: *** Including module: openssl ***
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]: *** Including module: shutdown ***
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]: *** Including module: squash ***
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]: *** Including modules done ***
Dec 05 11:03:53 np0005546909.novalocal dracut[1267]: *** Installing kernel module dependencies ***
Dec 05 11:03:54 np0005546909.novalocal dracut[1267]: *** Installing kernel module dependencies done ***
Dec 05 11:03:54 np0005546909.novalocal dracut[1267]: *** Resolving executable dependencies ***
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: IRQ 25 affinity is now unmanaged
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: IRQ 31 affinity is now unmanaged
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: IRQ 28 affinity is now unmanaged
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: IRQ 32 affinity is now unmanaged
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: IRQ 30 affinity is now unmanaged
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 05 11:03:55 np0005546909.novalocal irqbalance[790]: IRQ 29 affinity is now unmanaged
Dec 05 11:03:56 np0005546909.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 11:03:56 np0005546909.novalocal dracut[1267]: *** Resolving executable dependencies done ***
Dec 05 11:03:56 np0005546909.novalocal dracut[1267]: *** Generating early-microcode cpio image ***
Dec 05 11:03:56 np0005546909.novalocal dracut[1267]: *** Store current command line parameters ***
Dec 05 11:03:56 np0005546909.novalocal dracut[1267]: Stored kernel commandline:
Dec 05 11:03:56 np0005546909.novalocal dracut[1267]: No dracut internal kernel commandline stored in the initramfs
Dec 05 11:03:56 np0005546909.novalocal dracut[1267]: *** Install squash loader ***
Dec 05 11:03:57 np0005546909.novalocal dracut[1267]: *** Squashing the files inside the initramfs ***
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: *** Squashing the files inside the initramfs done ***
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: *** Hardlinking files ***
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: Mode:           real
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: Files:          50
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: Linked:         0 files
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: Compared:       0 xattrs
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: Compared:       0 files
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: Saved:          0 B
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: Duration:       0.001026 seconds
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: *** Hardlinking files done ***
Dec 05 11:03:58 np0005546909.novalocal dracut[1267]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec 05 11:03:59 np0005546909.novalocal kdumpctl[1014]: kdump: kexec: loaded kdump kernel
Dec 05 11:03:59 np0005546909.novalocal kdumpctl[1014]: kdump: Starting kdump: [OK]
Dec 05 11:03:59 np0005546909.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 05 11:03:59 np0005546909.novalocal systemd[1]: Startup finished in 1.484s (kernel) + 2.423s (initrd) + 16.074s (userspace) = 19.982s.
Dec 05 11:04:05 np0005546909.novalocal sshd-session[4296]: Accepted publickey for zuul from 38.102.83.114 port 57032 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 05 11:04:05 np0005546909.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 05 11:04:06 np0005546909.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 05 11:04:06 np0005546909.novalocal systemd-logind[792]: New session 1 of user zuul.
Dec 05 11:04:06 np0005546909.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 05 11:04:06 np0005546909.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Queued start job for default target Main User Target.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Created slice User Application Slice.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Started Daily Cleanup of User's Temporary Directories.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Reached target Paths.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Reached target Timers.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Starting D-Bus User Message Bus Socket...
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Starting Create User's Volatile Files and Directories...
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Finished Create User's Volatile Files and Directories.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Listening on D-Bus User Message Bus Socket.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Reached target Sockets.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Reached target Basic System.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Reached target Main User Target.
Dec 05 11:04:06 np0005546909.novalocal systemd[4300]: Startup finished in 136ms.
Dec 05 11:04:06 np0005546909.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 05 11:04:06 np0005546909.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 05 11:04:06 np0005546909.novalocal sshd-session[4296]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:04:06 np0005546909.novalocal sshd-session[4294]: Received disconnect from 197.248.8.33 port 49008:11: Bye Bye [preauth]
Dec 05 11:04:06 np0005546909.novalocal sshd-session[4294]: Disconnected from authenticating user root 197.248.8.33 port 49008 [preauth]
Dec 05 11:04:06 np0005546909.novalocal python3[4382]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:04:08 np0005546909.novalocal python3[4410]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:04:12 np0005546909.novalocal sshd-session[4445]: Received disconnect from 24.232.50.5 port 58548:11: Bye Bye [preauth]
Dec 05 11:04:12 np0005546909.novalocal sshd-session[4445]: Disconnected from authenticating user root 24.232.50.5 port 58548 [preauth]
Dec 05 11:04:14 np0005546909.novalocal python3[4470]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:04:15 np0005546909.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 11:04:16 np0005546909.novalocal python3[4512]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 05 11:04:18 np0005546909.novalocal python3[4538]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCipPqWEhKyX4DzIESCTQ5zeFvB37ZfQQTUe3rWhe/03Eja4O1tm2CJAU6o/v+Lq4C404mYmqiSqlPrK9lJclR8ymX6Vgc5kPdmbqL3yuwOHZIETrF6lSZrbAZ+B7KPs+3HGZE9+3cdNAMl2wE+hbkEq4XKqY4IIv4NAiZ1+XAtPreUOzrcWFfXPsN+ArpYmXv6RtvwToVw31Va5i8r0wQdQj8Eu9fgcpp0JD4YMJHk8nqC1MpsviDXYPMio35QAigcj9hqYh654FcwKvtGF82QakFYEUUuqyJx2gSTRrOBzZ9tKnByb9Qlk+8Pqx1aBiGaCjwiIP15av/wWl79eHnpm5gxNJci5Jw/REHkUzi5bcD9m2ZEjYGJSWAKzeLZ3Cw3/jRrYgQPJjrXrhhVE0kxPnFVVFlnI+NkXQRx6snw2OZjWBG4cn8+E+1Lg+xYbgeq8AVZPvTWLlccowOVcDOFZVpsuAUQIZRB/8rVlCmseuaSwhKjGCzAVZegwwbT29c= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:18 np0005546909.novalocal python3[4562]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:19 np0005546909.novalocal python3[4661]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:04:19 np0005546909.novalocal python3[4734]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764932658.861529-207-256986111877061/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=61705b5911bf401d89a5c7fe2bc3b5ac_id_rsa follow=False checksum=151ba3e6330bfdea1541c5df9b33a50aaf84a208 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:20 np0005546909.novalocal python3[4857]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:04:20 np0005546909.novalocal python3[4928]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764932659.7510955-240-49647840172004/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=61705b5911bf401d89a5c7fe2bc3b5ac_id_rsa.pub follow=False checksum=a840b3e226ec9e3618104f55c7f7555a733047f5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:20 np0005546909.novalocal sshd-session[4662]: Received disconnect from 103.231.14.54 port 54090:11: Bye Bye [preauth]
Dec 05 11:04:20 np0005546909.novalocal sshd-session[4662]: Disconnected from authenticating user root 103.231.14.54 port 54090 [preauth]
Dec 05 11:04:21 np0005546909.novalocal python3[4976]: ansible-ping Invoked with data=pong
Dec 05 11:04:22 np0005546909.novalocal python3[5000]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:04:24 np0005546909.novalocal python3[5060]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 05 11:04:25 np0005546909.novalocal sshd-session[5035]: Received disconnect from 41.94.88.49 port 37254:11: Bye Bye [preauth]
Dec 05 11:04:25 np0005546909.novalocal sshd-session[5035]: Disconnected from authenticating user root 41.94.88.49 port 37254 [preauth]
Dec 05 11:04:27 np0005546909.novalocal python3[5092]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:27 np0005546909.novalocal python3[5116]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:27 np0005546909.novalocal python3[5140]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:28 np0005546909.novalocal python3[5164]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:28 np0005546909.novalocal python3[5188]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:28 np0005546909.novalocal python3[5212]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:29 np0005546909.novalocal sudo[5236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjfddwqvfhpcndlubxoyobzmvjudvtrp ; /usr/bin/python3'
Dec 05 11:04:29 np0005546909.novalocal sudo[5236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:30 np0005546909.novalocal python3[5238]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:30 np0005546909.novalocal sudo[5236]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:30 np0005546909.novalocal sudo[5314]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeekiqhgzyewylmvexbcitkqcppaofge ; /usr/bin/python3'
Dec 05 11:04:30 np0005546909.novalocal sudo[5314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:30 np0005546909.novalocal python3[5316]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:04:30 np0005546909.novalocal sudo[5314]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:30 np0005546909.novalocal sudo[5387]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoogonhvzskvehcipvsbstncjprfzgge ; /usr/bin/python3'
Dec 05 11:04:30 np0005546909.novalocal sudo[5387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:31 np0005546909.novalocal python3[5389]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764932670.2422287-21-263598303897381/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:31 np0005546909.novalocal sudo[5387]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:31 np0005546909.novalocal python3[5437]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:31 np0005546909.novalocal python3[5461]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:32 np0005546909.novalocal python3[5485]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:32 np0005546909.novalocal python3[5509]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:32 np0005546909.novalocal python3[5533]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:33 np0005546909.novalocal python3[5557]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:33 np0005546909.novalocal python3[5581]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:33 np0005546909.novalocal python3[5605]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:33 np0005546909.novalocal python3[5629]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:34 np0005546909.novalocal python3[5653]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:34 np0005546909.novalocal python3[5677]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:34 np0005546909.novalocal python3[5701]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:35 np0005546909.novalocal python3[5725]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:35 np0005546909.novalocal python3[5749]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:35 np0005546909.novalocal python3[5773]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:35 np0005546909.novalocal python3[5797]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:36 np0005546909.novalocal python3[5823]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:36 np0005546909.novalocal python3[5847]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:36 np0005546909.novalocal python3[5871]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:36 np0005546909.novalocal sshd-session[5798]: Received disconnect from 189.47.10.39 port 40562:11: Bye Bye [preauth]
Dec 05 11:04:36 np0005546909.novalocal sshd-session[5798]: Disconnected from authenticating user root 189.47.10.39 port 40562 [preauth]
Dec 05 11:04:37 np0005546909.novalocal python3[5895]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:37 np0005546909.novalocal python3[5919]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:37 np0005546909.novalocal python3[5943]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:37 np0005546909.novalocal python3[5967]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:38 np0005546909.novalocal python3[5991]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:38 np0005546909.novalocal python3[6015]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:38 np0005546909.novalocal python3[6039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:04:41 np0005546909.novalocal sudo[6063]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccxdkpkzcejzqyhfdaiiizwwassouemn ; /usr/bin/python3'
Dec 05 11:04:41 np0005546909.novalocal sudo[6063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:41 np0005546909.novalocal python3[6065]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 05 11:04:41 np0005546909.novalocal systemd[1]: Starting Time & Date Service...
Dec 05 11:04:41 np0005546909.novalocal systemd[1]: Started Time & Date Service.
Dec 05 11:04:41 np0005546909.novalocal systemd-timedated[6067]: Changed time zone to 'UTC' (UTC).
Dec 05 11:04:41 np0005546909.novalocal sudo[6063]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:41 np0005546909.novalocal sudo[6094]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdlmgzhqptrswvcdgrubzahvasgptusv ; /usr/bin/python3'
Dec 05 11:04:41 np0005546909.novalocal sudo[6094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:41 np0005546909.novalocal python3[6096]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:41 np0005546909.novalocal sudo[6094]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:42 np0005546909.novalocal python3[6172]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:04:42 np0005546909.novalocal python3[6243]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764932681.9862192-153-249807561528417/source _original_basename=tmpbkrn64fr follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:43 np0005546909.novalocal python3[6343]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:04:43 np0005546909.novalocal python3[6414]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764932682.8548641-183-255391387919749/source _original_basename=tmpcmgjegsd follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:44 np0005546909.novalocal sudo[6514]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgnlszewlutbepnizzuplnxhrwieyway ; /usr/bin/python3'
Dec 05 11:04:44 np0005546909.novalocal sudo[6514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:44 np0005546909.novalocal python3[6516]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:04:44 np0005546909.novalocal sudo[6514]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:44 np0005546909.novalocal sudo[6587]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yylamqxjzmapyyzdmadyqfvuqqssjbzc ; /usr/bin/python3'
Dec 05 11:04:44 np0005546909.novalocal sudo[6587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:44 np0005546909.novalocal python3[6589]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764932684.003452-231-102883948401276/source _original_basename=tmphwegjqjq follow=False checksum=9afea3fa7e450257b25577284f0f4f0dfca88d28 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:44 np0005546909.novalocal sudo[6587]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:45 np0005546909.novalocal python3[6637]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:04:45 np0005546909.novalocal python3[6663]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:04:45 np0005546909.novalocal sudo[6741]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfvqroyznanxgzpnxmbcpnhlfwelvvvr ; /usr/bin/python3'
Dec 05 11:04:45 np0005546909.novalocal sudo[6741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:45 np0005546909.novalocal python3[6743]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:04:45 np0005546909.novalocal sudo[6741]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:46 np0005546909.novalocal sudo[6814]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cizdqoeoudpqzmfvhlylxcgumnwucrmh ; /usr/bin/python3'
Dec 05 11:04:46 np0005546909.novalocal sudo[6814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:46 np0005546909.novalocal python3[6816]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764932685.6772873-273-119285366837967/source _original_basename=tmpynbzqnj8 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:04:46 np0005546909.novalocal sudo[6814]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:46 np0005546909.novalocal sudo[6865]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvwbefgktvsxuonjdbvgfhewksdixdkq ; /usr/bin/python3'
Dec 05 11:04:46 np0005546909.novalocal sudo[6865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:04:46 np0005546909.novalocal python3[6867]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-4bf7-d06f-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:04:46 np0005546909.novalocal sudo[6865]: pam_unix(sudo:session): session closed for user root
Dec 05 11:04:47 np0005546909.novalocal python3[6895]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-4bf7-d06f-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 05 11:04:48 np0005546909.novalocal python3[6923]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:05:09 np0005546909.novalocal sudo[6947]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywbltcsrwusaspiyxflumqsceizqoggh ; /usr/bin/python3'
Dec 05 11:05:09 np0005546909.novalocal sudo[6947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:05:10 np0005546909.novalocal python3[6949]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:05:10 np0005546909.novalocal sudo[6947]: pam_unix(sudo:session): session closed for user root
Dec 05 11:05:11 np0005546909.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 05 11:05:34 np0005546909.novalocal sshd-session[6952]: Received disconnect from 197.248.8.33 port 50470:11: Bye Bye [preauth]
Dec 05 11:05:34 np0005546909.novalocal sshd-session[6952]: Disconnected from authenticating user root 197.248.8.33 port 50470 [preauth]
Dec 05 11:05:46 np0005546909.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 05 11:05:46 np0005546909.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 05 11:05:46 np0005546909.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 05 11:05:46 np0005546909.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 05 11:05:46 np0005546909.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 05 11:05:46 np0005546909.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 05 11:05:46 np0005546909.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 05 11:05:46 np0005546909.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 05 11:05:46 np0005546909.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 05 11:05:46 np0005546909.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.6721] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 05 11:05:46 np0005546909.novalocal systemd-udevd[6955]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.6939] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.6972] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.6976] device (eth1): carrier: link connected
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.6978] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.6986] policy: auto-activating connection 'Wired connection 1' (93f975fe-2181-3e88-a16c-de115f1f749a)
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.6993] device (eth1): Activation: starting connection 'Wired connection 1' (93f975fe-2181-3e88-a16c-de115f1f749a)
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.6994] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.6997] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.7001] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:05:46 np0005546909.novalocal NetworkManager[855]: <info>  [1764932746.7006] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 05 11:05:47 np0005546909.novalocal python3[6981]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-8d79-8267-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:05:52 np0005546909.novalocal sshd-session[6984]: Received disconnect from 41.94.88.49 port 58056:11: Bye Bye [preauth]
Dec 05 11:05:52 np0005546909.novalocal sshd-session[6984]: Disconnected from authenticating user root 41.94.88.49 port 58056 [preauth]
Dec 05 11:05:54 np0005546909.novalocal sudo[7061]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxmcxrtomdfgklrepkzlbfwluhijnyme ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 11:05:54 np0005546909.novalocal sudo[7061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:05:54 np0005546909.novalocal python3[7063]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:05:54 np0005546909.novalocal sudo[7061]: pam_unix(sudo:session): session closed for user root
Dec 05 11:05:54 np0005546909.novalocal sudo[7134]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtngdsmulpjflwntcmjhulqomkubdrqi ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 11:05:54 np0005546909.novalocal sudo[7134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:05:54 np0005546909.novalocal python3[7136]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764932754.1670415-102-53616069002325/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=539d5bbbf3bac426eed0d945c812bcc01cd40bad backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:05:54 np0005546909.novalocal sudo[7134]: pam_unix(sudo:session): session closed for user root
Dec 05 11:05:55 np0005546909.novalocal sudo[7184]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbxwciovnnthmvmqjkyiwywvphuvlxva ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 11:05:55 np0005546909.novalocal sudo[7184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:05:55 np0005546909.novalocal python3[7186]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[855]: <info>  [1764932755.7666] caught SIGTERM, shutting down normally.
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Stopping Network Manager...
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[855]: <info>  [1764932755.7676] dhcp4 (eth0): canceled DHCP transaction
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[855]: <info>  [1764932755.7677] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[855]: <info>  [1764932755.7677] dhcp4 (eth0): state changed no lease
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[855]: <info>  [1764932755.7679] manager: NetworkManager state is now CONNECTING
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[855]: <info>  [1764932755.7800] dhcp4 (eth1): canceled DHCP transaction
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[855]: <info>  [1764932755.7801] dhcp4 (eth1): state changed no lease
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[855]: <info>  [1764932755.7863] exiting (success)
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Stopped Network Manager.
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Starting Network Manager...
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.8537] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:f0f69436-bbfa-48e7-b73e-9b22f091bec6)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.8543] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.8611] manager[0x5623e83a1070]: monitoring kernel firmware directory '/lib/firmware'.
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Starting Hostname Service...
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Started Hostname Service.
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9324] hostname: hostname: using hostnamed
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9325] hostname: static hostname changed from (none) to "np0005546909.novalocal"
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9333] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9341] manager[0x5623e83a1070]: rfkill: Wi-Fi hardware radio set enabled
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9341] manager[0x5623e83a1070]: rfkill: WWAN hardware radio set enabled
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9388] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9388] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9390] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9391] manager: Networking is enabled by state file
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9395] settings: Loaded settings plugin: keyfile (internal)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9402] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9444] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9463] dhcp: init: Using DHCP client 'internal'
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9468] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9477] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9488] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9504] device (lo): Activation: starting connection 'lo' (0f41f2b7-1648-4484-8c07-96c2526b8b00)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9515] device (eth0): carrier: link connected
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9523] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9532] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9533] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9544] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9555] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9564] device (eth1): carrier: link connected
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9571] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9579] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (93f975fe-2181-3e88-a16c-de115f1f749a) (indicated)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9581] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9591] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9603] device (eth1): Activation: starting connection 'Wired connection 1' (93f975fe-2181-3e88-a16c-de115f1f749a)
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Started Network Manager.
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9614] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9623] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9627] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9629] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9633] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9637] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9641] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9645] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9650] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9660] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9664] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9679] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9684] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9713] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9722] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 05 11:05:55 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932755.9730] device (lo): Activation: successful, device activated.
Dec 05 11:05:55 np0005546909.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 05 11:05:56 np0005546909.novalocal sudo[7184]: pam_unix(sudo:session): session closed for user root
Dec 05 11:05:56 np0005546909.novalocal python3[7251]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-8d79-8267-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:05:56 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932756.8667] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Dec 05 11:05:56 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932756.8682] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 05 11:05:56 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932756.8766] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 11:05:56 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932756.8814] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 11:05:56 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932756.8815] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 11:05:56 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932756.8819] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 11:05:56 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932756.8823] device (eth0): Activation: successful, device activated.
Dec 05 11:05:56 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932756.8830] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 05 11:06:01 np0005546909.novalocal sshd-session[7273]: Received disconnect from 24.232.50.5 port 43828:11: Bye Bye [preauth]
Dec 05 11:06:01 np0005546909.novalocal sshd-session[7273]: Disconnected from authenticating user root 24.232.50.5 port 43828 [preauth]
Dec 05 11:06:06 np0005546909.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 11:06:25 np0005546909.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.3794] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 11:06:41 np0005546909.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 11:06:41 np0005546909.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4155] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4158] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4169] device (eth1): Activation: successful, device activated.
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4177] manager: startup complete
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4182] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <warn>  [1764932801.4189] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4199] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 05 11:06:41 np0005546909.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4322] dhcp4 (eth1): canceled DHCP transaction
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4323] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4323] dhcp4 (eth1): state changed no lease
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4340] policy: auto-activating connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c)
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4348] device (eth1): Activation: starting connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c)
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4349] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4352] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4362] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4372] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4824] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4826] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:06:41 np0005546909.novalocal NetworkManager[7198]: <info>  [1764932801.4832] device (eth1): Activation: successful, device activated.
Dec 05 11:06:51 np0005546909.novalocal sshd-session[7300]: Connection reset by authenticating user root 91.202.233.33 port 58390 [preauth]
Dec 05 11:06:51 np0005546909.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 11:06:52 np0005546909.novalocal sudo[7379]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kagoigbpdpwhlimtlhmnvhmjgovjqzyf ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 11:06:52 np0005546909.novalocal sudo[7379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:06:52 np0005546909.novalocal python3[7381]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:06:52 np0005546909.novalocal sudo[7379]: pam_unix(sudo:session): session closed for user root
Dec 05 11:06:52 np0005546909.novalocal sudo[7452]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stfpisehqrniottzhyhhwanysmxziyoy ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 11:06:52 np0005546909.novalocal sudo[7452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:06:52 np0005546909.novalocal python3[7454]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764932812.3509858-259-42282614824109/source _original_basename=tmpr6a0i03o follow=False checksum=9d1ae235b500c1f86ea6820f43fa64d4d77947c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:06:53 np0005546909.novalocal sudo[7452]: pam_unix(sudo:session): session closed for user root
Dec 05 11:06:53 np0005546909.novalocal sshd-session[7302]: Connection reset by authenticating user root 91.202.233.33 port 33422 [preauth]
Dec 05 11:06:56 np0005546909.novalocal sshd-session[7479]: Connection reset by authenticating user root 91.202.233.33 port 33432 [preauth]
Dec 05 11:06:58 np0005546909.novalocal sshd-session[7483]: Received disconnect from 197.248.8.33 port 51926:11: Bye Bye [preauth]
Dec 05 11:06:58 np0005546909.novalocal sshd-session[7483]: Disconnected from authenticating user root 197.248.8.33 port 51926 [preauth]
Dec 05 11:06:58 np0005546909.novalocal sshd-session[7481]: Connection reset by authenticating user root 91.202.233.33 port 33450 [preauth]
Dec 05 11:07:00 np0005546909.novalocal sshd-session[7485]: Connection reset by authenticating user root 91.202.233.33 port 33452 [preauth]
Dec 05 11:07:05 np0005546909.novalocal systemd[4300]: Starting Mark boot as successful...
Dec 05 11:07:05 np0005546909.novalocal systemd[4300]: Finished Mark boot as successful.
Dec 05 11:07:16 np0005546909.novalocal sshd-session[7488]: Received disconnect from 41.94.88.49 port 51616:11: Bye Bye [preauth]
Dec 05 11:07:16 np0005546909.novalocal sshd-session[7488]: Disconnected from authenticating user root 41.94.88.49 port 51616 [preauth]
Dec 05 11:07:48 np0005546909.novalocal sshd-session[7490]: Received disconnect from 24.232.50.5 port 40000:11: Bye Bye [preauth]
Dec 05 11:07:48 np0005546909.novalocal sshd-session[7490]: Disconnected from authenticating user root 24.232.50.5 port 40000 [preauth]
Dec 05 11:07:53 np0005546909.novalocal sshd-session[4309]: Received disconnect from 38.102.83.114 port 57032:11: disconnected by user
Dec 05 11:07:53 np0005546909.novalocal sshd-session[4309]: Disconnected from user zuul 38.102.83.114 port 57032
Dec 05 11:07:53 np0005546909.novalocal sshd-session[4296]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:07:53 np0005546909.novalocal systemd-logind[792]: Session 1 logged out. Waiting for processes to exit.
Dec 05 11:08:24 np0005546909.novalocal sshd-session[7492]: Invalid user ubuntu from 43.225.159.82 port 34358
Dec 05 11:08:24 np0005546909.novalocal sshd-session[7492]: Received disconnect from 43.225.159.82 port 34358:11:  [preauth]
Dec 05 11:08:24 np0005546909.novalocal sshd-session[7492]: Disconnected from invalid user ubuntu 43.225.159.82 port 34358 [preauth]
Dec 05 11:08:37 np0005546909.novalocal sshd-session[7494]: Received disconnect from 41.94.88.49 port 50816:11: Bye Bye [preauth]
Dec 05 11:08:37 np0005546909.novalocal sshd-session[7494]: Disconnected from authenticating user root 41.94.88.49 port 50816 [preauth]
Dec 05 11:10:00 np0005546909.novalocal sshd-session[7498]: Received disconnect from 41.94.88.49 port 54782:11: Bye Bye [preauth]
Dec 05 11:10:00 np0005546909.novalocal sshd-session[7498]: Disconnected from authenticating user root 41.94.88.49 port 54782 [preauth]
Dec 05 11:10:05 np0005546909.novalocal systemd[4300]: Created slice User Background Tasks Slice.
Dec 05 11:10:05 np0005546909.novalocal systemd[4300]: Starting Cleanup of User's Temporary Files and Directories...
Dec 05 11:10:05 np0005546909.novalocal systemd[4300]: Finished Cleanup of User's Temporary Files and Directories.
Dec 05 11:11:07 np0005546909.novalocal sshd-session[7503]: Received disconnect from 189.47.10.39 port 43734:11: Bye Bye [preauth]
Dec 05 11:11:07 np0005546909.novalocal sshd-session[7503]: Disconnected from authenticating user root 189.47.10.39 port 43734 [preauth]
Dec 05 11:11:24 np0005546909.novalocal sshd-session[7505]: Received disconnect from 41.94.88.49 port 38558:11: Bye Bye [preauth]
Dec 05 11:11:24 np0005546909.novalocal sshd-session[7505]: Disconnected from authenticating user root 41.94.88.49 port 38558 [preauth]
Dec 05 11:12:47 np0005546909.novalocal sshd-session[7507]: Received disconnect from 41.94.88.49 port 36072:11: Bye Bye [preauth]
Dec 05 11:12:47 np0005546909.novalocal sshd-session[7507]: Disconnected from authenticating user root 41.94.88.49 port 36072 [preauth]
Dec 05 11:14:09 np0005546909.novalocal sshd-session[7512]: Accepted publickey for zuul from 38.102.83.114 port 58902 ssh2: RSA SHA256:R78ri56Dxdg65lTd62kgZ+VtfOSW6yNvlxzwjlCA1bM
Dec 05 11:14:09 np0005546909.novalocal systemd-logind[792]: New session 3 of user zuul.
Dec 05 11:14:09 np0005546909.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 05 11:14:09 np0005546909.novalocal sshd-session[7512]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:14:09 np0005546909.novalocal sudo[7539]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trfbdfsuccikugzgohlbxyehbqirhunc ; /usr/bin/python3'
Dec 05 11:14:09 np0005546909.novalocal sudo[7539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:09 np0005546909.novalocal sshd-session[7509]: Received disconnect from 41.94.88.49 port 36000:11: Bye Bye [preauth]
Dec 05 11:14:09 np0005546909.novalocal sshd-session[7509]: Disconnected from authenticating user root 41.94.88.49 port 36000 [preauth]
Dec 05 11:14:09 np0005546909.novalocal python3[7541]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163e3b-3c83-9a9b-9f2e-000000001cdc-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:14:09 np0005546909.novalocal sudo[7539]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:09 np0005546909.novalocal sudo[7568]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpgwlvwsaniteemftcekpcefbhbvqszj ; /usr/bin/python3'
Dec 05 11:14:09 np0005546909.novalocal sudo[7568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:09 np0005546909.novalocal python3[7570]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:14:09 np0005546909.novalocal sudo[7568]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:09 np0005546909.novalocal sudo[7594]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trxbfwpuqnlgamviglhqfxcvskqzvpch ; /usr/bin/python3'
Dec 05 11:14:09 np0005546909.novalocal sudo[7594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:10 np0005546909.novalocal python3[7596]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:14:10 np0005546909.novalocal sudo[7594]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:10 np0005546909.novalocal sudo[7620]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flqryzhqldbgfxjfrukxfdjfyjnjjxfv ; /usr/bin/python3'
Dec 05 11:14:10 np0005546909.novalocal sudo[7620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:10 np0005546909.novalocal python3[7622]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:14:10 np0005546909.novalocal sudo[7620]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:10 np0005546909.novalocal sudo[7646]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovfgxgtcrydfevizhqysvdpfdnrzhktr ; /usr/bin/python3'
Dec 05 11:14:10 np0005546909.novalocal sudo[7646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:10 np0005546909.novalocal python3[7648]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:14:10 np0005546909.novalocal sudo[7646]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:11 np0005546909.novalocal sudo[7672]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dchivtkxyoeupfwyqofmdvslyaxdzuwe ; /usr/bin/python3'
Dec 05 11:14:11 np0005546909.novalocal sudo[7672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:11 np0005546909.novalocal python3[7674]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:14:11 np0005546909.novalocal sudo[7672]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:11 np0005546909.novalocal sudo[7750]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dchcxjmmitsykqxukdmdkqwqkqnxeubk ; /usr/bin/python3'
Dec 05 11:14:11 np0005546909.novalocal sudo[7750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:11 np0005546909.novalocal python3[7752]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:14:11 np0005546909.novalocal sudo[7750]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:11 np0005546909.novalocal sudo[7823]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkxnnjzpcmmptjhstqgeiiwapdwaarwl ; /usr/bin/python3'
Dec 05 11:14:11 np0005546909.novalocal sudo[7823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:12 np0005546909.novalocal python3[7825]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764933251.4433565-480-270729542744686/source _original_basename=tmpnqxdp_4x follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:14:12 np0005546909.novalocal sudo[7823]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:12 np0005546909.novalocal sudo[7873]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiprvywamjvsbkndyeuadvdvbiyjjias ; /usr/bin/python3'
Dec 05 11:14:12 np0005546909.novalocal sudo[7873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:13 np0005546909.novalocal python3[7875]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:14:13 np0005546909.novalocal systemd[1]: Reloading.
Dec 05 11:14:13 np0005546909.novalocal systemd-rc-local-generator[7897]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:14:13 np0005546909.novalocal sudo[7873]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:14 np0005546909.novalocal sudo[7929]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhzefbtibrbtvukasrusrgarnzdfflmb ; /usr/bin/python3'
Dec 05 11:14:14 np0005546909.novalocal sudo[7929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:14 np0005546909.novalocal python3[7931]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 05 11:14:14 np0005546909.novalocal sudo[7929]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:14 np0005546909.novalocal sudo[7955]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lquqfpxharblevvbatsxjiumpmagnqur ; /usr/bin/python3'
Dec 05 11:14:14 np0005546909.novalocal sudo[7955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:15 np0005546909.novalocal python3[7957]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:14:15 np0005546909.novalocal sudo[7955]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:15 np0005546909.novalocal sudo[7983]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pepywrjbaopvihcswriofelwdwtzswjv ; /usr/bin/python3'
Dec 05 11:14:15 np0005546909.novalocal sudo[7983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:15 np0005546909.novalocal python3[7985]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:14:15 np0005546909.novalocal sudo[7983]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:15 np0005546909.novalocal sudo[8011]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzsojigpakqktgghstdxjkigxzqwoadm ; /usr/bin/python3'
Dec 05 11:14:15 np0005546909.novalocal sudo[8011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:15 np0005546909.novalocal python3[8013]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:14:15 np0005546909.novalocal sudo[8011]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:15 np0005546909.novalocal sudo[8039]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyrzrahisgkymujiyhvtaydqcjjbfqkh ; /usr/bin/python3'
Dec 05 11:14:15 np0005546909.novalocal sudo[8039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:15 np0005546909.novalocal python3[8041]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:14:15 np0005546909.novalocal sudo[8039]: pam_unix(sudo:session): session closed for user root
Dec 05 11:14:16 np0005546909.novalocal python3[8068]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-9a9b-9f2e-000000001ce3-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:14:16 np0005546909.novalocal python3[8098]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 05 11:14:18 np0005546909.novalocal sshd-session[7515]: Connection closed by 38.102.83.114 port 58902
Dec 05 11:14:18 np0005546909.novalocal sshd-session[7512]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:14:18 np0005546909.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 05 11:14:18 np0005546909.novalocal systemd[1]: session-3.scope: Consumed 3.918s CPU time.
Dec 05 11:14:18 np0005546909.novalocal systemd-logind[792]: Session 3 logged out. Waiting for processes to exit.
Dec 05 11:14:18 np0005546909.novalocal systemd-logind[792]: Removed session 3.
Dec 05 11:14:20 np0005546909.novalocal sshd-session[8104]: Accepted publickey for zuul from 38.102.83.114 port 36724 ssh2: RSA SHA256:R78ri56Dxdg65lTd62kgZ+VtfOSW6yNvlxzwjlCA1bM
Dec 05 11:14:20 np0005546909.novalocal systemd-logind[792]: New session 4 of user zuul.
Dec 05 11:14:20 np0005546909.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 05 11:14:20 np0005546909.novalocal sshd-session[8104]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:14:20 np0005546909.novalocal sudo[8131]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkvcbknsgcpkscsqbjqvfwmemojizqvg ; /usr/bin/python3'
Dec 05 11:14:20 np0005546909.novalocal sudo[8131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:14:20 np0005546909.novalocal python3[8133]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 05 11:14:34 np0005546909.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 05 11:14:34 np0005546909.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:14:34 np0005546909.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 05 11:14:34 np0005546909.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:14:34 np0005546909.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:14:34 np0005546909.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:14:34 np0005546909.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:14:34 np0005546909.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:14:44 np0005546909.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 05 11:14:44 np0005546909.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:14:44 np0005546909.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 05 11:14:44 np0005546909.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:14:44 np0005546909.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:14:44 np0005546909.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:14:44 np0005546909.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:14:44 np0005546909.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:14:54 np0005546909.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 05 11:14:54 np0005546909.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:14:54 np0005546909.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 05 11:14:54 np0005546909.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:14:54 np0005546909.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:14:54 np0005546909.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:14:54 np0005546909.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:14:54 np0005546909.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:14:55 np0005546909.novalocal setsebool[8200]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 05 11:14:55 np0005546909.novalocal setsebool[8200]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 05 11:15:07 np0005546909.novalocal kernel: SELinux:  Converting 388 SID table entries...
Dec 05 11:15:07 np0005546909.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:15:07 np0005546909.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 05 11:15:07 np0005546909.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:15:07 np0005546909.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:15:07 np0005546909.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:15:07 np0005546909.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:15:07 np0005546909.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:15:25 np0005546909.novalocal dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 05 11:15:25 np0005546909.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 11:15:25 np0005546909.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 05 11:15:25 np0005546909.novalocal systemd[1]: Reloading.
Dec 05 11:15:25 np0005546909.novalocal systemd-rc-local-generator[8954]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:15:25 np0005546909.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 11:15:26 np0005546909.novalocal sudo[8131]: pam_unix(sudo:session): session closed for user root
Dec 05 11:15:33 np0005546909.novalocal sshd-session[13854]: Received disconnect from 41.94.88.49 port 46936:11: Bye Bye [preauth]
Dec 05 11:15:33 np0005546909.novalocal sshd-session[13854]: Disconnected from authenticating user root 41.94.88.49 port 46936 [preauth]
Dec 05 11:15:45 np0005546909.novalocal irqbalance[790]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 05 11:15:45 np0005546909.novalocal irqbalance[790]: IRQ 27 affinity is now unmanaged
Dec 05 11:15:51 np0005546909.novalocal python3[21365]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-af62-e437-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:15:52 np0005546909.novalocal kernel: evm: overlay not supported
Dec 05 11:15:52 np0005546909.novalocal systemd[4300]: Starting D-Bus User Message Bus...
Dec 05 11:15:52 np0005546909.novalocal dbus-broker-launch[21795]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 05 11:15:52 np0005546909.novalocal dbus-broker-launch[21795]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 05 11:15:52 np0005546909.novalocal systemd[4300]: Started D-Bus User Message Bus.
Dec 05 11:15:52 np0005546909.novalocal dbus-broker-lau[21795]: Ready
Dec 05 11:15:52 np0005546909.novalocal systemd[4300]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 05 11:15:52 np0005546909.novalocal systemd[4300]: Created slice Slice /user.
Dec 05 11:15:52 np0005546909.novalocal systemd[4300]: podman-21723.scope: unit configures an IP firewall, but not running as root.
Dec 05 11:15:52 np0005546909.novalocal systemd[4300]: (This warning is only shown for the first unit using IP firewalling.)
Dec 05 11:15:52 np0005546909.novalocal systemd[4300]: Started podman-21723.scope.
Dec 05 11:15:52 np0005546909.novalocal systemd[4300]: Started podman-pause-972d2d54.scope.
Dec 05 11:15:54 np0005546909.novalocal sudo[22473]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqzwajhmihukjizbteclgxncduxvqzwc ; /usr/bin/python3'
Dec 05 11:15:54 np0005546909.novalocal sudo[22473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:15:54 np0005546909.novalocal python3[22486]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.150:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.150:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:15:54 np0005546909.novalocal python3[22486]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 05 11:15:54 np0005546909.novalocal sudo[22473]: pam_unix(sudo:session): session closed for user root
Dec 05 11:15:54 np0005546909.novalocal sshd-session[8107]: Connection closed by 38.102.83.114 port 36724
Dec 05 11:15:54 np0005546909.novalocal sshd-session[8104]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:15:54 np0005546909.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 05 11:15:54 np0005546909.novalocal systemd[1]: session-4.scope: Consumed 1min 2.340s CPU time.
Dec 05 11:15:54 np0005546909.novalocal systemd-logind[792]: Session 4 logged out. Waiting for processes to exit.
Dec 05 11:15:54 np0005546909.novalocal systemd-logind[792]: Removed session 4.
Dec 05 11:16:14 np0005546909.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 11:16:14 np0005546909.novalocal systemd[1]: Finished man-db-cache-update.service.
Dec 05 11:16:14 np0005546909.novalocal systemd[1]: man-db-cache-update.service: Consumed 1min 985ms CPU time.
Dec 05 11:16:14 np0005546909.novalocal systemd[1]: run-r05ad47f3fc23409bba8855ead1a4753d.service: Deactivated successfully.
Dec 05 11:16:15 np0005546909.novalocal sshd-session[29671]: Unable to negotiate with 38.102.83.23 port 41714: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 05 11:16:15 np0005546909.novalocal sshd-session[29672]: Unable to negotiate with 38.102.83.23 port 41716: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 05 11:16:15 np0005546909.novalocal sshd-session[29674]: Connection closed by 38.102.83.23 port 41684 [preauth]
Dec 05 11:16:15 np0005546909.novalocal sshd-session[29673]: Unable to negotiate with 38.102.83.23 port 41718: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 05 11:16:15 np0005546909.novalocal sshd-session[29675]: Connection closed by 38.102.83.23 port 41698 [preauth]
Dec 05 11:16:16 np0005546909.novalocal sshd-session[29428]: Connection reset by authenticating user root 45.140.17.124 port 42220 [preauth]
Dec 05 11:16:19 np0005546909.novalocal sshd-session[29681]: Connection reset by authenticating user root 45.140.17.124 port 42224 [preauth]
Dec 05 11:16:20 np0005546909.novalocal sshd-session[29685]: Accepted publickey for zuul from 38.102.83.114 port 46968 ssh2: RSA SHA256:R78ri56Dxdg65lTd62kgZ+VtfOSW6yNvlxzwjlCA1bM
Dec 05 11:16:20 np0005546909.novalocal systemd-logind[792]: New session 5 of user zuul.
Dec 05 11:16:20 np0005546909.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 05 11:16:20 np0005546909.novalocal sshd-session[29685]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:16:20 np0005546909.novalocal python3[29712]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPxMqPca2AFFkthc+SJB5KQce9qhUGI8TmxjbFm+QVrTZb5Aso+AhgnMC72fIw5eFURySQzpF/Zz36QB1PAho/E= zuul@np0005546908.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:16:21 np0005546909.novalocal sudo[29736]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzblfxaufvtfvqyqevxqxvvoaljoxtht ; /usr/bin/python3'
Dec 05 11:16:21 np0005546909.novalocal sudo[29736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:16:21 np0005546909.novalocal python3[29738]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPxMqPca2AFFkthc+SJB5KQce9qhUGI8TmxjbFm+QVrTZb5Aso+AhgnMC72fIw5eFURySQzpF/Zz36QB1PAho/E= zuul@np0005546908.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:16:21 np0005546909.novalocal sudo[29736]: pam_unix(sudo:session): session closed for user root
Dec 05 11:16:21 np0005546909.novalocal sshd-session[29683]: Connection reset by authenticating user root 45.140.17.124 port 42234 [preauth]
Dec 05 11:16:22 np0005546909.novalocal sudo[29764]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzujeaagsbfxceevvulkfoiwaggiseha ; /usr/bin/python3'
Dec 05 11:16:22 np0005546909.novalocal sudo[29764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:16:22 np0005546909.novalocal python3[29766]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005546909.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 05 11:16:22 np0005546909.novalocal useradd[29768]: new group: name=cloud-admin, GID=1002
Dec 05 11:16:22 np0005546909.novalocal useradd[29768]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 05 11:16:22 np0005546909.novalocal sudo[29764]: pam_unix(sudo:session): session closed for user root
Dec 05 11:16:22 np0005546909.novalocal sudo[29798]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvsotliubgqrtlfnuledvuiadnalmody ; /usr/bin/python3'
Dec 05 11:16:22 np0005546909.novalocal sudo[29798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:16:22 np0005546909.novalocal python3[29800]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPxMqPca2AFFkthc+SJB5KQce9qhUGI8TmxjbFm+QVrTZb5Aso+AhgnMC72fIw5eFURySQzpF/Zz36QB1PAho/E= zuul@np0005546908.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 11:16:22 np0005546909.novalocal sudo[29798]: pam_unix(sudo:session): session closed for user root
Dec 05 11:16:23 np0005546909.novalocal sudo[29876]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaboaleiyaydayijxcscehunpiqwhzxd ; /usr/bin/python3'
Dec 05 11:16:23 np0005546909.novalocal sudo[29876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:16:23 np0005546909.novalocal python3[29878]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:16:23 np0005546909.novalocal sudo[29876]: pam_unix(sudo:session): session closed for user root
Dec 05 11:16:23 np0005546909.novalocal sudo[29949]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axawvugjtmecyjkwovhqzgcxpbcrrjbf ; /usr/bin/python3'
Dec 05 11:16:23 np0005546909.novalocal sudo[29949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:16:23 np0005546909.novalocal python3[29951]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764933383.0035822-135-108564735354486/source _original_basename=tmp3x4q2l93 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:16:23 np0005546909.novalocal sudo[29949]: pam_unix(sudo:session): session closed for user root
Dec 05 11:16:24 np0005546909.novalocal sudo[29999]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jutqawppwzorqqseiuoeqajvkycxnvpf ; /usr/bin/python3'
Dec 05 11:16:24 np0005546909.novalocal sudo[29999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:16:24 np0005546909.novalocal sshd-session[29739]: Connection reset by authenticating user root 45.140.17.124 port 49000 [preauth]
Dec 05 11:16:24 np0005546909.novalocal python3[30001]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec 05 11:16:24 np0005546909.novalocal systemd[1]: Starting Hostname Service...
Dec 05 11:16:24 np0005546909.novalocal systemd[1]: Started Hostname Service.
Dec 05 11:16:24 np0005546909.novalocal systemd-hostnamed[30006]: Changed pretty hostname to 'compute-0'
Dec 05 11:16:24 compute-0 systemd-hostnamed[30006]: Hostname set to <compute-0> (static)
Dec 05 11:16:24 compute-0 NetworkManager[7198]: <info>  [1764933384.8871] hostname: static hostname changed from "np0005546909.novalocal" to "compute-0"
Dec 05 11:16:24 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 11:16:24 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 11:16:24 compute-0 sudo[29999]: pam_unix(sudo:session): session closed for user root
Dec 05 11:16:25 compute-0 sshd-session[29688]: Connection closed by 38.102.83.114 port 46968
Dec 05 11:16:25 compute-0 sshd-session[29685]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:16:25 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Dec 05 11:16:25 compute-0 systemd[1]: session-5.scope: Consumed 2.548s CPU time.
Dec 05 11:16:25 compute-0 systemd-logind[792]: Session 5 logged out. Waiting for processes to exit.
Dec 05 11:16:25 compute-0 systemd-logind[792]: Removed session 5.
Dec 05 11:16:27 compute-0 sshd-session[30002]: Connection reset by authenticating user root 45.140.17.124 port 49002 [preauth]
Dec 05 11:16:34 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 11:16:54 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 11:17:00 compute-0 sshd-session[30025]: Received disconnect from 41.94.88.49 port 54688:11: Bye Bye [preauth]
Dec 05 11:17:00 compute-0 sshd-session[30025]: Disconnected from authenticating user root 41.94.88.49 port 54688 [preauth]
Dec 05 11:17:08 compute-0 sshd-session[30027]: Received disconnect from 189.47.10.39 port 48226:11: Bye Bye [preauth]
Dec 05 11:17:08 compute-0 sshd-session[30027]: Disconnected from authenticating user root 189.47.10.39 port 48226 [preauth]
Dec 05 11:18:26 compute-0 sshd-session[30033]: Received disconnect from 41.94.88.49 port 54098:11: Bye Bye [preauth]
Dec 05 11:18:26 compute-0 sshd-session[30033]: Disconnected from authenticating user root 41.94.88.49 port 54098 [preauth]
Dec 05 11:19:01 compute-0 sshd-session[30035]: Received disconnect from 193.46.255.159 port 40642:11:  [preauth]
Dec 05 11:19:01 compute-0 sshd-session[30035]: Disconnected from authenticating user root 193.46.255.159 port 40642 [preauth]
Dec 05 11:19:01 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 05 11:19:01 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 05 11:19:01 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 05 11:19:01 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 05 11:19:49 compute-0 sshd-session[30039]: Received disconnect from 41.94.88.49 port 56652:11: Bye Bye [preauth]
Dec 05 11:19:49 compute-0 sshd-session[30039]: Disconnected from authenticating user root 41.94.88.49 port 56652 [preauth]
Dec 05 11:20:13 compute-0 sshd-session[30042]: Received disconnect from 189.47.10.39 port 45480:11: Bye Bye [preauth]
Dec 05 11:20:13 compute-0 sshd-session[30042]: Disconnected from authenticating user root 189.47.10.39 port 45480 [preauth]
Dec 05 11:20:58 compute-0 sshd-session[30044]: Accepted publickey for zuul from 38.102.83.23 port 59844 ssh2: RSA SHA256:R78ri56Dxdg65lTd62kgZ+VtfOSW6yNvlxzwjlCA1bM
Dec 05 11:20:58 compute-0 systemd-logind[792]: New session 6 of user zuul.
Dec 05 11:20:58 compute-0 systemd[1]: Started Session 6 of User zuul.
Dec 05 11:20:58 compute-0 sshd-session[30044]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:20:59 compute-0 python3[30120]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:21:00 compute-0 sudo[30234]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyjebfkdwgilyecviehzwbhummtclzvz ; /usr/bin/python3'
Dec 05 11:21:00 compute-0 sudo[30234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:00 compute-0 python3[30236]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:21:00 compute-0 sudo[30234]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:01 compute-0 sudo[30307]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqmaxtmwudotpdlfegruirovkmtkegoi ; /usr/bin/python3'
Dec 05 11:21:01 compute-0 sudo[30307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:01 compute-0 python3[30309]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:21:01 compute-0 sudo[30307]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:01 compute-0 sudo[30333]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejwgsjlbdblfkksvkibaauxecvklwtvp ; /usr/bin/python3'
Dec 05 11:21:01 compute-0 sudo[30333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:01 compute-0 python3[30335]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:21:01 compute-0 sudo[30333]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:01 compute-0 sudo[30406]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjwezslwyfxwjsgnkfiljxheyrbagkts ; /usr/bin/python3'
Dec 05 11:21:01 compute-0 sudo[30406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:02 compute-0 python3[30408]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:21:02 compute-0 sudo[30406]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:02 compute-0 sudo[30432]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crtkfkwdtsjudwbfogusfjmaqegfllnw ; /usr/bin/python3'
Dec 05 11:21:02 compute-0 sudo[30432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:02 compute-0 python3[30434]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:21:02 compute-0 sudo[30432]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:02 compute-0 sudo[30505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvoaaflnrgifgohqkqchqfqnprgtifox ; /usr/bin/python3'
Dec 05 11:21:02 compute-0 sudo[30505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:02 compute-0 python3[30507]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:21:02 compute-0 sudo[30505]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:02 compute-0 sudo[30531]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsgvahrwhpunocejplvuonmcutltwmhd ; /usr/bin/python3'
Dec 05 11:21:02 compute-0 sudo[30531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:02 compute-0 python3[30533]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:21:02 compute-0 sudo[30531]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:03 compute-0 sudo[30604]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqvscnbhlywtpegrcdgdbsbbeqfnmaqt ; /usr/bin/python3'
Dec 05 11:21:03 compute-0 sudo[30604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:03 compute-0 python3[30606]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:21:03 compute-0 sudo[30604]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:03 compute-0 sudo[30630]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzcpsmqfuqzgeudilzxljnqxosbvdfbw ; /usr/bin/python3'
Dec 05 11:21:03 compute-0 sudo[30630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:03 compute-0 python3[30632]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:21:03 compute-0 sudo[30630]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:03 compute-0 sudo[30703]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvkaggepagwqksfajeyfjdtdkipydjsy ; /usr/bin/python3'
Dec 05 11:21:03 compute-0 sudo[30703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:03 compute-0 python3[30705]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:21:03 compute-0 sudo[30703]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:04 compute-0 sudo[30729]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giujslenbxmgrnacmbqerafenygojdcr ; /usr/bin/python3'
Dec 05 11:21:04 compute-0 sudo[30729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:04 compute-0 python3[30731]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:21:04 compute-0 sudo[30729]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:04 compute-0 sudo[30802]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asjzgukfiinxkitqaitfdcxnrncirilz ; /usr/bin/python3'
Dec 05 11:21:04 compute-0 sudo[30802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:04 compute-0 python3[30804]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:21:04 compute-0 sudo[30802]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:04 compute-0 sudo[30828]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfujmrhqsjwgoocxygvsbmjknzkyiing ; /usr/bin/python3'
Dec 05 11:21:04 compute-0 sudo[30828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:04 compute-0 python3[30830]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 11:21:04 compute-0 sudo[30828]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:05 compute-0 sudo[30901]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cujwngbgpaqlxwgllbdnfvdvcgsswyvm ; /usr/bin/python3'
Dec 05 11:21:05 compute-0 sudo[30901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:21:05 compute-0 python3[30903]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:21:05 compute-0 sudo[30901]: pam_unix(sudo:session): session closed for user root
Dec 05 11:21:07 compute-0 sshd-session[30928]: Unable to negotiate with 192.168.122.11 port 52120: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 05 11:21:07 compute-0 sshd-session[30929]: Connection closed by 192.168.122.11 port 52074 [preauth]
Dec 05 11:21:07 compute-0 sshd-session[30931]: Connection closed by 192.168.122.11 port 52082 [preauth]
Dec 05 11:21:07 compute-0 sshd-session[30932]: Unable to negotiate with 192.168.122.11 port 52092: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 05 11:21:07 compute-0 sshd-session[30930]: Unable to negotiate with 192.168.122.11 port 52108: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 05 11:22:43 compute-0 python3[30961]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:24:43 compute-0 sshd-session[30964]: Received disconnect from 189.47.10.39 port 56120:11: Bye Bye [preauth]
Dec 05 11:24:43 compute-0 sshd-session[30964]: Disconnected from authenticating user root 189.47.10.39 port 56120 [preauth]
Dec 05 11:27:43 compute-0 sshd-session[30047]: Received disconnect from 38.102.83.23 port 59844:11: disconnected by user
Dec 05 11:27:43 compute-0 sshd-session[30047]: Disconnected from user zuul 38.102.83.23 port 59844
Dec 05 11:27:43 compute-0 sshd-session[30044]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:27:43 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Dec 05 11:27:43 compute-0 systemd[1]: session-6.scope: Consumed 5.276s CPU time.
Dec 05 11:27:43 compute-0 systemd-logind[792]: Session 6 logged out. Waiting for processes to exit.
Dec 05 11:27:43 compute-0 systemd-logind[792]: Removed session 6.
Dec 05 11:30:41 compute-0 sshd-session[30969]: Connection reset by authenticating user root 91.202.233.33 port 55454 [preauth]
Dec 05 11:30:43 compute-0 sshd-session[30971]: Connection reset by authenticating user root 91.202.233.33 port 22032 [preauth]
Dec 05 11:30:47 compute-0 sshd-session[30973]: Connection reset by authenticating user root 91.202.233.33 port 22038 [preauth]
Dec 05 11:30:49 compute-0 sshd-session[30975]: Connection reset by authenticating user root 91.202.233.33 port 22044 [preauth]
Dec 05 11:30:53 compute-0 sshd-session[30977]: Connection reset by authenticating user root 91.202.233.33 port 22048 [preauth]
Dec 05 11:33:44 compute-0 sshd-session[30980]: Received disconnect from 189.47.10.39 port 33030:11: Bye Bye [preauth]
Dec 05 11:33:44 compute-0 sshd-session[30980]: Disconnected from authenticating user root 189.47.10.39 port 33030 [preauth]
Dec 05 11:34:23 compute-0 sshd-session[30982]: Accepted publickey for zuul from 192.168.122.30 port 35174 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:34:23 compute-0 systemd-logind[792]: New session 7 of user zuul.
Dec 05 11:34:23 compute-0 systemd[1]: Started Session 7 of User zuul.
Dec 05 11:34:23 compute-0 sshd-session[30982]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:34:24 compute-0 python3.9[31135]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:34:26 compute-0 sudo[31314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srplugbfrucksrnzzdcoqmgwctypguus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934465.5743763-32-175481268941661/AnsiballZ_command.py'
Dec 05 11:34:26 compute-0 sudo[31314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:34:26 compute-0 python3.9[31316]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:34:33 compute-0 sudo[31314]: pam_unix(sudo:session): session closed for user root
Dec 05 11:34:33 compute-0 sshd-session[30985]: Connection closed by 192.168.122.30 port 35174
Dec 05 11:34:33 compute-0 sshd-session[30982]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:34:33 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Dec 05 11:34:33 compute-0 systemd[1]: session-7.scope: Consumed 8.085s CPU time.
Dec 05 11:34:33 compute-0 systemd-logind[792]: Session 7 logged out. Waiting for processes to exit.
Dec 05 11:34:33 compute-0 systemd-logind[792]: Removed session 7.
Dec 05 11:34:39 compute-0 sshd-session[31375]: Accepted publickey for zuul from 192.168.122.30 port 33884 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:34:39 compute-0 systemd-logind[792]: New session 8 of user zuul.
Dec 05 11:34:39 compute-0 systemd[1]: Started Session 8 of User zuul.
Dec 05 11:34:39 compute-0 sshd-session[31375]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:34:40 compute-0 python3.9[31528]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:34:40 compute-0 sshd-session[31378]: Connection closed by 192.168.122.30 port 33884
Dec 05 11:34:40 compute-0 sshd-session[31375]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:34:40 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Dec 05 11:34:40 compute-0 systemd-logind[792]: Session 8 logged out. Waiting for processes to exit.
Dec 05 11:34:40 compute-0 systemd-logind[792]: Removed session 8.
Dec 05 11:34:59 compute-0 sshd-session[31557]: Accepted publickey for zuul from 192.168.122.30 port 52568 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:34:59 compute-0 systemd-logind[792]: New session 9 of user zuul.
Dec 05 11:34:59 compute-0 systemd[1]: Started Session 9 of User zuul.
Dec 05 11:34:59 compute-0 sshd-session[31557]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:35:00 compute-0 python3.9[31710]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 05 11:35:01 compute-0 python3.9[31884]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:35:01 compute-0 sudo[32034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioyuzuvwxffnojodcjvyrbpcpdvnvcie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934501.6038318-45-209999189921045/AnsiballZ_command.py'
Dec 05 11:35:01 compute-0 sudo[32034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:02 compute-0 python3.9[32036]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:35:02 compute-0 sudo[32034]: pam_unix(sudo:session): session closed for user root
Dec 05 11:35:02 compute-0 sudo[32187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khjsqkckgbzkudhveqsaeuvtojfeqcjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934502.474469-57-20885210827758/AnsiballZ_stat.py'
Dec 05 11:35:02 compute-0 sudo[32187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:03 compute-0 python3.9[32189]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:35:03 compute-0 sudo[32187]: pam_unix(sudo:session): session closed for user root
Dec 05 11:35:03 compute-0 sudo[32341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bokvvcqrzfarctppqolebjrdohyznbmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934503.3225605-65-125842742616204/AnsiballZ_file.py'
Dec 05 11:35:03 compute-0 sudo[32341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:03 compute-0 python3.9[32343]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:35:03 compute-0 sudo[32341]: pam_unix(sudo:session): session closed for user root
Dec 05 11:35:04 compute-0 sshd-session[32266]: Received disconnect from 193.46.255.103 port 61990:11:  [preauth]
Dec 05 11:35:04 compute-0 sshd-session[32266]: Disconnected from authenticating user root 193.46.255.103 port 61990 [preauth]
Dec 05 11:35:04 compute-0 sudo[32493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwvxfffdotpgvqezxcarnyxlhwpzuccz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934504.1262367-73-61209407418530/AnsiballZ_stat.py'
Dec 05 11:35:04 compute-0 sudo[32493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:04 compute-0 python3.9[32495]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:35:04 compute-0 sudo[32493]: pam_unix(sudo:session): session closed for user root
Dec 05 11:35:05 compute-0 sudo[32616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmlqkgasjdtynvhergjqemvvqqpshbxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934504.1262367-73-61209407418530/AnsiballZ_copy.py'
Dec 05 11:35:05 compute-0 sudo[32616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:05 compute-0 python3.9[32618]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934504.1262367-73-61209407418530/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:35:05 compute-0 sudo[32616]: pam_unix(sudo:session): session closed for user root
Dec 05 11:35:06 compute-0 sudo[32768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxnrlbpnsmnlvunthzzmcorbyfmhpsfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934505.6999178-88-24754994830682/AnsiballZ_setup.py'
Dec 05 11:35:06 compute-0 sudo[32768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:06 compute-0 python3.9[32770]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:35:06 compute-0 sudo[32768]: pam_unix(sudo:session): session closed for user root
Dec 05 11:35:07 compute-0 sudo[32924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wablhobjbqjqszaawjgaosjgkmzmupvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934506.7625992-96-237163659458742/AnsiballZ_file.py'
Dec 05 11:35:07 compute-0 sudo[32924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:07 compute-0 python3.9[32926]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:35:07 compute-0 sudo[32924]: pam_unix(sudo:session): session closed for user root
Dec 05 11:35:07 compute-0 sudo[33076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvtkuwglojuxnbfntxplgzkkrfoovmkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934507.5287888-105-263286106767333/AnsiballZ_file.py'
Dec 05 11:35:07 compute-0 sudo[33076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:08 compute-0 python3.9[33078]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:35:08 compute-0 sudo[33076]: pam_unix(sudo:session): session closed for user root
Dec 05 11:35:08 compute-0 python3.9[33229]: ansible-ansible.builtin.service_facts Invoked
Dec 05 11:35:14 compute-0 python3.9[33482]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:35:15 compute-0 irqbalance[790]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 05 11:35:15 compute-0 irqbalance[790]: IRQ 26 affinity is now unmanaged
Dec 05 11:35:15 compute-0 python3.9[33632]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:35:16 compute-0 python3.9[33786]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:35:17 compute-0 sudo[33942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qupidtcvmstlttzvnlryllgsvexoajol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934517.1405659-153-30443742322369/AnsiballZ_setup.py'
Dec 05 11:35:17 compute-0 sudo[33942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:17 compute-0 python3.9[33944]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:35:18 compute-0 sudo[33942]: pam_unix(sudo:session): session closed for user root
Dec 05 11:35:18 compute-0 sudo[34026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewktkrjbttpvkusefijsqpnmtubgdrcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934517.1405659-153-30443742322369/AnsiballZ_dnf.py'
Dec 05 11:35:18 compute-0 sudo[34026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:35:18 compute-0 python3.9[34028]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:35:38 compute-0 sshd-session[34144]: Invalid user ubuntu from 43.225.159.82 port 53640
Dec 05 11:35:38 compute-0 sshd-session[34144]: Received disconnect from 43.225.159.82 port 53640:11:  [preauth]
Dec 05 11:35:38 compute-0 sshd-session[34144]: Disconnected from invalid user ubuntu 43.225.159.82 port 53640 [preauth]
Dec 05 11:36:02 compute-0 systemd[1]: Reloading.
Dec 05 11:36:02 compute-0 systemd-rc-local-generator[34225]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:36:02 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 05 11:36:02 compute-0 systemd[1]: Reloading.
Dec 05 11:36:02 compute-0 systemd-rc-local-generator[34267]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:36:02 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 05 11:36:02 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 05 11:36:02 compute-0 systemd[1]: Reloading.
Dec 05 11:36:03 compute-0 systemd-rc-local-generator[34307]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:36:03 compute-0 systemd[1]: Starting dnf makecache...
Dec 05 11:36:03 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 05 11:36:03 compute-0 dnf[34317]: Failed determining last makecache time.
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-barbican-42b4c41831408a8e323 165 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 214 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-cinder-1c00d6490d88e436f26ef 197 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-python-stevedore-c4acc5639fd2329372142 196 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-python-cloudkitty-tests-tempest-2c80f8 184 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-os-net-config-d0cedbdb788d43e5c7551df5 187 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 193 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-python-designate-tests-tempest-347fdbc 175 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-glance-1fd12c29b339f30fe823e 177 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 170 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-manila-3c01b7181572c95dac462 185 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-python-whitebox-neutron-tests-tempest- 174 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-octavia-ba397f07a7331190208c 175 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-watcher-c014f81a8647287f6dcc 177 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-ansible-config_template-5ccaa22121a7ff 185 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 181 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-swift-dc98a8463506ac520c469a 182 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-python-tempestconf-8515371b7cceebd4282 179 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: delorean-openstack-heat-ui-013accbfd179753bc3f0 180 kB/s | 3.0 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: CentOS Stream 9 - BaseOS                         78 kB/s | 7.3 kB     00:00
Dec 05 11:36:03 compute-0 dnf[34317]: CentOS Stream 9 - AppStream                      68 kB/s | 7.4 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: CentOS Stream 9 - CRB                            83 kB/s | 7.2 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: CentOS Stream 9 - Extras packages                67 kB/s | 8.3 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: dlrn-antelope-testing                           102 kB/s | 3.0 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: dlrn-antelope-build-deps                        111 kB/s | 3.0 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: centos9-rabbitmq                                 92 kB/s | 3.0 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: centos9-storage                                 108 kB/s | 3.0 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: centos9-opstools                                 97 kB/s | 3.0 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: NFV SIG OpenvSwitch                              23 kB/s | 3.0 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: repo-setup-centos-appstream                     126 kB/s | 4.4 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: repo-setup-centos-baseos                        169 kB/s | 3.9 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: repo-setup-centos-highavailability              176 kB/s | 3.9 kB     00:00
Dec 05 11:36:04 compute-0 dnf[34317]: repo-setup-centos-powertools                    219 kB/s | 4.3 kB     00:00
Dec 05 11:36:05 compute-0 dnf[34317]: Extra Packages for Enterprise Linux 9 - x86_64  241 kB/s |  30 kB     00:00
Dec 05 11:36:05 compute-0 dnf[34317]: Metadata cache created.
Dec 05 11:36:05 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 05 11:36:05 compute-0 systemd[1]: Finished dnf makecache.
Dec 05 11:36:05 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.765s CPU time.
Dec 05 11:37:07 compute-0 kernel: SELinux:  Converting 2718 SID table entries...
Dec 05 11:37:07 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:37:07 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 11:37:07 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:37:07 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:37:07 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:37:07 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:37:07 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:37:07 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 05 11:37:07 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 11:37:07 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 11:37:07 compute-0 systemd[1]: Reloading.
Dec 05 11:37:07 compute-0 systemd-rc-local-generator[34679]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:37:07 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 11:37:08 compute-0 sudo[34026]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:08 compute-0 sudo[35597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kywrzyjkiuwaivoccdqycdjkyvyaxals ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934628.3994415-165-270782636245339/AnsiballZ_command.py'
Dec 05 11:37:08 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 11:37:08 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 11:37:08 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.218s CPU time.
Dec 05 11:37:08 compute-0 systemd[1]: run-rb33580635abc4a4da3a2d8712f8bbec5.service: Deactivated successfully.
Dec 05 11:37:08 compute-0 sudo[35597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:08 compute-0 python3.9[35600]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:37:09 compute-0 sudo[35597]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:10 compute-0 sudo[35879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbmsyjxpblzpllbtmokwerdgofmwumzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934630.088372-173-273833886414757/AnsiballZ_selinux.py'
Dec 05 11:37:10 compute-0 sudo[35879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:11 compute-0 python3.9[35881]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 05 11:37:11 compute-0 sudo[35879]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:11 compute-0 sudo[36031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcjwmtkrlwabderogldehvpzosatsltq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934631.3809907-184-239500216710879/AnsiballZ_command.py'
Dec 05 11:37:11 compute-0 sudo[36031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:11 compute-0 python3.9[36033]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 05 11:37:12 compute-0 sudo[36031]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:13 compute-0 sudo[36184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypwfhdvmrpjuuwxndepytmfxniqodgfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934633.1490743-192-55777173814223/AnsiballZ_file.py'
Dec 05 11:37:13 compute-0 sudo[36184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:14 compute-0 python3.9[36186]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:37:14 compute-0 sudo[36184]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:15 compute-0 sudo[36337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktfwaihubzdukexzudprlerysqeianxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934634.6148965-200-8853323804883/AnsiballZ_mount.py'
Dec 05 11:37:15 compute-0 sudo[36337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:15 compute-0 python3.9[36339]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 05 11:37:15 compute-0 sudo[36337]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:16 compute-0 sudo[36489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvhefxhjuxouppkrhksxbtkdhgjraptd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934636.257507-228-56218955781123/AnsiballZ_file.py'
Dec 05 11:37:16 compute-0 sudo[36489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:16 compute-0 python3.9[36491]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:37:16 compute-0 sudo[36489]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:17 compute-0 sudo[36641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnuwtrgajaddcoaiexrdlslunmpwhzmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934636.9991395-236-241056937324363/AnsiballZ_stat.py'
Dec 05 11:37:17 compute-0 sudo[36641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:17 compute-0 python3.9[36643]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:37:17 compute-0 sudo[36641]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:17 compute-0 sudo[36764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqoycccrhdmdjjhpkrflygweffpypycz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934636.9991395-236-241056937324363/AnsiballZ_copy.py'
Dec 05 11:37:17 compute-0 sudo[36764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:18 compute-0 python3.9[36766]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934636.9991395-236-241056937324363/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:37:18 compute-0 sudo[36764]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:19 compute-0 sudo[36916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhaiwdjbzcztcpvgyakbmvgtiioztxmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934638.679304-260-238532098363670/AnsiballZ_stat.py'
Dec 05 11:37:19 compute-0 sudo[36916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:19 compute-0 python3.9[36918]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:37:19 compute-0 sudo[36916]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:19 compute-0 sudo[37068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shekoeysjscinspzhvwgkecjkdmkbmra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934639.4856791-268-210387366045631/AnsiballZ_command.py'
Dec 05 11:37:19 compute-0 sudo[37068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:20 compute-0 python3.9[37070]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:37:20 compute-0 sudo[37068]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:20 compute-0 sudo[37221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgujemgzgpsuccgopqetxtabmhitjbmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934640.2836769-276-153295671273809/AnsiballZ_file.py'
Dec 05 11:37:20 compute-0 sudo[37221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:23 compute-0 python3.9[37223]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:37:23 compute-0 sudo[37221]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:24 compute-0 sudo[37373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmjowphayxcnwbgrqfkvaosyekqpwxcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934643.6763546-287-235408429221848/AnsiballZ_getent.py'
Dec 05 11:37:24 compute-0 sudo[37373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:24 compute-0 python3.9[37375]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 05 11:37:24 compute-0 sudo[37373]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:24 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 11:37:24 compute-0 sudo[37527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jadancjkiwnsqivinftbwnsnvhlbvdbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934644.5395355-295-178942421588970/AnsiballZ_group.py'
Dec 05 11:37:24 compute-0 sudo[37527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:26 compute-0 python3.9[37529]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 11:37:26 compute-0 groupadd[37530]: group added to /etc/group: name=qemu, GID=107
Dec 05 11:37:26 compute-0 groupadd[37530]: group added to /etc/gshadow: name=qemu
Dec 05 11:37:26 compute-0 groupadd[37530]: new group: name=qemu, GID=107
Dec 05 11:37:26 compute-0 sudo[37527]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:27 compute-0 sudo[37685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwjzsbafusucazcowtycdlwrlfdwahdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934646.5759163-303-79075504822092/AnsiballZ_user.py'
Dec 05 11:37:27 compute-0 sudo[37685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:27 compute-0 python3.9[37687]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 11:37:27 compute-0 useradd[37689]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 11:37:27 compute-0 sudo[37685]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:27 compute-0 sudo[37845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxumcvklevdqmlwniijrukcaxmvwbnfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934647.6374676-311-190997925375318/AnsiballZ_getent.py'
Dec 05 11:37:27 compute-0 sudo[37845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:28 compute-0 python3.9[37847]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 05 11:37:28 compute-0 sudo[37845]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:28 compute-0 sudo[37998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzldpwxlepffnbgugkowuutiljzhgilp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934648.3462844-319-185259952233213/AnsiballZ_group.py'
Dec 05 11:37:28 compute-0 sudo[37998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:28 compute-0 python3.9[38000]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 11:37:28 compute-0 groupadd[38001]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 05 11:37:28 compute-0 groupadd[38001]: group added to /etc/gshadow: name=hugetlbfs
Dec 05 11:37:28 compute-0 groupadd[38001]: new group: name=hugetlbfs, GID=42477
Dec 05 11:37:28 compute-0 sudo[37998]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:29 compute-0 sudo[38156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brvkiijceusfxqssltkwstruvoadzdel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934649.1276984-328-190604947717977/AnsiballZ_file.py'
Dec 05 11:37:29 compute-0 sudo[38156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:29 compute-0 python3.9[38158]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 05 11:37:29 compute-0 sudo[38156]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:30 compute-0 sudo[38308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbwmlcalbkrzqccbctoyjnlhratutmdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934650.037279-339-194055930783174/AnsiballZ_dnf.py'
Dec 05 11:37:30 compute-0 sudo[38308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:30 compute-0 python3.9[38310]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:37:32 compute-0 sudo[38308]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:32 compute-0 sudo[38461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciencrcikckfcamcbpnvcgjggxuhyzod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934652.3771596-347-252151083902214/AnsiballZ_file.py'
Dec 05 11:37:32 compute-0 sudo[38461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:32 compute-0 python3.9[38463]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:37:32 compute-0 sudo[38461]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:33 compute-0 sudo[38613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avgawyvlnmfisdoqqtcfvdfcayfiumyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934653.0928617-355-256812939155680/AnsiballZ_stat.py'
Dec 05 11:37:33 compute-0 sudo[38613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:33 compute-0 python3.9[38615]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:37:33 compute-0 sudo[38613]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:34 compute-0 sudo[38736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igyellcewlswdgirmxomjxsgzzlpevjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934653.0928617-355-256812939155680/AnsiballZ_copy.py'
Dec 05 11:37:34 compute-0 sudo[38736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:34 compute-0 python3.9[38738]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934653.0928617-355-256812939155680/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:37:34 compute-0 sudo[38736]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:35 compute-0 sudo[38888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfcpvepcgbrqkjprmrspmxzpecvsztis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934654.447098-370-246219434568100/AnsiballZ_systemd.py'
Dec 05 11:37:35 compute-0 sudo[38888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:35 compute-0 python3.9[38890]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:37:35 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 05 11:37:35 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 05 11:37:35 compute-0 kernel: Bridge firewalling registered
Dec 05 11:37:35 compute-0 systemd-modules-load[38894]: Inserted module 'br_netfilter'
Dec 05 11:37:35 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 05 11:37:35 compute-0 sudo[38888]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:36 compute-0 sudo[39047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngdhzbdmcohfvttwpgzwbmoswnbmfpct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934655.6348865-378-104020133287761/AnsiballZ_stat.py'
Dec 05 11:37:36 compute-0 sudo[39047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:36 compute-0 python3.9[39049]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:37:36 compute-0 sudo[39047]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:36 compute-0 sudo[39170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdhfftisaazfvilxnautlffnwbbqamct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934655.6348865-378-104020133287761/AnsiballZ_copy.py'
Dec 05 11:37:36 compute-0 sudo[39170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:36 compute-0 python3.9[39172]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934655.6348865-378-104020133287761/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:37:36 compute-0 sudo[39170]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:37 compute-0 sudo[39322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgfucqnueijawxkndahucogbivapxzru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934657.1937761-396-257029614599762/AnsiballZ_dnf.py'
Dec 05 11:37:37 compute-0 sudo[39322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:37 compute-0 python3.9[39324]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:37:40 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec 05 11:37:40 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec 05 11:37:40 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 11:37:40 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 11:37:40 compute-0 systemd[1]: Reloading.
Dec 05 11:37:40 compute-0 systemd-rc-local-generator[39388]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:37:41 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 11:37:41 compute-0 sudo[39322]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:42 compute-0 python3.9[40680]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:37:43 compute-0 python3.9[41645]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 05 11:37:43 compute-0 python3.9[42411]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:37:44 compute-0 sudo[43262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwztvlnssfzfnqupqszbbmiahfaoelut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934664.2364767-435-191859085620901/AnsiballZ_command.py'
Dec 05 11:37:44 compute-0 sudo[43262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:44 compute-0 python3.9[43282]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:37:44 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 05 11:37:44 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 11:37:44 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 11:37:44 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.971s CPU time.
Dec 05 11:37:44 compute-0 systemd[1]: run-r356ff7713f124989b473af8497147361.service: Deactivated successfully.
Dec 05 11:37:45 compute-0 systemd[1]: Starting Authorization Manager...
Dec 05 11:37:45 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 05 11:37:45 compute-0 polkitd[43701]: Started polkitd version 0.117
Dec 05 11:37:45 compute-0 polkitd[43701]: Loading rules from directory /etc/polkit-1/rules.d
Dec 05 11:37:45 compute-0 polkitd[43701]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 05 11:37:45 compute-0 polkitd[43701]: Finished loading, compiling and executing 2 rules
Dec 05 11:37:45 compute-0 polkitd[43701]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 05 11:37:45 compute-0 systemd[1]: Started Authorization Manager.
Dec 05 11:37:45 compute-0 sudo[43262]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:45 compute-0 sudo[43869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqpoifrpeoqgcizvazadtkybraqlmjga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934665.5320737-444-72659254455064/AnsiballZ_systemd.py'
Dec 05 11:37:45 compute-0 sudo[43869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:46 compute-0 python3.9[43871]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:37:46 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 05 11:37:46 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Dec 05 11:37:46 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 05 11:37:46 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 05 11:37:46 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 05 11:37:46 compute-0 sudo[43869]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:47 compute-0 python3.9[44033]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 05 11:37:49 compute-0 sudo[44183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emvhygdcbzbreulpvlvadtmxxbvjwduc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934669.2893093-501-249753787976991/AnsiballZ_systemd.py'
Dec 05 11:37:49 compute-0 sudo[44183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:49 compute-0 python3.9[44185]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:37:49 compute-0 systemd[1]: Reloading.
Dec 05 11:37:50 compute-0 systemd-rc-local-generator[44211]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:37:50 compute-0 sudo[44183]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:50 compute-0 sudo[44371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqlpigvvfmzrqmbkxqybagelpyskjsxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934670.3466792-501-174239050354080/AnsiballZ_systemd.py'
Dec 05 11:37:50 compute-0 sudo[44371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:50 compute-0 python3.9[44373]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:37:51 compute-0 systemd[1]: Reloading.
Dec 05 11:37:51 compute-0 systemd-rc-local-generator[44407]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:37:51 compute-0 sudo[44371]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:51 compute-0 sudo[44561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uypmtrhqsdjmopssuvtvouqhnnmlmsee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934671.4966989-517-258130722415324/AnsiballZ_command.py'
Dec 05 11:37:51 compute-0 sudo[44561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:52 compute-0 python3.9[44563]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:37:52 compute-0 sudo[44561]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:52 compute-0 sudo[44714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qycyqwludvvtgvrohuucppcfyghehgtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934672.265982-525-195593906087461/AnsiballZ_command.py'
Dec 05 11:37:52 compute-0 sudo[44714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:52 compute-0 python3.9[44716]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:37:52 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 05 11:37:52 compute-0 sudo[44714]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:53 compute-0 sudo[44867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxmfpmenwzqadylxinfbljbwyhdtmamn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934673.047934-533-30220154393797/AnsiballZ_command.py'
Dec 05 11:37:53 compute-0 sudo[44867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:53 compute-0 python3.9[44869]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:37:54 compute-0 sudo[44867]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:55 compute-0 sudo[45029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehkhwczrzcxmichjhcqlnakqolxfgbfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934675.171985-541-175384737305408/AnsiballZ_command.py'
Dec 05 11:37:55 compute-0 sudo[45029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:55 compute-0 python3.9[45031]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:37:55 compute-0 sudo[45029]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:56 compute-0 sudo[45182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myrbvvdjkfxzzkdyiceijudkyptwcefc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934676.051553-549-243762130207198/AnsiballZ_systemd.py'
Dec 05 11:37:56 compute-0 sudo[45182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:37:56 compute-0 python3.9[45184]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:37:56 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 05 11:37:56 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Dec 05 11:37:56 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Dec 05 11:37:56 compute-0 systemd[1]: Starting Apply Kernel Variables...
Dec 05 11:37:56 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 05 11:37:56 compute-0 systemd[1]: Finished Apply Kernel Variables.
Dec 05 11:37:56 compute-0 sudo[45182]: pam_unix(sudo:session): session closed for user root
Dec 05 11:37:57 compute-0 sshd-session[31560]: Connection closed by 192.168.122.30 port 52568
Dec 05 11:37:57 compute-0 sshd-session[31557]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:37:57 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Dec 05 11:37:57 compute-0 systemd[1]: session-9.scope: Consumed 2min 17.023s CPU time.
Dec 05 11:37:57 compute-0 systemd-logind[792]: Session 9 logged out. Waiting for processes to exit.
Dec 05 11:37:57 compute-0 systemd-logind[792]: Removed session 9.
Dec 05 11:38:02 compute-0 sshd-session[45214]: Accepted publickey for zuul from 192.168.122.30 port 36106 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:38:02 compute-0 systemd-logind[792]: New session 10 of user zuul.
Dec 05 11:38:02 compute-0 systemd[1]: Started Session 10 of User zuul.
Dec 05 11:38:02 compute-0 sshd-session[45214]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:38:04 compute-0 python3.9[45367]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:38:05 compute-0 python3.9[45521]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:38:06 compute-0 sudo[45675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkkxrimjnswppazqvsrbxnspxlbgxalv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934685.7236447-50-45963221999791/AnsiballZ_command.py'
Dec 05 11:38:06 compute-0 sudo[45675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:06 compute-0 python3.9[45677]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:38:06 compute-0 sudo[45675]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:07 compute-0 python3.9[45828]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:38:07 compute-0 sudo[45982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngmsmyguagiyprtdhhyisucixxqxpiav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934687.6919568-70-6840766108746/AnsiballZ_setup.py'
Dec 05 11:38:07 compute-0 sudo[45982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:08 compute-0 python3.9[45984]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:38:08 compute-0 sudo[45982]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:09 compute-0 sudo[46066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mizuqhmicocdrntrxzsbcnaulzbkealx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934687.6919568-70-6840766108746/AnsiballZ_dnf.py'
Dec 05 11:38:09 compute-0 sudo[46066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:09 compute-0 python3.9[46068]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:38:10 compute-0 sudo[46066]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:11 compute-0 sudo[46219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irgdmztjyxiaoyofroxctslkreszivpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934690.8918703-82-13909401438323/AnsiballZ_setup.py'
Dec 05 11:38:11 compute-0 sudo[46219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:11 compute-0 python3.9[46221]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:38:11 compute-0 sudo[46219]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:12 compute-0 sudo[46390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiyeowtrzemmxxunqmejmvhubfdhzdiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934691.845869-93-153310357735660/AnsiballZ_file.py'
Dec 05 11:38:12 compute-0 sudo[46390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:12 compute-0 python3.9[46392]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:38:12 compute-0 sudo[46390]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:13 compute-0 sudo[46542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idueykjusaqgogrxwfyldwombbncchsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934692.7142253-101-88416622255451/AnsiballZ_command.py'
Dec 05 11:38:13 compute-0 sudo[46542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:13 compute-0 python3.9[46544]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:38:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1772982915-merged.mount: Deactivated successfully.
Dec 05 11:38:13 compute-0 podman[46545]: 2025-12-05 11:38:13.299616903 +0000 UTC m=+0.059632046 system refresh
Dec 05 11:38:13 compute-0 sudo[46542]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:14 compute-0 sudo[46706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stdltklccmbmnzxkbplgnbhuegqcyujz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934693.5554507-109-274498842155679/AnsiballZ_stat.py'
Dec 05 11:38:14 compute-0 sudo[46706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:14 compute-0 python3.9[46708]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:38:14 compute-0 sudo[46706]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:38:14 compute-0 sudo[46829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcdtonvsrxjhjgxzbhlgjrrdwwyaediy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934693.5554507-109-274498842155679/AnsiballZ_copy.py'
Dec 05 11:38:14 compute-0 sudo[46829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:15 compute-0 python3.9[46831]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934693.5554507-109-274498842155679/.source.json follow=False _original_basename=podman_network_config.j2 checksum=7ff3ae51ba32067874fed07a7a999793302cd1b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:38:15 compute-0 sudo[46829]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:15 compute-0 sudo[46981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obknjmchbsbzlwaxhevzucsknvfvwual ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934695.2314103-124-93566517516550/AnsiballZ_stat.py'
Dec 05 11:38:15 compute-0 sudo[46981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:15 compute-0 python3.9[46983]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:38:15 compute-0 sudo[46981]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:16 compute-0 sudo[47104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpndqrnyeeolqcmchleqtpnrwfavlrng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934695.2314103-124-93566517516550/AnsiballZ_copy.py'
Dec 05 11:38:16 compute-0 sudo[47104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:16 compute-0 python3.9[47106]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934695.2314103-124-93566517516550/.source.conf follow=False _original_basename=registries.conf.j2 checksum=88b6a52c62914061ba0322e1e0763af09791b362 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:38:16 compute-0 sudo[47104]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:17 compute-0 sudo[47256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydyvynggkymlcawlveufwkhdwokhcobc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934696.671977-140-28262500781832/AnsiballZ_ini_file.py'
Dec 05 11:38:17 compute-0 sudo[47256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:17 compute-0 python3.9[47258]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:38:17 compute-0 sudo[47256]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:17 compute-0 sudo[47408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frywoezjtprpaqnppxrqwlehigsardcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934697.5318549-140-263210428087866/AnsiballZ_ini_file.py'
Dec 05 11:38:17 compute-0 sudo[47408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:18 compute-0 python3.9[47410]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:38:18 compute-0 sudo[47408]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:18 compute-0 sudo[47560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtdlefamczfpanzvemngapjoitsgpcea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934698.2648506-140-54004458636536/AnsiballZ_ini_file.py'
Dec 05 11:38:18 compute-0 sudo[47560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:18 compute-0 python3.9[47562]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:38:18 compute-0 sudo[47560]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:19 compute-0 sudo[47712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tatbtilfrmjovcuiwrruickgkxdzjmql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934699.0039878-140-118294207815769/AnsiballZ_ini_file.py'
Dec 05 11:38:19 compute-0 sudo[47712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:19 compute-0 python3.9[47714]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:38:19 compute-0 sudo[47712]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:20 compute-0 python3.9[47864]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:38:21 compute-0 sudo[48016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stonwvmgkucdrvyokdxzkxxyxrjtipuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934700.8393767-180-198184929423660/AnsiballZ_dnf.py'
Dec 05 11:38:21 compute-0 sudo[48016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:21 compute-0 python3.9[48018]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:38:22 compute-0 sudo[48016]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:23 compute-0 sudo[48169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quqwtboorizbzemsskawrhdvrtsbnewr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934702.8766434-188-202649755182900/AnsiballZ_dnf.py'
Dec 05 11:38:23 compute-0 sudo[48169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:23 compute-0 python3.9[48171]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:38:25 compute-0 sudo[48169]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:26 compute-0 sudo[48329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axqpzwmcurqmkzhaulmzqcogozmodfkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934705.829146-198-40804996160892/AnsiballZ_dnf.py'
Dec 05 11:38:26 compute-0 sudo[48329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:26 compute-0 python3.9[48331]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:38:27 compute-0 sudo[48329]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:28 compute-0 sudo[48482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnnfsflqkchubayvzzgfwhfmnlsmstyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934707.8667438-207-23638402034689/AnsiballZ_dnf.py'
Dec 05 11:38:28 compute-0 sudo[48482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:28 compute-0 python3.9[48484]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:38:29 compute-0 sudo[48482]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:30 compute-0 sudo[48635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfznatdogtieuybzlesuplrprgopdrlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934710.0299838-218-125988473020500/AnsiballZ_dnf.py'
Dec 05 11:38:30 compute-0 sudo[48635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:30 compute-0 python3.9[48637]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:38:32 compute-0 sudo[48635]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:32 compute-0 sudo[48791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umunfdnwqatjmwuxnlbrokddyslcechq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934712.3731394-226-59181921099153/AnsiballZ_dnf.py'
Dec 05 11:38:32 compute-0 sudo[48791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:32 compute-0 python3.9[48793]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:38:40 compute-0 sudo[48791]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:40 compute-0 sudo[48960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxzdvryiywabkmhjkhtkivmxqeteqamc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934720.3958573-235-166412172979853/AnsiballZ_dnf.py'
Dec 05 11:38:40 compute-0 sudo[48960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:40 compute-0 python3.9[48962]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:38:42 compute-0 sudo[48960]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:42 compute-0 sudo[49113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjmmiuatrtqvnugxobyddzjsmyaefif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934722.5668118-244-113669369648759/AnsiballZ_dnf.py'
Dec 05 11:38:42 compute-0 sudo[49113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:43 compute-0 python3.9[49115]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:38:53 compute-0 sudo[49113]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:53 compute-0 sudo[49450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqdwjitfehrqrihvichxwjpspgrrvvkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934733.6256912-253-131168354352550/AnsiballZ_dnf.py'
Dec 05 11:38:53 compute-0 sudo[49450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:54 compute-0 python3.9[49452]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:38:55 compute-0 sudo[49450]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:56 compute-0 sudo[49606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beteitdgaudvrqnfdadqynxjwcxmiunh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934735.953733-264-16785844433423/AnsiballZ_file.py'
Dec 05 11:38:56 compute-0 sudo[49606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:56 compute-0 python3.9[49608]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:38:56 compute-0 sudo[49606]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:57 compute-0 sudo[49781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmemojejwjidjiqsbenkbnkfckytuuuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934736.6732306-272-96495391628341/AnsiballZ_stat.py'
Dec 05 11:38:57 compute-0 sudo[49781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:57 compute-0 python3.9[49783]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:38:57 compute-0 sudo[49781]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:57 compute-0 sudo[49904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yibcybfczhrenmtlvlsstgllqyvvyvun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934736.6732306-272-96495391628341/AnsiballZ_copy.py'
Dec 05 11:38:57 compute-0 sudo[49904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:57 compute-0 python3.9[49906]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764934736.6732306-272-96495391628341/.source.json _original_basename=.lgkpuosj follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:38:57 compute-0 sudo[49904]: pam_unix(sudo:session): session closed for user root
Dec 05 11:38:58 compute-0 sudo[50056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ishiyymjvgqdinzstuabqvhpopskyxpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934738.1352637-290-91927670326672/AnsiballZ_podman_image.py'
Dec 05 11:38:58 compute-0 sudo[50056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:38:58 compute-0 python3.9[50058]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 11:38:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1785861986-lower\x2dmapped.mount: Deactivated successfully.
Dec 05 11:39:04 compute-0 podman[50070]: 2025-12-05 11:39:04.249958626 +0000 UTC m=+5.270625729 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 05 11:39:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:04 compute-0 sudo[50056]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:05 compute-0 sudo[50365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-domebebxgleftsjxgpikgjsitnfafxxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934744.899554-301-248893206874336/AnsiballZ_podman_image.py'
Dec 05 11:39:05 compute-0 sudo[50365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:05 compute-0 python3.9[50367]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 11:39:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:14 compute-0 podman[50379]: 2025-12-05 11:39:14.901894501 +0000 UTC m=+9.385652312 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 11:39:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:15 compute-0 sudo[50365]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:15 compute-0 sudo[50702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjdopxhffqpiuwqpsrmbtskejodrtgff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934755.490487-311-64588660969381/AnsiballZ_podman_image.py'
Dec 05 11:39:15 compute-0 sudo[50702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:16 compute-0 python3.9[50704]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 11:39:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:17 compute-0 podman[50716]: 2025-12-05 11:39:17.327395775 +0000 UTC m=+1.240997671 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 05 11:39:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:17 compute-0 sudo[50702]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:18 compute-0 sudo[50954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjfotajyasrigbxbkxmqlkjlegghbtwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934757.7900035-320-224612423438580/AnsiballZ_podman_image.py'
Dec 05 11:39:18 compute-0 sudo[50954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:18 compute-0 python3.9[50956]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 11:39:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:19 compute-0 sshd-session[50729]: Connection reset by authenticating user root 45.140.17.124 port 49874 [preauth]
Dec 05 11:39:20 compute-0 sshd-session[50989]: Invalid user dev from 45.140.17.124 port 49886
Dec 05 11:39:21 compute-0 sshd-session[50989]: Connection reset by invalid user dev 45.140.17.124 port 49886 [preauth]
Dec 05 11:39:23 compute-0 sshd-session[51012]: Connection reset by authenticating user root 45.140.17.124 port 49894 [preauth]
Dec 05 11:39:29 compute-0 sshd-session[51054]: Connection reset by authenticating user root 45.140.17.124 port 60660 [preauth]
Dec 05 11:39:29 compute-0 podman[50969]: 2025-12-05 11:39:29.722889517 +0000 UTC m=+11.322952254 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 05 11:39:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:29 compute-0 sudo[50954]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:30 compute-0 sudo[51255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utqbzaycewstrurxjtiicujzxouvqdfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934770.5162187-331-181266076329052/AnsiballZ_podman_image.py'
Dec 05 11:39:30 compute-0 sudo[51255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:31 compute-0 python3.9[51257]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 11:39:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:33 compute-0 sshd-session[51056]: Connection reset by authenticating user root 45.140.17.124 port 60664 [preauth]
Dec 05 11:39:33 compute-0 podman[51269]: 2025-12-05 11:39:33.8991738 +0000 UTC m=+2.656184221 image pull 343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 05 11:39:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:34 compute-0 sudo[51255]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:34 compute-0 sudo[51523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udavibcomfaesmbafkkgqazrpgepaykz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934774.4295232-331-108642060536263/AnsiballZ_podman_image.py'
Dec 05 11:39:34 compute-0 sudo[51523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:35 compute-0 python3.9[51525]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 11:39:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:36 compute-0 podman[51539]: 2025-12-05 11:39:36.251561527 +0000 UTC m=+1.190867474 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 05 11:39:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:39:36 compute-0 sudo[51523]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:36 compute-0 sshd-session[45217]: Connection closed by 192.168.122.30 port 36106
Dec 05 11:39:36 compute-0 sshd-session[45214]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:39:36 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Dec 05 11:39:36 compute-0 systemd[1]: session-10.scope: Consumed 1min 52.772s CPU time.
Dec 05 11:39:36 compute-0 systemd-logind[792]: Session 10 logged out. Waiting for processes to exit.
Dec 05 11:39:36 compute-0 systemd-logind[792]: Removed session 10.
Dec 05 11:39:44 compute-0 sshd-session[51689]: Accepted publickey for zuul from 192.168.122.30 port 38730 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:39:44 compute-0 systemd-logind[792]: New session 11 of user zuul.
Dec 05 11:39:44 compute-0 systemd[1]: Started Session 11 of User zuul.
Dec 05 11:39:44 compute-0 sshd-session[51689]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:39:45 compute-0 python3.9[51842]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:39:46 compute-0 sudo[51996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtdnhahjtgbhmdzksvirktikzcrbtrou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934786.0614734-36-194979797156616/AnsiballZ_getent.py'
Dec 05 11:39:46 compute-0 sudo[51996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:46 compute-0 python3.9[51998]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 05 11:39:46 compute-0 sudo[51996]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:47 compute-0 sudo[52149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjxcrudreuoojcmcoeejwgfhvrwzmmqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934787.0543506-44-75679605318126/AnsiballZ_group.py'
Dec 05 11:39:47 compute-0 sudo[52149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:47 compute-0 python3.9[52151]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 11:39:48 compute-0 groupadd[52152]: group added to /etc/group: name=openvswitch, GID=42476
Dec 05 11:39:48 compute-0 groupadd[52152]: group added to /etc/gshadow: name=openvswitch
Dec 05 11:39:48 compute-0 groupadd[52152]: new group: name=openvswitch, GID=42476
Dec 05 11:39:48 compute-0 sudo[52149]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:48 compute-0 sudo[52307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osbdnczgnsqnaurimhgtqcohjkqqmibr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934788.2114935-52-196325862652474/AnsiballZ_user.py'
Dec 05 11:39:48 compute-0 sudo[52307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:49 compute-0 python3.9[52309]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 11:39:49 compute-0 useradd[52311]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 11:39:49 compute-0 useradd[52311]: add 'openvswitch' to group 'hugetlbfs'
Dec 05 11:39:49 compute-0 useradd[52311]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 05 11:39:49 compute-0 sudo[52307]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:49 compute-0 sudo[52467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qurvdwdenolwkwweaslbvkqgylxgjnmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934789.4285069-62-249602245113781/AnsiballZ_setup.py'
Dec 05 11:39:49 compute-0 sudo[52467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:50 compute-0 python3.9[52469]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:39:50 compute-0 sudo[52467]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:50 compute-0 sudo[52551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxkancewgajiwlmplhkgvdjkkkfydyyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934789.4285069-62-249602245113781/AnsiballZ_dnf.py'
Dec 05 11:39:50 compute-0 sudo[52551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:51 compute-0 python3.9[52553]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:39:52 compute-0 sudo[52551]: pam_unix(sudo:session): session closed for user root
Dec 05 11:39:53 compute-0 sudo[52713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boftqbtzinhtjqbwnatpcykspzziofag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934792.9301202-76-269679903260617/AnsiballZ_dnf.py'
Dec 05 11:39:53 compute-0 sudo[52713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:39:53 compute-0 python3.9[52715]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:40:04 compute-0 kernel: SELinux:  Converting 2731 SID table entries...
Dec 05 11:40:04 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:40:04 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 11:40:04 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:40:04 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:40:04 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:40:04 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:40:04 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:40:05 compute-0 groupadd[52738]: group added to /etc/group: name=unbound, GID=993
Dec 05 11:40:05 compute-0 groupadd[52738]: group added to /etc/gshadow: name=unbound
Dec 05 11:40:05 compute-0 groupadd[52738]: new group: name=unbound, GID=993
Dec 05 11:40:05 compute-0 useradd[52745]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 05 11:40:05 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 05 11:40:05 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 05 11:40:06 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 11:40:06 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 11:40:06 compute-0 systemd[1]: Reloading.
Dec 05 11:40:06 compute-0 systemd-rc-local-generator[53243]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:40:06 compute-0 systemd-sysv-generator[53246]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:40:07 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 11:40:07 compute-0 sudo[52713]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:07 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 11:40:07 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 11:40:07 compute-0 systemd[1]: run-r65682e9384264f44bd70efc1f403fb72.service: Deactivated successfully.
Dec 05 11:40:08 compute-0 sudo[53811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojvehupdbwathqjvkugfqeuhkwpminwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934807.8683813-84-100356081922651/AnsiballZ_systemd.py'
Dec 05 11:40:08 compute-0 sudo[53811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:08 compute-0 python3.9[53813]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 11:40:08 compute-0 systemd[1]: Reloading.
Dec 05 11:40:08 compute-0 systemd-rc-local-generator[53845]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:40:08 compute-0 systemd-sysv-generator[53849]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:40:09 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Dec 05 11:40:09 compute-0 chown[53855]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 05 11:40:09 compute-0 ovs-ctl[53860]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 05 11:40:09 compute-0 ovs-ctl[53860]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 05 11:40:09 compute-0 ovs-ctl[53860]: Starting ovsdb-server [  OK  ]
Dec 05 11:40:09 compute-0 ovs-vsctl[53909]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 05 11:40:09 compute-0 ovs-vsctl[53929]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2686fa45-e88c-4058-8865-e810ceb89d95\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 05 11:40:09 compute-0 ovs-ctl[53860]: Configuring Open vSwitch system IDs [  OK  ]
Dec 05 11:40:09 compute-0 ovs-vsctl[53935]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 05 11:40:09 compute-0 ovs-ctl[53860]: Enabling remote OVSDB managers [  OK  ]
Dec 05 11:40:09 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Dec 05 11:40:09 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 05 11:40:09 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 05 11:40:09 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 05 11:40:09 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Dec 05 11:40:09 compute-0 ovs-ctl[53979]: Inserting openvswitch module [  OK  ]
Dec 05 11:40:09 compute-0 ovs-ctl[53948]: Starting ovs-vswitchd [  OK  ]
Dec 05 11:40:09 compute-0 ovs-vsctl[53999]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 05 11:40:09 compute-0 ovs-ctl[53948]: Enabling remote OVSDB managers [  OK  ]
Dec 05 11:40:09 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 05 11:40:09 compute-0 systemd[1]: Starting Open vSwitch...
Dec 05 11:40:09 compute-0 systemd[1]: Finished Open vSwitch.
Dec 05 11:40:09 compute-0 sudo[53811]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:10 compute-0 python3.9[54151]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:40:11 compute-0 sudo[54301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brfeziqmbiqdbjfqfkwbvgrvhdbiprbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934811.0703514-102-269610753894079/AnsiballZ_sefcontext.py'
Dec 05 11:40:11 compute-0 sudo[54301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:11 compute-0 python3.9[54303]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 05 11:40:12 compute-0 kernel: SELinux:  Converting 2745 SID table entries...
Dec 05 11:40:12 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:40:12 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 11:40:12 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:40:12 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:40:12 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:40:12 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:40:12 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:40:13 compute-0 sudo[54301]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:14 compute-0 python3.9[54458]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:40:14 compute-0 sudo[54614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyszusvgravvtfubdssvvhjmcoranmxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934814.4969907-120-257658626291111/AnsiballZ_dnf.py'
Dec 05 11:40:14 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 05 11:40:14 compute-0 sudo[54614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:15 compute-0 python3.9[54616]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:40:16 compute-0 sudo[54614]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:17 compute-0 sudo[54767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjeaptwsaykufdvnacqzrjikytfuuyay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934816.5651467-128-96270494119353/AnsiballZ_command.py'
Dec 05 11:40:17 compute-0 sudo[54767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:17 compute-0 python3.9[54769]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:40:18 compute-0 sudo[54767]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:18 compute-0 sudo[55054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvznwbujbifdvuivdmamzpydxogacvkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934818.2032614-136-45571173276835/AnsiballZ_file.py'
Dec 05 11:40:18 compute-0 sudo[55054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:18 compute-0 python3.9[55056]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 05 11:40:18 compute-0 sudo[55054]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:19 compute-0 python3.9[55206]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:40:20 compute-0 sudo[55358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnmxurgglfrifbhbefassaeuoclguwmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934820.0009542-152-42520500134472/AnsiballZ_dnf.py'
Dec 05 11:40:20 compute-0 sudo[55358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:20 compute-0 python3.9[55360]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:40:22 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 11:40:22 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 11:40:22 compute-0 systemd[1]: Reloading.
Dec 05 11:40:22 compute-0 systemd-rc-local-generator[55396]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:40:22 compute-0 systemd-sysv-generator[55399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:40:22 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 11:40:22 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 11:40:22 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 11:40:22 compute-0 systemd[1]: run-rfc0f268c0020497c97395bc9fcbed7ae.service: Deactivated successfully.
Dec 05 11:40:22 compute-0 sudo[55358]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:23 compute-0 sudo[55676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agsjhornsvldottowptesugpihwzcxbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934823.1785226-160-84215927961276/AnsiballZ_systemd.py'
Dec 05 11:40:23 compute-0 sudo[55676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:23 compute-0 python3.9[55678]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:40:23 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 05 11:40:23 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Dec 05 11:40:23 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Dec 05 11:40:23 compute-0 systemd[1]: Stopping Network Manager...
Dec 05 11:40:23 compute-0 NetworkManager[7198]: <info>  [1764934823.8559] caught SIGTERM, shutting down normally.
Dec 05 11:40:23 compute-0 NetworkManager[7198]: <info>  [1764934823.8572] dhcp4 (eth0): canceled DHCP transaction
Dec 05 11:40:23 compute-0 NetworkManager[7198]: <info>  [1764934823.8572] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 11:40:23 compute-0 NetworkManager[7198]: <info>  [1764934823.8573] dhcp4 (eth0): state changed no lease
Dec 05 11:40:23 compute-0 NetworkManager[7198]: <info>  [1764934823.8574] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 11:40:23 compute-0 NetworkManager[7198]: <info>  [1764934823.8644] exiting (success)
Dec 05 11:40:23 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 11:40:23 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 11:40:23 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 05 11:40:23 compute-0 systemd[1]: Stopped Network Manager.
Dec 05 11:40:23 compute-0 systemd[1]: NetworkManager.service: Consumed 14.252s CPU time, 4.3M memory peak, read 0B from disk, written 36.0K to disk.
Dec 05 11:40:23 compute-0 systemd[1]: Starting Network Manager...
Dec 05 11:40:23 compute-0 NetworkManager[55691]: <info>  [1764934823.9526] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:f0f69436-bbfa-48e7-b73e-9b22f091bec6)
Dec 05 11:40:23 compute-0 NetworkManager[55691]: <info>  [1764934823.9530] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 05 11:40:23 compute-0 NetworkManager[55691]: <info>  [1764934823.9607] manager[0x55a7ba6c8090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 05 11:40:23 compute-0 systemd[1]: Starting Hostname Service...
Dec 05 11:40:24 compute-0 systemd[1]: Started Hostname Service.
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0786] hostname: hostname: using hostnamed
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0786] hostname: static hostname changed from (none) to "compute-0"
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0791] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0796] manager[0x55a7ba6c8090]: rfkill: Wi-Fi hardware radio set enabled
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0797] manager[0x55a7ba6c8090]: rfkill: WWAN hardware radio set enabled
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0818] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0826] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0827] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0828] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0828] manager: Networking is enabled by state file
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0830] settings: Loaded settings plugin: keyfile (internal)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0833] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0857] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0868] dhcp: init: Using DHCP client 'internal'
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0871] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0875] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0881] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0888] device (lo): Activation: starting connection 'lo' (0f41f2b7-1648-4484-8c07-96c2526b8b00)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0894] device (eth0): carrier: link connected
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0898] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0902] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0903] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0908] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0915] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0920] device (eth1): carrier: link connected
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0923] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0927] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c) (indicated)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0928] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0932] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0938] device (eth1): Activation: starting connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c)
Dec 05 11:40:24 compute-0 systemd[1]: Started Network Manager.
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0946] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0953] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0955] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0956] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0958] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0961] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0963] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0964] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0966] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0971] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0973] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0981] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.0992] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1003] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1005] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1009] device (lo): Activation: successful, device activated.
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1015] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1020] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1081] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1086] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1092] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1095] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 05 11:40:24 compute-0 systemd[1]: Starting Network Manager Wait Online...
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1097] device (eth1): Activation: successful, device activated.
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1129] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1130] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1133] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1135] device (eth0): Activation: successful, device activated.
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1138] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 05 11:40:24 compute-0 NetworkManager[55691]: <info>  [1764934824.1141] manager: startup complete
Dec 05 11:40:24 compute-0 systemd[1]: Finished Network Manager Wait Online.
Dec 05 11:40:24 compute-0 sudo[55676]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:24 compute-0 sudo[55902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smmcgbccuqqwkjnchktpziekigrcvkom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934824.4010665-168-249919912534324/AnsiballZ_dnf.py'
Dec 05 11:40:24 compute-0 sudo[55902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:24 compute-0 python3.9[55904]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:40:29 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 11:40:29 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 11:40:29 compute-0 systemd[1]: Reloading.
Dec 05 11:40:30 compute-0 systemd-sysv-generator[55961]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:40:30 compute-0 systemd-rc-local-generator[55957]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:40:30 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 11:40:30 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 11:40:30 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 11:40:30 compute-0 systemd[1]: run-ra2880eb5d01b4bc281576c080b6847aa.service: Deactivated successfully.
Dec 05 11:40:31 compute-0 sudo[55902]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:31 compute-0 sudo[56363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maqgaqbdsqdczutwcscwxixfflpzzlew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934831.3868968-180-58755907466282/AnsiballZ_stat.py'
Dec 05 11:40:31 compute-0 sudo[56363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:31 compute-0 python3.9[56365]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:40:31 compute-0 sudo[56363]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:32 compute-0 sudo[56515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjzutsxcfvrbfnxcambaukxvpmthboou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934832.0921974-189-45034082481396/AnsiballZ_ini_file.py'
Dec 05 11:40:32 compute-0 sudo[56515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:32 compute-0 python3.9[56517]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:32 compute-0 sudo[56515]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:33 compute-0 sudo[56669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwvdhcikersykskxxoyzgtmndceplfal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934833.0553758-199-262060719450236/AnsiballZ_ini_file.py'
Dec 05 11:40:33 compute-0 sudo[56669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:33 compute-0 python3.9[56671]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:33 compute-0 sudo[56669]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:34 compute-0 sudo[56821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkfhathnakiykcrxyxkaokajsgifhkwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934833.730064-199-86902121530838/AnsiballZ_ini_file.py'
Dec 05 11:40:34 compute-0 sudo[56821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:34 compute-0 python3.9[56823]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:34 compute-0 sudo[56821]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:34 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 11:40:34 compute-0 sudo[56973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzgslpmbdpgvppfxvcfvphvdrffkyshr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934834.4618435-214-38843600860854/AnsiballZ_ini_file.py'
Dec 05 11:40:34 compute-0 sudo[56973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:34 compute-0 python3.9[56975]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:34 compute-0 sudo[56973]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:35 compute-0 sudo[57125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgcnjyjhtigcpgijlqjaxzgtrkkinznf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934835.1085382-214-121340938873800/AnsiballZ_ini_file.py'
Dec 05 11:40:35 compute-0 sudo[57125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:35 compute-0 python3.9[57127]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:35 compute-0 sudo[57125]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:36 compute-0 sudo[57277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htpyipfjwqoxjgnijrawptmburyefwbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934835.906281-229-18206664741806/AnsiballZ_stat.py'
Dec 05 11:40:36 compute-0 sudo[57277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:36 compute-0 python3.9[57279]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:40:36 compute-0 sudo[57277]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:37 compute-0 sudo[57400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqgclebtknnolcxorvioxsfxdfsklshl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934835.906281-229-18206664741806/AnsiballZ_copy.py'
Dec 05 11:40:37 compute-0 sudo[57400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:37 compute-0 python3.9[57402]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934835.906281-229-18206664741806/.source _original_basename=.lswvfp1y follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:37 compute-0 sudo[57400]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:37 compute-0 sudo[57552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yorznipqrbvlwfekgjuscbslztdecfrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934837.499137-244-146757052010715/AnsiballZ_file.py'
Dec 05 11:40:37 compute-0 sudo[57552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:37 compute-0 python3.9[57554]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:38 compute-0 sudo[57552]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:38 compute-0 sudo[57704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zafjzojkvcovdzisvqqbcmhiimxowtih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934838.189718-252-59967181503186/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 05 11:40:38 compute-0 sudo[57704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:38 compute-0 python3.9[57706]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 05 11:40:38 compute-0 sudo[57704]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:39 compute-0 sudo[57856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sebrdyrysloqsoowjhwtjmokousrmkoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934839.0679724-261-16383256970559/AnsiballZ_file.py'
Dec 05 11:40:39 compute-0 sudo[57856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:39 compute-0 python3.9[57858]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:39 compute-0 sudo[57856]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:40 compute-0 sudo[58008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwrpucjocafapimbilbfarmdyztufroh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934839.9438744-271-161982700936656/AnsiballZ_stat.py'
Dec 05 11:40:40 compute-0 sudo[58008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:40 compute-0 sudo[58008]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:40 compute-0 sudo[58131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctvhkmeztrbdgroytzbymqdsbklbkzbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934839.9438744-271-161982700936656/AnsiballZ_copy.py'
Dec 05 11:40:40 compute-0 sudo[58131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:41 compute-0 sudo[58131]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:41 compute-0 sudo[58283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfjcrilpedwneeekystfpliteqolwinb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934841.3394115-286-197841052539313/AnsiballZ_slurp.py'
Dec 05 11:40:41 compute-0 sudo[58283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:41 compute-0 python3.9[58285]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 05 11:40:42 compute-0 sudo[58283]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:42 compute-0 sudo[58458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kitlvyotiylibujsfiwusajqqfsadinw ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934842.2626042-295-234296007961692/async_wrapper.py j388437315305 300 /home/zuul/.ansible/tmp/ansible-tmp-1764934842.2626042-295-234296007961692/AnsiballZ_edpm_os_net_config.py _'
Dec 05 11:40:42 compute-0 sudo[58458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:43 compute-0 ansible-async_wrapper.py[58460]: Invoked with j388437315305 300 /home/zuul/.ansible/tmp/ansible-tmp-1764934842.2626042-295-234296007961692/AnsiballZ_edpm_os_net_config.py _
Dec 05 11:40:43 compute-0 ansible-async_wrapper.py[58463]: Starting module and watcher
Dec 05 11:40:43 compute-0 ansible-async_wrapper.py[58463]: Start watching 58464 (300)
Dec 05 11:40:43 compute-0 ansible-async_wrapper.py[58464]: Start module (58464)
Dec 05 11:40:43 compute-0 ansible-async_wrapper.py[58460]: Return async_wrapper task started.
Dec 05 11:40:43 compute-0 sudo[58458]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:43 compute-0 python3.9[58465]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 05 11:40:43 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 05 11:40:43 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 05 11:40:43 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 05 11:40:43 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 05 11:40:43 compute-0 kernel: cfg80211: failed to load regulatory.db
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1040] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1068] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1653] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1655] audit: op="connection-add" uuid="4ddb60d7-2a3c-4aa4-9561-51cffb00bfbe" name="br-ex-br" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1675] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1676] audit: op="connection-add" uuid="13c6488b-9eba-4cf8-953e-4c13e4705605" name="br-ex-port" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1695] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1696] audit: op="connection-add" uuid="1045c5c4-a78d-40b3-8826-cb9933e50aab" name="eth1-port" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1714] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1715] audit: op="connection-add" uuid="cb27fa5c-e07f-493f-8997-0129d5ca9f4d" name="vlan20-port" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1732] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1733] audit: op="connection-add" uuid="f4d91186-8b58-4d63-adfd-dd5722e9b835" name="vlan21-port" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1750] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1751] audit: op="connection-add" uuid="61eb692f-aa48-43ca-aa37-ad3fa95b4c5b" name="vlan22-port" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1774] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1794] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1795] audit: op="connection-add" uuid="c342017d-ca50-45e6-93f0-ae90b33fdcef" name="br-ex-if" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1873] audit: op="connection-update" uuid="67f3aebf-819d-5f9b-8650-6c559580f88c" name="ci-private-network" args="connection.timestamp,connection.master,connection.controller,connection.port-type,connection.slave-type,ovs-external-ids.data,ovs-interface.type,ipv4.method,ipv4.addresses,ipv4.routing-rules,ipv4.never-default,ipv4.dns,ipv4.routes,ipv6.method,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routing-rules,ipv6.dns,ipv6.routes" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1891] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1892] audit: op="connection-add" uuid="1e55f780-b484-4e60-a006-55c2c35ecc05" name="vlan20-if" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1910] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1912] audit: op="connection-add" uuid="f970735e-bc09-4f31-b1a0-4eda24e9d39f" name="vlan21-if" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1933] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1934] audit: op="connection-add" uuid="fe2e9cbd-1f58-45f9-af7a-bbc2221796bf" name="vlan22-if" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1948] audit: op="connection-delete" uuid="93f975fe-2181-3e88-a16c-de115f1f749a" name="Wired connection 1" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1962] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1973] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1976] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (4ddb60d7-2a3c-4aa4-9561-51cffb00bfbe)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1976] audit: op="connection-activate" uuid="4ddb60d7-2a3c-4aa4-9561-51cffb00bfbe" name="br-ex-br" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1978] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1985] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1989] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (13c6488b-9eba-4cf8-953e-4c13e4705605)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1991] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.1996] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2000] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (1045c5c4-a78d-40b3-8826-cb9933e50aab)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2002] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2008] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2013] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (cb27fa5c-e07f-493f-8997-0129d5ca9f4d)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2014] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2021] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2025] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (f4d91186-8b58-4d63-adfd-dd5722e9b835)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2027] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2034] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2037] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (61eb692f-aa48-43ca-aa37-ad3fa95b4c5b)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2038] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2041] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2042] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2049] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2053] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2057] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (c342017d-ca50-45e6-93f0-ae90b33fdcef)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2058] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2061] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2062] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2063] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2065] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2076] device (eth1): disconnecting for new activation request.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2077] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2079] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2081] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2082] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2085] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2089] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2093] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (1e55f780-b484-4e60-a006-55c2c35ecc05)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2094] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2097] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2098] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2099] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2102] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2106] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2110] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (f970735e-bc09-4f31-b1a0-4eda24e9d39f)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2111] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2114] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2115] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2116] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2119] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2123] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2127] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (fe2e9cbd-1f58-45f9-af7a-bbc2221796bf)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2128] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2131] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2132] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2134] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2135] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2152] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2154] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2157] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2158] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2166] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2171] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2185] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2191] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2193] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2198] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 kernel: ovs-system: entered promiscuous mode
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2202] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2205] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2206] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2210] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2213] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2216] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2218] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2223] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2228] dhcp4 (eth0): canceled DHCP transaction
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2228] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2228] dhcp4 (eth0): state changed no lease
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2230] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 05 11:40:45 compute-0 kernel: Timeout policy base is empty
Dec 05 11:40:45 compute-0 systemd-udevd[58470]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2243] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2247] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58466 uid=0 result="fail" reason="Device is not activated"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2254] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2300] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2308] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2314] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 05 11:40:45 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2369] device (eth1): disconnecting for new activation request.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2371] audit: op="connection-activate" uuid="67f3aebf-819d-5f9b-8650-6c559580f88c" name="ci-private-network" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2395] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec 05 11:40:45 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2534] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 05 11:40:45 compute-0 kernel: br-ex: entered promiscuous mode
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2650] device (eth1): Activation: starting connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c)
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2655] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2673] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2680] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2687] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2693] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2704] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2706] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2708] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 kernel: vlan22: entered promiscuous mode
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2717] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2718] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2725] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 systemd-udevd[58472]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2735] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2740] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2746] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2751] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2757] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2764] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2770] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2777] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 11:40:45 compute-0 kernel: vlan20: entered promiscuous mode
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2784] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 systemd-udevd[58471]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2806] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2823] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 05 11:40:45 compute-0 kernel: vlan21: entered promiscuous mode
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2835] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2845] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 05 11:40:45 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2884] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2892] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2902] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2909] device (eth1): Activation: successful, device activated.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2918] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2924] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2930] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2937] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.2952] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3009] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3009] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3013] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3016] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3021] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3045] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3053] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3097] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3099] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3103] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3108] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3114] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 11:40:45 compute-0 NetworkManager[55691]: <info>  [1764934845.3119] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 11:40:46 compute-0 NetworkManager[55691]: <info>  [1764934846.4765] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec 05 11:40:46 compute-0 NetworkManager[55691]: <info>  [1764934846.6696] checkpoint[0x55a7ba69d950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 05 11:40:46 compute-0 NetworkManager[55691]: <info>  [1764934846.6698] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec 05 11:40:46 compute-0 sudo[58798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urbjcgwlroabbixxhxijqqoqvjgkmims ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934846.3189034-295-26546364606256/AnsiballZ_async_status.py'
Dec 05 11:40:46 compute-0 sudo[58798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:46 compute-0 NetworkManager[55691]: <info>  [1764934846.9305] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58466 uid=0 result="success"
Dec 05 11:40:46 compute-0 NetworkManager[55691]: <info>  [1764934846.9318] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58466 uid=0 result="success"
Dec 05 11:40:47 compute-0 python3.9[58800]: ansible-ansible.legacy.async_status Invoked with jid=j388437315305.58460 mode=status _async_dir=/root/.ansible_async
Dec 05 11:40:47 compute-0 NetworkManager[55691]: <info>  [1764934847.3233] audit: op="networking-control" arg="global-dns-configuration" pid=58466 uid=0 result="success"
Dec 05 11:40:47 compute-0 NetworkManager[55691]: <info>  [1764934847.3258] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 05 11:40:47 compute-0 NetworkManager[55691]: <info>  [1764934847.3286] audit: op="networking-control" arg="global-dns-configuration" pid=58466 uid=0 result="success"
Dec 05 11:40:47 compute-0 NetworkManager[55691]: <info>  [1764934847.3317] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58466 uid=0 result="success"
Dec 05 11:40:47 compute-0 sudo[58798]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:47 compute-0 NetworkManager[55691]: <info>  [1764934847.4534] checkpoint[0x55a7ba69da20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 05 11:40:47 compute-0 NetworkManager[55691]: <info>  [1764934847.4538] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58466 uid=0 result="success"
Dec 05 11:40:47 compute-0 ansible-async_wrapper.py[58464]: Module complete (58464)
Dec 05 11:40:48 compute-0 ansible-async_wrapper.py[58463]: Done in kid B.
Dec 05 11:40:50 compute-0 sudo[58904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eixligwrihxrplcitnazdplwrtpljgxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934846.3189034-295-26546364606256/AnsiballZ_async_status.py'
Dec 05 11:40:50 compute-0 sudo[58904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:50 compute-0 python3.9[58906]: ansible-ansible.legacy.async_status Invoked with jid=j388437315305.58460 mode=status _async_dir=/root/.ansible_async
Dec 05 11:40:50 compute-0 sudo[58904]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:51 compute-0 sudo[59003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxqtfchwwstbwfhfmrkkrujstyjsuoqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934846.3189034-295-26546364606256/AnsiballZ_async_status.py'
Dec 05 11:40:51 compute-0 sudo[59003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:51 compute-0 python3.9[59005]: ansible-ansible.legacy.async_status Invoked with jid=j388437315305.58460 mode=cleanup _async_dir=/root/.ansible_async
Dec 05 11:40:51 compute-0 sudo[59003]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:52 compute-0 sudo[59155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tponvsxfbbvhyrwhfpqutknqymaqosgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934851.6832664-322-215790950088430/AnsiballZ_stat.py'
Dec 05 11:40:52 compute-0 sudo[59155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:52 compute-0 python3.9[59157]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:40:52 compute-0 sudo[59155]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:52 compute-0 sudo[59278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjwssktbjqzioywizqvytrgmlnnihfwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934851.6832664-322-215790950088430/AnsiballZ_copy.py'
Dec 05 11:40:52 compute-0 sudo[59278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:52 compute-0 python3.9[59280]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934851.6832664-322-215790950088430/.source.returncode _original_basename=.6z5k1ypq follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:52 compute-0 sudo[59278]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:53 compute-0 sudo[59430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alqjympwqvocefgoyiuovhucgipyiqds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934853.1310487-338-47066533708795/AnsiballZ_stat.py'
Dec 05 11:40:53 compute-0 sudo[59430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:53 compute-0 python3.9[59432]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:40:53 compute-0 sudo[59430]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:54 compute-0 sudo[59554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrryliqvdlqyzfzfbhtyfmrfmwrohpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934853.1310487-338-47066533708795/AnsiballZ_copy.py'
Dec 05 11:40:54 compute-0 sudo[59554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:54 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 11:40:54 compute-0 python3.9[59556]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934853.1310487-338-47066533708795/.source.cfg _original_basename=.0dxhckwd follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:40:54 compute-0 sudo[59554]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:54 compute-0 sudo[59708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saozjstyexdnchnsqmongmttagbpfoxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934854.4012983-353-27449497358450/AnsiballZ_systemd.py'
Dec 05 11:40:54 compute-0 sudo[59708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:40:55 compute-0 python3.9[59710]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:40:55 compute-0 systemd[1]: Reloading Network Manager...
Dec 05 11:40:55 compute-0 NetworkManager[55691]: <info>  [1764934855.0926] audit: op="reload" arg="0" pid=59714 uid=0 result="success"
Dec 05 11:40:55 compute-0 NetworkManager[55691]: <info>  [1764934855.0932] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 05 11:40:55 compute-0 systemd[1]: Reloaded Network Manager.
Dec 05 11:40:55 compute-0 sudo[59708]: pam_unix(sudo:session): session closed for user root
Dec 05 11:40:55 compute-0 sshd-session[51692]: Connection closed by 192.168.122.30 port 38730
Dec 05 11:40:55 compute-0 sshd-session[51689]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:40:55 compute-0 systemd-logind[792]: Session 11 logged out. Waiting for processes to exit.
Dec 05 11:40:55 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Dec 05 11:40:55 compute-0 systemd[1]: session-11.scope: Consumed 52.823s CPU time.
Dec 05 11:40:55 compute-0 systemd-logind[792]: Removed session 11.
Dec 05 11:41:00 compute-0 sshd-session[59745]: Accepted publickey for zuul from 192.168.122.30 port 43674 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:41:00 compute-0 systemd-logind[792]: New session 12 of user zuul.
Dec 05 11:41:00 compute-0 systemd[1]: Started Session 12 of User zuul.
Dec 05 11:41:00 compute-0 sshd-session[59745]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:41:01 compute-0 python3.9[59898]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:41:03 compute-0 python3.9[60052]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:41:04 compute-0 python3.9[60242]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:41:04 compute-0 sshd-session[59748]: Connection closed by 192.168.122.30 port 43674
Dec 05 11:41:04 compute-0 sshd-session[59745]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:41:04 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Dec 05 11:41:04 compute-0 systemd[1]: session-12.scope: Consumed 2.531s CPU time.
Dec 05 11:41:04 compute-0 systemd-logind[792]: Session 12 logged out. Waiting for processes to exit.
Dec 05 11:41:04 compute-0 systemd-logind[792]: Removed session 12.
Dec 05 11:41:05 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 11:41:09 compute-0 sshd-session[60271]: Accepted publickey for zuul from 192.168.122.30 port 35660 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:41:09 compute-0 systemd-logind[792]: New session 13 of user zuul.
Dec 05 11:41:09 compute-0 systemd[1]: Started Session 13 of User zuul.
Dec 05 11:41:09 compute-0 sshd-session[60271]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:41:10 compute-0 python3.9[60424]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:41:11 compute-0 python3.9[60578]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:41:12 compute-0 sudo[60732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmtfcnirkbzpwrbtvrctdbxkbgdwxewl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934872.2407892-40-99905540159607/AnsiballZ_setup.py'
Dec 05 11:41:12 compute-0 sudo[60732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:12 compute-0 python3.9[60734]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:41:13 compute-0 sudo[60732]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:13 compute-0 sudo[60817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfeysdevmzwcyzzwupbxyadzwjuywszn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934872.2407892-40-99905540159607/AnsiballZ_dnf.py'
Dec 05 11:41:13 compute-0 sudo[60817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:13 compute-0 python3.9[60819]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:41:15 compute-0 sudo[60817]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:15 compute-0 sudo[60970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnehoermafdtkgukgouzhthgqebmixlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934875.2379022-52-202819040542115/AnsiballZ_setup.py'
Dec 05 11:41:15 compute-0 sudo[60970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:15 compute-0 python3.9[60972]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:41:16 compute-0 sudo[60970]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:16 compute-0 sudo[61162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glyrmdkqhhcrvvraqgufqqugalfbmjlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934876.3877184-63-258145466477455/AnsiballZ_file.py'
Dec 05 11:41:16 compute-0 sudo[61162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:17 compute-0 python3.9[61164]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:41:17 compute-0 sudo[61162]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:17 compute-0 sudo[61314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qesbboidkogydhjdfaqdveimsrwddthh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934877.2833178-71-215774292920541/AnsiballZ_command.py'
Dec 05 11:41:17 compute-0 sudo[61314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:17 compute-0 python3.9[61316]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:41:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:41:17 compute-0 sudo[61314]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:18 compute-0 sudo[61478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omtoppphzrjsayzibxzvgnnlkhaydyui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934878.1423278-79-240541161313451/AnsiballZ_stat.py'
Dec 05 11:41:18 compute-0 sudo[61478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:18 compute-0 python3.9[61480]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:41:18 compute-0 sudo[61478]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:18 compute-0 sudo[61556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llmjrkqaevwtpumkjeffwwqrkwuwxcsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934878.1423278-79-240541161313451/AnsiballZ_file.py'
Dec 05 11:41:18 compute-0 sudo[61556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:19 compute-0 python3.9[61558]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:41:19 compute-0 sudo[61556]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:19 compute-0 sudo[61708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiyrggxnrjnzvraiostugiybydzxgcho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934879.3399272-91-169132812012611/AnsiballZ_stat.py'
Dec 05 11:41:19 compute-0 sudo[61708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:19 compute-0 python3.9[61710]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:41:19 compute-0 sudo[61708]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:20 compute-0 sudo[61786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyvrpxkhsjyrvjbmvasphtnxylrffsii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934879.3399272-91-169132812012611/AnsiballZ_file.py'
Dec 05 11:41:20 compute-0 sudo[61786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:20 compute-0 python3.9[61788]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:41:20 compute-0 sudo[61786]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:20 compute-0 sudo[61938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhtwpappajhthaghjefmzqzoddsxrvvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934880.4734137-104-62157807530274/AnsiballZ_ini_file.py'
Dec 05 11:41:20 compute-0 sudo[61938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:21 compute-0 python3.9[61940]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:41:21 compute-0 sudo[61938]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:21 compute-0 sudo[62090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryhfqbqrrqvgfjnkkwhsbootdbbusfdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934881.2634199-104-238821869162263/AnsiballZ_ini_file.py'
Dec 05 11:41:21 compute-0 sudo[62090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:21 compute-0 python3.9[62092]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:41:21 compute-0 sudo[62090]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:22 compute-0 sudo[62242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnibpujrqlvzbpphtdvdranhndtxpiuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934881.879471-104-201671140092366/AnsiballZ_ini_file.py'
Dec 05 11:41:22 compute-0 sudo[62242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:22 compute-0 python3.9[62244]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:41:22 compute-0 sudo[62242]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:22 compute-0 sudo[62394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klvscluopljtyldwsbflqjedfqczrgik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934882.5224328-104-122170620176438/AnsiballZ_ini_file.py'
Dec 05 11:41:22 compute-0 sudo[62394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:23 compute-0 python3.9[62396]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:41:23 compute-0 sudo[62394]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:23 compute-0 sudo[62546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdcstxhqdxoibjiqeqklbuumbzhdnwul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934883.3097644-135-22667882490297/AnsiballZ_dnf.py'
Dec 05 11:41:23 compute-0 sudo[62546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:23 compute-0 python3.9[62548]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:41:25 compute-0 sudo[62546]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:25 compute-0 sudo[62699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjcmndyydbbggwfvevlthijunzsvuxcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934885.5260053-146-236911527775918/AnsiballZ_setup.py'
Dec 05 11:41:25 compute-0 sudo[62699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:26 compute-0 python3.9[62701]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:41:26 compute-0 sudo[62699]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:26 compute-0 sudo[62853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdtmnxdmcmarpdefdllitdjtmjhpbjsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934886.3388565-154-8969904224857/AnsiballZ_stat.py'
Dec 05 11:41:26 compute-0 sudo[62853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:26 compute-0 python3.9[62855]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:41:26 compute-0 sudo[62853]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:27 compute-0 sudo[63005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptvbpwgorblcijhszicjfhwppjchwafn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934887.1203735-163-86638421320070/AnsiballZ_stat.py'
Dec 05 11:41:27 compute-0 sudo[63005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:27 compute-0 python3.9[63007]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:41:27 compute-0 sudo[63005]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:28 compute-0 sudo[63157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atwwqvnyvtvjerbpdgfhujjbggoedcdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934887.920297-173-111759622894208/AnsiballZ_command.py'
Dec 05 11:41:28 compute-0 sudo[63157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:28 compute-0 python3.9[63159]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:41:28 compute-0 sudo[63157]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:29 compute-0 sudo[63310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvlifxzokltxijtuznxwswjxldyfiwvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934888.724206-183-42082069160046/AnsiballZ_service_facts.py'
Dec 05 11:41:29 compute-0 sudo[63310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:29 compute-0 python3.9[63312]: ansible-service_facts Invoked
Dec 05 11:41:29 compute-0 network[63329]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 11:41:29 compute-0 network[63330]: 'network-scripts' will be removed from distribution in near future.
Dec 05 11:41:29 compute-0 network[63331]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 11:41:32 compute-0 sudo[63310]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:33 compute-0 sudo[63614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwbxrgldibumqkkshmktsiyhqpvthrdl ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764934892.8751235-198-222475069979823/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764934892.8751235-198-222475069979823/args'
Dec 05 11:41:33 compute-0 sudo[63614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:33 compute-0 sudo[63614]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:33 compute-0 sudo[63782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onfpxsbopwohucgkqvpoguqdumlbrppm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934893.5861435-209-37510788945485/AnsiballZ_dnf.py'
Dec 05 11:41:33 compute-0 sudo[63782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:34 compute-0 python3.9[63784]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:41:35 compute-0 sudo[63782]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:36 compute-0 sudo[63935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwsloefjxofmvlrclzmsqyeqzuqroiii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934895.7070212-222-243957734945945/AnsiballZ_package_facts.py'
Dec 05 11:41:36 compute-0 sudo[63935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:36 compute-0 python3.9[63937]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 05 11:41:36 compute-0 sudo[63935]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:37 compute-0 sudo[64088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxvroinjaxmgoxubxrclrcklggzmnmww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934897.2952838-232-41830739616004/AnsiballZ_stat.py'
Dec 05 11:41:37 compute-0 sudo[64088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:37 compute-0 python3.9[64090]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:41:37 compute-0 sudo[64088]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:38 compute-0 sudo[64213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peojqqlpgegmvdsaerdrgsbifxzloqtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934897.2952838-232-41830739616004/AnsiballZ_copy.py'
Dec 05 11:41:38 compute-0 sudo[64213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:38 compute-0 python3.9[64215]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934897.2952838-232-41830739616004/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:41:38 compute-0 sudo[64213]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:39 compute-0 sudo[64367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lugfbqbblgisdylijljgypkictmjfwxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934898.8885202-247-179010942012333/AnsiballZ_stat.py'
Dec 05 11:41:39 compute-0 sudo[64367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:39 compute-0 python3.9[64369]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:41:39 compute-0 sudo[64367]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:39 compute-0 sudo[64492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibsutktnycovjaewhuxlkivmgujuebsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934898.8885202-247-179010942012333/AnsiballZ_copy.py'
Dec 05 11:41:39 compute-0 sudo[64492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:39 compute-0 python3.9[64494]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934898.8885202-247-179010942012333/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:41:40 compute-0 sudo[64492]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:40 compute-0 sudo[64646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkaghnewsyredigtmhcdwwwnusvlajoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934900.4667222-268-273242670367515/AnsiballZ_lineinfile.py'
Dec 05 11:41:40 compute-0 sudo[64646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:41 compute-0 python3.9[64648]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:41:41 compute-0 sudo[64646]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:41 compute-0 sudo[64800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebaqogjiolemjkcsghhcadtoayzsngwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934901.676564-283-253543614895938/AnsiballZ_setup.py'
Dec 05 11:41:41 compute-0 sudo[64800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:42 compute-0 python3.9[64802]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:41:42 compute-0 sudo[64800]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:43 compute-0 sudo[64884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntyznnlpwgyyizxzdsapjhsclmaszjyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934901.676564-283-253543614895938/AnsiballZ_systemd.py'
Dec 05 11:41:43 compute-0 sudo[64884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:43 compute-0 python3.9[64886]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:41:43 compute-0 sudo[64884]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:44 compute-0 sudo[65038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azmzrqsjkltbilpavqmqnwfgztkehfwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934903.903656-299-138215067801232/AnsiballZ_setup.py'
Dec 05 11:41:44 compute-0 sudo[65038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:44 compute-0 python3.9[65040]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:41:44 compute-0 sudo[65038]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:45 compute-0 sudo[65122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zndsaheznvehpznzzydgrshhdwwoeksr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934903.903656-299-138215067801232/AnsiballZ_systemd.py'
Dec 05 11:41:45 compute-0 sudo[65122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:45 compute-0 python3.9[65124]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:41:45 compute-0 chronyd[781]: chronyd exiting
Dec 05 11:41:45 compute-0 systemd[1]: Stopping NTP client/server...
Dec 05 11:41:45 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Dec 05 11:41:45 compute-0 systemd[1]: Stopped NTP client/server.
Dec 05 11:41:45 compute-0 systemd[1]: Starting NTP client/server...
Dec 05 11:41:45 compute-0 chronyd[65133]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 05 11:41:45 compute-0 chronyd[65133]: Frequency -28.217 +/- 0.157 ppm read from /var/lib/chrony/drift
Dec 05 11:41:45 compute-0 chronyd[65133]: Loaded seccomp filter (level 2)
Dec 05 11:41:45 compute-0 systemd[1]: Started NTP client/server.
Dec 05 11:41:45 compute-0 sudo[65122]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:45 compute-0 sshd-session[60274]: Connection closed by 192.168.122.30 port 35660
Dec 05 11:41:45 compute-0 sshd-session[60271]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:41:45 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Dec 05 11:41:45 compute-0 systemd[1]: session-13.scope: Consumed 26.167s CPU time.
Dec 05 11:41:45 compute-0 systemd-logind[792]: Session 13 logged out. Waiting for processes to exit.
Dec 05 11:41:45 compute-0 systemd-logind[792]: Removed session 13.
Dec 05 11:41:51 compute-0 sshd-session[65159]: Accepted publickey for zuul from 192.168.122.30 port 36116 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:41:51 compute-0 systemd-logind[792]: New session 14 of user zuul.
Dec 05 11:41:51 compute-0 systemd[1]: Started Session 14 of User zuul.
Dec 05 11:41:51 compute-0 sshd-session[65159]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:41:52 compute-0 python3.9[65312]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:41:53 compute-0 sudo[65466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnbaoorzndhxnacplraytdmxcmywptlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934912.7251132-33-206765705798853/AnsiballZ_file.py'
Dec 05 11:41:53 compute-0 sudo[65466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:53 compute-0 python3.9[65468]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:41:53 compute-0 sudo[65466]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:54 compute-0 sudo[65641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gooocauzauallfcpgqmeljmrtibdscut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934913.7304108-41-107651532250321/AnsiballZ_stat.py'
Dec 05 11:41:54 compute-0 sudo[65641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:54 compute-0 python3.9[65643]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:41:54 compute-0 sudo[65641]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:54 compute-0 sudo[65719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kckfmcbdmyqdyyofqwpevuwidtobkyrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934913.7304108-41-107651532250321/AnsiballZ_file.py'
Dec 05 11:41:54 compute-0 sudo[65719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:54 compute-0 python3.9[65721]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.u26zqskb recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:41:54 compute-0 sudo[65719]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:55 compute-0 sudo[65871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpufeivzwihynkmyggryapilaxayxvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934915.2488701-61-19703514054571/AnsiballZ_stat.py'
Dec 05 11:41:55 compute-0 sudo[65871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:55 compute-0 python3.9[65873]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:41:55 compute-0 sudo[65871]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:56 compute-0 sudo[65994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjvnpwhpnsjdblwwirxuwgcrgmknypkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934915.2488701-61-19703514054571/AnsiballZ_copy.py'
Dec 05 11:41:56 compute-0 sudo[65994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:56 compute-0 python3.9[65996]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934915.2488701-61-19703514054571/.source _original_basename=.3n7_alwn follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:41:56 compute-0 sudo[65994]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:56 compute-0 sudo[66146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moumcyjdttsvxrrobqkfxhggvxhwnlno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934916.5240934-77-92936810025786/AnsiballZ_file.py'
Dec 05 11:41:56 compute-0 sudo[66146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:57 compute-0 python3.9[66148]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:41:57 compute-0 sudo[66146]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:57 compute-0 sudo[66298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maizxvsobnuyjkbdwepzirubijynhmwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934917.1844738-85-18260389674718/AnsiballZ_stat.py'
Dec 05 11:41:57 compute-0 sudo[66298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:57 compute-0 python3.9[66300]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:41:57 compute-0 sudo[66298]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:58 compute-0 sudo[66421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epzyhowivmxisdazvuphzqqigjadclsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934917.1844738-85-18260389674718/AnsiballZ_copy.py'
Dec 05 11:41:58 compute-0 sudo[66421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:58 compute-0 python3.9[66423]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934917.1844738-85-18260389674718/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:41:58 compute-0 sudo[66421]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:58 compute-0 sudo[66573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jojoulghsiquxefqmblscbntdgclqofq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934918.5976343-85-127597446393764/AnsiballZ_stat.py'
Dec 05 11:41:58 compute-0 sudo[66573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:59 compute-0 python3.9[66575]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:41:59 compute-0 sudo[66573]: pam_unix(sudo:session): session closed for user root
Dec 05 11:41:59 compute-0 sudo[66696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odxirxdutvfiyhppvaaepytexqxoquvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934918.5976343-85-127597446393764/AnsiballZ_copy.py'
Dec 05 11:41:59 compute-0 sudo[66696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:41:59 compute-0 python3.9[66698]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934918.5976343-85-127597446393764/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:41:59 compute-0 sudo[66696]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:00 compute-0 sudo[66848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fohkskptedvwppsksfiueidvslphynxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934919.82586-114-45462896400116/AnsiballZ_file.py'
Dec 05 11:42:00 compute-0 sudo[66848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:00 compute-0 python3.9[66850]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:00 compute-0 sudo[66848]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:00 compute-0 sudo[67000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mizedwjaknqhjhoyzbinjxhsggttnxgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934920.685287-122-259358495079560/AnsiballZ_stat.py'
Dec 05 11:42:00 compute-0 sudo[67000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:01 compute-0 python3.9[67002]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:01 compute-0 sudo[67000]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:01 compute-0 sudo[67123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugthunlehcnfykjydlrdppelnmsidwir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934920.685287-122-259358495079560/AnsiballZ_copy.py'
Dec 05 11:42:01 compute-0 sudo[67123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:01 compute-0 python3.9[67125]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934920.685287-122-259358495079560/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:01 compute-0 sudo[67123]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:02 compute-0 sudo[67275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tylwnarvxmbmkubraehnrykimxerytfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934921.9063-137-141056654860834/AnsiballZ_stat.py'
Dec 05 11:42:02 compute-0 sudo[67275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:02 compute-0 python3.9[67277]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:02 compute-0 sudo[67275]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:02 compute-0 sudo[67398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hajwabgtcsosqztnestcnrmmlpwzniha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934921.9063-137-141056654860834/AnsiballZ_copy.py'
Dec 05 11:42:02 compute-0 sudo[67398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:03 compute-0 python3.9[67400]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934921.9063-137-141056654860834/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:03 compute-0 sudo[67398]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:03 compute-0 sudo[67550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbjkydvfswsmalxvxvcazqscjvntvcof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934923.2552416-152-242555369055945/AnsiballZ_systemd.py'
Dec 05 11:42:03 compute-0 sudo[67550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:04 compute-0 python3.9[67552]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:42:04 compute-0 systemd[1]: Reloading.
Dec 05 11:42:04 compute-0 systemd-rc-local-generator[67579]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:42:04 compute-0 systemd-sysv-generator[67582]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:42:04 compute-0 systemd[1]: Reloading.
Dec 05 11:42:04 compute-0 systemd-rc-local-generator[67617]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:42:04 compute-0 systemd-sysv-generator[67620]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:42:04 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Dec 05 11:42:04 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Dec 05 11:42:04 compute-0 sudo[67550]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:05 compute-0 sudo[67778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dapjotgekktciarthybrvqmzxuoydoll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934924.8423848-160-269010773904548/AnsiballZ_stat.py'
Dec 05 11:42:05 compute-0 sudo[67778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:05 compute-0 python3.9[67780]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:05 compute-0 sudo[67778]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:05 compute-0 sudo[67901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsgwquupyljpzbqqnklwzzugvciiwgeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934924.8423848-160-269010773904548/AnsiballZ_copy.py'
Dec 05 11:42:05 compute-0 sudo[67901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:05 compute-0 python3.9[67903]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934924.8423848-160-269010773904548/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:05 compute-0 sudo[67901]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:06 compute-0 sudo[68053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acqxhussfqnogdtrmokezrhsxhomyhrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934926.0552874-175-49225559708041/AnsiballZ_stat.py'
Dec 05 11:42:06 compute-0 sudo[68053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:06 compute-0 python3.9[68055]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:06 compute-0 sudo[68053]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:06 compute-0 sudo[68176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uskjlseqroozghqeoxlziihgidqfmerh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934926.0552874-175-49225559708041/AnsiballZ_copy.py'
Dec 05 11:42:06 compute-0 sudo[68176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:07 compute-0 python3.9[68178]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934926.0552874-175-49225559708041/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:07 compute-0 sudo[68176]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:07 compute-0 sudo[68328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejltolnhmuxzsdbchbyhygdgrcnndspi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934927.364761-190-38591991612430/AnsiballZ_systemd.py'
Dec 05 11:42:07 compute-0 sudo[68328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:07 compute-0 python3.9[68330]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:42:07 compute-0 systemd[1]: Reloading.
Dec 05 11:42:08 compute-0 systemd-rc-local-generator[68358]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:42:08 compute-0 systemd-sysv-generator[68361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:42:08 compute-0 systemd[1]: Reloading.
Dec 05 11:42:08 compute-0 systemd-rc-local-generator[68395]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:42:08 compute-0 systemd-sysv-generator[68398]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:42:08 compute-0 systemd[1]: Starting Create netns directory...
Dec 05 11:42:08 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 11:42:08 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 11:42:08 compute-0 systemd[1]: Finished Create netns directory.
Dec 05 11:42:08 compute-0 sudo[68328]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:09 compute-0 python3.9[68556]: ansible-ansible.builtin.service_facts Invoked
Dec 05 11:42:09 compute-0 network[68573]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 11:42:09 compute-0 network[68574]: 'network-scripts' will be removed from distribution in near future.
Dec 05 11:42:09 compute-0 network[68575]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 11:42:12 compute-0 sudo[68835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqjswwafxbvtdgacinkqekwlsywlqhvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934932.160063-206-86567336954647/AnsiballZ_systemd.py'
Dec 05 11:42:12 compute-0 sudo[68835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:12 compute-0 python3.9[68837]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:42:12 compute-0 systemd[1]: Reloading.
Dec 05 11:42:12 compute-0 systemd-rc-local-generator[68870]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:42:12 compute-0 systemd-sysv-generator[68874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:42:13 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 05 11:42:13 compute-0 iptables.init[68879]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 05 11:42:13 compute-0 iptables.init[68879]: iptables: Flushing firewall rules: [  OK  ]
Dec 05 11:42:13 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Dec 05 11:42:13 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 05 11:42:13 compute-0 sudo[68835]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:13 compute-0 sudo[69073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxhdcrnmwxvzzmrjovnemtufhsovqpra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934933.6377048-206-29140553164331/AnsiballZ_systemd.py'
Dec 05 11:42:13 compute-0 sudo[69073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:14 compute-0 python3.9[69075]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:42:14 compute-0 sudo[69073]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:14 compute-0 sudo[69227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfsnfycrqmsgrdzfnmynzahskqbapulv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934934.589965-222-86530281170466/AnsiballZ_systemd.py'
Dec 05 11:42:14 compute-0 sudo[69227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:15 compute-0 python3.9[69229]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:42:15 compute-0 systemd[1]: Reloading.
Dec 05 11:42:15 compute-0 systemd-sysv-generator[69263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:42:15 compute-0 systemd-rc-local-generator[69259]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:42:15 compute-0 systemd[1]: Starting Netfilter Tables...
Dec 05 11:42:15 compute-0 systemd[1]: Finished Netfilter Tables.
Dec 05 11:42:15 compute-0 sudo[69227]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:16 compute-0 sudo[69419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebdwhaxuqbjizvlygtwgkbynrdooygrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934935.761289-230-136179969026132/AnsiballZ_command.py'
Dec 05 11:42:16 compute-0 sudo[69419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:16 compute-0 python3.9[69421]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:42:16 compute-0 sudo[69419]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:17 compute-0 sudo[69572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvtunalqgazeneypljvfeqcidalwfeah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934936.8009057-244-23077902056028/AnsiballZ_stat.py'
Dec 05 11:42:17 compute-0 sudo[69572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:17 compute-0 python3.9[69574]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:17 compute-0 sudo[69572]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:17 compute-0 sudo[69697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehmfvbfpvbrjfhxdasvzfyeygwerwxfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934936.8009057-244-23077902056028/AnsiballZ_copy.py'
Dec 05 11:42:17 compute-0 sudo[69697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:17 compute-0 python3.9[69699]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934936.8009057-244-23077902056028/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:17 compute-0 sudo[69697]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:18 compute-0 sudo[69850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcgwpqvfnjlcbvwkdbbrhdhjqglykalz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934937.9965515-259-256183186142275/AnsiballZ_systemd.py'
Dec 05 11:42:18 compute-0 sudo[69850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:18 compute-0 python3.9[69852]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:42:18 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Dec 05 11:42:18 compute-0 sshd[1005]: Received SIGHUP; restarting.
Dec 05 11:42:18 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Dec 05 11:42:18 compute-0 sshd[1005]: Server listening on 0.0.0.0 port 22.
Dec 05 11:42:18 compute-0 sshd[1005]: Server listening on :: port 22.
Dec 05 11:42:18 compute-0 sudo[69850]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:19 compute-0 sudo[70006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liyoipcujmpuwqpuerciekbfxwzirbdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934938.915736-267-235431973543045/AnsiballZ_file.py'
Dec 05 11:42:19 compute-0 sudo[70006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:19 compute-0 python3.9[70008]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:19 compute-0 sudo[70006]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:19 compute-0 sudo[70158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrymajvcibrturmjbkylwgagnvvhxeob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934939.5496688-275-172429782715528/AnsiballZ_stat.py'
Dec 05 11:42:19 compute-0 sudo[70158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:19 compute-0 python3.9[70160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:20 compute-0 sudo[70158]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:20 compute-0 sudo[70281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxsnxfotnnygfunqozdmtvodfvlwktzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934939.5496688-275-172429782715528/AnsiballZ_copy.py'
Dec 05 11:42:20 compute-0 sudo[70281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:20 compute-0 python3.9[70283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934939.5496688-275-172429782715528/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:20 compute-0 sudo[70281]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:21 compute-0 sudo[70433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcdpujrhvffdcjpcutfdzgscsmfvrjjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934940.854753-293-184854121477311/AnsiballZ_timezone.py'
Dec 05 11:42:21 compute-0 sudo[70433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:21 compute-0 python3.9[70435]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 05 11:42:21 compute-0 systemd[1]: Starting Time & Date Service...
Dec 05 11:42:21 compute-0 systemd[1]: Started Time & Date Service.
Dec 05 11:42:21 compute-0 sudo[70433]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:22 compute-0 sudo[70589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlqgjmlrhzflejyiwqwfqpgwqxrfswhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934941.8818798-302-41639435766739/AnsiballZ_file.py'
Dec 05 11:42:22 compute-0 sudo[70589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:22 compute-0 python3.9[70591]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:22 compute-0 sudo[70589]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:22 compute-0 sudo[70741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upuiivquhkxlgjuqkxtjcrsamgpidiwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934942.4963765-310-225825422247797/AnsiballZ_stat.py'
Dec 05 11:42:22 compute-0 sudo[70741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:22 compute-0 python3.9[70743]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:22 compute-0 sudo[70741]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:23 compute-0 sudo[70864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggobeowiajmjlengmcxppfokunqbwjlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934942.4963765-310-225825422247797/AnsiballZ_copy.py'
Dec 05 11:42:23 compute-0 sudo[70864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:23 compute-0 python3.9[70866]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934942.4963765-310-225825422247797/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:23 compute-0 sudo[70864]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:23 compute-0 sudo[71016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oamtrkgrcwfdvmohnfneddpckifreeiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934943.6060627-325-172707249628996/AnsiballZ_stat.py'
Dec 05 11:42:23 compute-0 sudo[71016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:24 compute-0 python3.9[71018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:24 compute-0 sudo[71016]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:24 compute-0 sudo[71139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfqkgaeuvvexieuirmpykvtnrecpfigj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934943.6060627-325-172707249628996/AnsiballZ_copy.py'
Dec 05 11:42:24 compute-0 sudo[71139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:24 compute-0 python3.9[71141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934943.6060627-325-172707249628996/.source.yaml _original_basename=.6qka8te6 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:24 compute-0 sudo[71139]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:25 compute-0 sudo[71291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnnvpxjcykdlshqxayoshyjarxnvhxyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934944.9808588-340-170395583055235/AnsiballZ_stat.py'
Dec 05 11:42:25 compute-0 sudo[71291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:25 compute-0 python3.9[71293]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:25 compute-0 sudo[71291]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:25 compute-0 sudo[71414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtvfitrpcthtcfgvpzvvovziqedqlehk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934944.9808588-340-170395583055235/AnsiballZ_copy.py'
Dec 05 11:42:25 compute-0 sudo[71414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:25 compute-0 python3.9[71416]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934944.9808588-340-170395583055235/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:25 compute-0 sudo[71414]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:26 compute-0 sudo[71566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcyvbqatoekhotkxmzxjfplqjemgxzky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934946.1526868-355-123199135381855/AnsiballZ_command.py'
Dec 05 11:42:26 compute-0 sudo[71566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:26 compute-0 python3.9[71568]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:42:26 compute-0 sudo[71566]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:27 compute-0 sudo[71719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tksyieypsqproollgafgrtuspgarmfmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934946.8377085-363-278461415153481/AnsiballZ_command.py'
Dec 05 11:42:27 compute-0 sudo[71719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:27 compute-0 python3.9[71721]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:42:27 compute-0 sudo[71719]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:28 compute-0 sudo[71872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrqgehamoseqcrxzppkvxszjkovhlhxo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764934947.5752501-371-136742305646555/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 11:42:28 compute-0 sudo[71872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:28 compute-0 python3[71874]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 11:42:28 compute-0 sudo[71872]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:28 compute-0 sudo[72024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjcsmgsxyrniiozxseqmtszznxpctbfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934948.5632546-379-257552289614687/AnsiballZ_stat.py'
Dec 05 11:42:28 compute-0 sudo[72024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:29 compute-0 python3.9[72026]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:29 compute-0 sudo[72024]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:29 compute-0 sudo[72147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoyrkhtiihgzhvsabhgebbubbmsftzij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934948.5632546-379-257552289614687/AnsiballZ_copy.py'
Dec 05 11:42:29 compute-0 sudo[72147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:29 compute-0 python3.9[72149]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934948.5632546-379-257552289614687/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:29 compute-0 sudo[72147]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:30 compute-0 sudo[72299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxuhwldmlxqmmwawicklztmedkuwmjue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934949.821446-394-104587280866089/AnsiballZ_stat.py'
Dec 05 11:42:30 compute-0 sudo[72299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:30 compute-0 python3.9[72301]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:30 compute-0 sudo[72299]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:30 compute-0 sudo[72422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndbouhowlsbsonuokyewzxsghpwsmiow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934949.821446-394-104587280866089/AnsiballZ_copy.py'
Dec 05 11:42:30 compute-0 sudo[72422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:30 compute-0 python3.9[72424]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934949.821446-394-104587280866089/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:31 compute-0 sudo[72422]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:31 compute-0 sudo[72574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irhzkgswdbepgfvqnrrbzxqlubrwceqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934951.1600826-409-43989239146490/AnsiballZ_stat.py'
Dec 05 11:42:31 compute-0 sudo[72574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:31 compute-0 python3.9[72576]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:31 compute-0 sudo[72574]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:31 compute-0 sudo[72697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqasdhalnrnrdfnjxceomdglpvldttaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934951.1600826-409-43989239146490/AnsiballZ_copy.py'
Dec 05 11:42:31 compute-0 sudo[72697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:32 compute-0 python3.9[72699]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934951.1600826-409-43989239146490/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:32 compute-0 sudo[72697]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:32 compute-0 sudo[72849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfgllgtagjwkymklifbcvtfgouoelwgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934952.314305-424-183097609432172/AnsiballZ_stat.py'
Dec 05 11:42:32 compute-0 sudo[72849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:32 compute-0 python3.9[72851]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:32 compute-0 sudo[72849]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:33 compute-0 sudo[72972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqkqmjgqyqhgsbeiwwshtcxbsekkpwmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934952.314305-424-183097609432172/AnsiballZ_copy.py'
Dec 05 11:42:33 compute-0 sudo[72972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:33 compute-0 python3.9[72974]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934952.314305-424-183097609432172/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:33 compute-0 sudo[72972]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:33 compute-0 sudo[73124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yagxpkklorqrvvycfrgcpaljxsvhconv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934953.4934638-439-248623026112351/AnsiballZ_stat.py'
Dec 05 11:42:33 compute-0 sudo[73124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:34 compute-0 python3.9[73126]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:42:34 compute-0 sudo[73124]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:34 compute-0 sudo[73247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhuqxtrxyympzjvayaclkbnsphojajva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934953.4934638-439-248623026112351/AnsiballZ_copy.py'
Dec 05 11:42:34 compute-0 sudo[73247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:34 compute-0 python3.9[73249]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934953.4934638-439-248623026112351/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:34 compute-0 sudo[73247]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:35 compute-0 sudo[73399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jksbxkujeehzrqbfkhtpmbyadagcbnqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934955.0571868-454-98727522679812/AnsiballZ_file.py'
Dec 05 11:42:35 compute-0 sudo[73399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:35 compute-0 python3.9[73401]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:35 compute-0 sudo[73399]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:36 compute-0 sudo[73551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzmxgacdfcyghkrzhunfbptiznqcqmgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934955.719614-462-245005080188516/AnsiballZ_command.py'
Dec 05 11:42:36 compute-0 sudo[73551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:36 compute-0 python3.9[73553]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:42:36 compute-0 sudo[73551]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:37 compute-0 sudo[73710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awvmoaidvuuywfhwnewkvnwyyspnummy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934956.6061664-470-216460198952904/AnsiballZ_blockinfile.py'
Dec 05 11:42:37 compute-0 sudo[73710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:37 compute-0 python3.9[73712]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:37 compute-0 sudo[73710]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:37 compute-0 sudo[73863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-titxweuhxsrmhjszvniwqwmjkyazeulo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934957.629965-479-252515844660295/AnsiballZ_file.py'
Dec 05 11:42:37 compute-0 sudo[73863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:38 compute-0 python3.9[73865]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:38 compute-0 sudo[73863]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:38 compute-0 sudo[74015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xollpwvjjsjzeucxvfslaajksykzzlem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934958.2893996-479-276468919748754/AnsiballZ_file.py'
Dec 05 11:42:38 compute-0 sudo[74015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:38 compute-0 python3.9[74017]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:38 compute-0 sudo[74015]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:39 compute-0 sudo[74167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvysdrmsfchikdhgsshnnsnoqevgfjaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934959.0551574-494-93549812601975/AnsiballZ_mount.py'
Dec 05 11:42:39 compute-0 sudo[74167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:39 compute-0 python3.9[74169]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 05 11:42:39 compute-0 sudo[74167]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:39 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 11:42:39 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 11:42:40 compute-0 sudo[74321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dctdojtvsfleptawqdxknwfldmwhrkys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934959.8751907-494-132787723154549/AnsiballZ_mount.py'
Dec 05 11:42:40 compute-0 sudo[74321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:40 compute-0 python3.9[74323]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 05 11:42:40 compute-0 sudo[74321]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:40 compute-0 sshd-session[65162]: Connection closed by 192.168.122.30 port 36116
Dec 05 11:42:40 compute-0 sshd-session[65159]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:42:40 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Dec 05 11:42:40 compute-0 systemd[1]: session-14.scope: Consumed 35.779s CPU time.
Dec 05 11:42:40 compute-0 systemd-logind[792]: Session 14 logged out. Waiting for processes to exit.
Dec 05 11:42:40 compute-0 systemd-logind[792]: Removed session 14.
Dec 05 11:42:45 compute-0 sshd-session[74349]: Accepted publickey for zuul from 192.168.122.30 port 41372 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:42:45 compute-0 systemd-logind[792]: New session 15 of user zuul.
Dec 05 11:42:45 compute-0 systemd[1]: Started Session 15 of User zuul.
Dec 05 11:42:45 compute-0 sshd-session[74349]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:42:46 compute-0 sudo[74502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfhtjfjkjxafqqahruyzbyvqcmiwlrfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934965.798381-16-47400853618276/AnsiballZ_tempfile.py'
Dec 05 11:42:46 compute-0 sudo[74502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:46 compute-0 python3.9[74504]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 05 11:42:46 compute-0 sudo[74502]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:47 compute-0 sudo[74654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sihyhbtjzicgkpiwmyhgjlqtrdiagkmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934966.7102945-28-129130371195653/AnsiballZ_stat.py'
Dec 05 11:42:47 compute-0 sudo[74654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:47 compute-0 python3.9[74656]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:42:47 compute-0 sudo[74654]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:48 compute-0 sudo[74808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsikdmcsznevrqillydnlwbslwtwbweb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934967.566009-38-182132453181780/AnsiballZ_setup.py'
Dec 05 11:42:48 compute-0 sudo[74808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:48 compute-0 sshd-session[74657]: Received disconnect from 189.47.10.39 port 41426:11: Bye Bye [preauth]
Dec 05 11:42:48 compute-0 sshd-session[74657]: Disconnected from authenticating user root 189.47.10.39 port 41426 [preauth]
Dec 05 11:42:48 compute-0 python3.9[74810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:42:48 compute-0 sudo[74808]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:49 compute-0 sudo[74961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkgpkwikpeubqepkpfnryszjerjjwpxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934968.715684-47-95911848594375/AnsiballZ_blockinfile.py'
Dec 05 11:42:49 compute-0 sudo[74961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:49 compute-0 python3.9[74963]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDOOJVLNbUbXdj/4hnrYG4cBJ/XBhnOoBFKwcsgC1GK+qApIMq4AN9kvFcb69ro6VcgRjtYliCnG5TgVk5pEiS1s8BUyor5fl0SA7FEpF0wOrdG4O4svVl67EKJIrumCPiiYqOkFmxa0uzeWPYlZ9KqmqEFUnfIiBTd3g6oXgX3xUSLM5zhum9rmbo9Wyct2IWSctskbdxSj61pQz84UKinUZfbFbt19R+7hSrz0o7kIRXpX5+BscttG0pmvh21pzIH9KboW12wgqdcPLZTCL4ZLUUBWBzaSkzpbPxeyg1EbbhJyVwuLVOeuFOUblr0KGmQRtOK1XN/BaC0kpkyJEKbqz3tih8cv6n8Fu/lKoNaEratukKNtnRn1v+UD25/a3bMr2Nap67lLNSKPb8hVksVc8I5GfqPl6mpDnsUffi6+2DGh/lXs79VfAJHNhMDo947VU8ntcLR7oebE+e1CWNvLHDxVg2UFD2KjXGUogS+6G2FC7a7LJCHjL+Ul7C0SGE=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHRqsyTsoXMhvSezBrIRTqtqwVV7Nl5EvVW1GsgltpO/
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNPaTPF1sw2zD0sNQJEE1DG4TP9pcJarMUZH8Q9jzRRo4RTGVHJcz3S0FZ2fsO8PHdQzacHi17HUsogXh1a47/E=
                                             create=True mode=0644 path=/tmp/ansible.gfdz24ir state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:49 compute-0 sudo[74961]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:49 compute-0 sudo[75113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acbbptzigpjflmfozhtswlzgxwqvjxaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934969.5242083-55-171390346875799/AnsiballZ_command.py'
Dec 05 11:42:49 compute-0 sudo[75113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:50 compute-0 python3.9[75115]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.gfdz24ir' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:42:50 compute-0 sudo[75113]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:50 compute-0 sudo[75267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pitupzdxiqvstkecjcbdwmsskatobdtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934970.2581947-63-13739002272661/AnsiballZ_file.py'
Dec 05 11:42:50 compute-0 sudo[75267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:50 compute-0 python3.9[75269]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.gfdz24ir state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:42:50 compute-0 sudo[75267]: pam_unix(sudo:session): session closed for user root
Dec 05 11:42:51 compute-0 sshd-session[74352]: Connection closed by 192.168.122.30 port 41372
Dec 05 11:42:51 compute-0 sshd-session[74349]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:42:51 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Dec 05 11:42:51 compute-0 systemd[1]: session-15.scope: Consumed 3.365s CPU time.
Dec 05 11:42:51 compute-0 systemd-logind[792]: Session 15 logged out. Waiting for processes to exit.
Dec 05 11:42:51 compute-0 systemd-logind[792]: Removed session 15.
Dec 05 11:42:51 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 05 11:42:57 compute-0 sshd-session[75296]: Accepted publickey for zuul from 192.168.122.30 port 54880 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:42:57 compute-0 systemd-logind[792]: New session 16 of user zuul.
Dec 05 11:42:57 compute-0 systemd[1]: Started Session 16 of User zuul.
Dec 05 11:42:57 compute-0 sshd-session[75296]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:42:58 compute-0 python3.9[75449]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:42:59 compute-0 sudo[75603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojodbyltveclynrtbmspwjpqxnvzdojh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934978.6727157-32-119915257045055/AnsiballZ_systemd.py'
Dec 05 11:42:59 compute-0 sudo[75603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:42:59 compute-0 python3.9[75605]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 05 11:42:59 compute-0 sudo[75603]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:00 compute-0 sudo[75757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwltgxldcelaaxnvnsvvbvswejlzbiku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934979.7760162-40-173185674347814/AnsiballZ_systemd.py'
Dec 05 11:43:00 compute-0 sudo[75757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:00 compute-0 python3.9[75759]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:43:00 compute-0 sudo[75757]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:01 compute-0 sudo[75910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlxyskvbbuqxwpyavdzzalarcvcaxand ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934980.674378-49-157668504521592/AnsiballZ_command.py'
Dec 05 11:43:01 compute-0 sudo[75910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:01 compute-0 python3.9[75912]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:43:01 compute-0 sudo[75910]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:02 compute-0 sudo[76063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gegcfmtkkgkpwmmnyyshczkonowspyxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934981.5379658-57-180830170898687/AnsiballZ_stat.py'
Dec 05 11:43:02 compute-0 sudo[76063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:02 compute-0 python3.9[76065]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:43:02 compute-0 sudo[76063]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:02 compute-0 sudo[76217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdnifdytgyorjwhkfldslvbrdahmpwbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934982.5683637-65-259323838790875/AnsiballZ_command.py'
Dec 05 11:43:02 compute-0 sudo[76217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:03 compute-0 python3.9[76219]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:43:03 compute-0 sudo[76217]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:03 compute-0 sudo[76372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnoojfomkrfjvbgzyfzxgbdczidsrqnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934983.3358543-73-35437663676637/AnsiballZ_file.py'
Dec 05 11:43:03 compute-0 sudo[76372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:03 compute-0 python3.9[76374]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:03 compute-0 sudo[76372]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:04 compute-0 sshd-session[75299]: Connection closed by 192.168.122.30 port 54880
Dec 05 11:43:04 compute-0 sshd-session[75296]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:43:04 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Dec 05 11:43:04 compute-0 systemd[1]: session-16.scope: Consumed 4.652s CPU time.
Dec 05 11:43:04 compute-0 systemd-logind[792]: Session 16 logged out. Waiting for processes to exit.
Dec 05 11:43:04 compute-0 systemd-logind[792]: Removed session 16.
Dec 05 11:43:10 compute-0 sshd-session[76399]: Accepted publickey for zuul from 192.168.122.30 port 36844 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:43:10 compute-0 systemd-logind[792]: New session 17 of user zuul.
Dec 05 11:43:10 compute-0 systemd[1]: Started Session 17 of User zuul.
Dec 05 11:43:10 compute-0 sshd-session[76399]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:43:11 compute-0 python3.9[76552]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:43:11 compute-0 sudo[76706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifudlhpcbzyjzdjhufuufpdnhxzcnjcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934991.5451093-34-165642398418966/AnsiballZ_setup.py'
Dec 05 11:43:11 compute-0 sudo[76706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:12 compute-0 python3.9[76708]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:43:12 compute-0 sudo[76706]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:12 compute-0 sudo[76790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aynvcozlrhvcwdfuvouvjfopcgejvizx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764934991.5451093-34-165642398418966/AnsiballZ_dnf.py'
Dec 05 11:43:12 compute-0 sudo[76790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:12 compute-0 python3.9[76792]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 11:43:14 compute-0 sudo[76790]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:15 compute-0 python3.9[76943]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:43:16 compute-0 python3.9[77094]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 11:43:17 compute-0 python3.9[77244]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:43:17 compute-0 python3.9[77394]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:43:18 compute-0 sshd-session[76402]: Connection closed by 192.168.122.30 port 36844
Dec 05 11:43:18 compute-0 sshd-session[76399]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:43:18 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Dec 05 11:43:18 compute-0 systemd[1]: session-17.scope: Consumed 5.914s CPU time.
Dec 05 11:43:18 compute-0 systemd-logind[792]: Session 17 logged out. Waiting for processes to exit.
Dec 05 11:43:18 compute-0 systemd-logind[792]: Removed session 17.
Dec 05 11:43:23 compute-0 sshd-session[77419]: Accepted publickey for zuul from 192.168.122.30 port 33422 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:43:23 compute-0 systemd-logind[792]: New session 18 of user zuul.
Dec 05 11:43:24 compute-0 systemd[1]: Started Session 18 of User zuul.
Dec 05 11:43:24 compute-0 sshd-session[77419]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:43:24 compute-0 python3.9[77572]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:43:26 compute-0 sudo[77726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckpbpplqhfyjfyambpppojzfpmlanbkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935006.0354147-50-192326799043072/AnsiballZ_file.py'
Dec 05 11:43:26 compute-0 sudo[77726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:26 compute-0 python3.9[77728]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:26 compute-0 sudo[77726]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:27 compute-0 sudo[77878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olihpohgiudfmkflrmcmhtdetzygxvsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935006.8139205-50-27087499700682/AnsiballZ_file.py'
Dec 05 11:43:27 compute-0 sudo[77878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:27 compute-0 python3.9[77880]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:27 compute-0 sudo[77878]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:27 compute-0 sudo[78030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waarhlswszfjykybxjmmbnwmllqynpeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935007.4767478-65-235229637528272/AnsiballZ_stat.py'
Dec 05 11:43:27 compute-0 sudo[78030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:28 compute-0 python3.9[78032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:28 compute-0 sudo[78030]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:28 compute-0 sudo[78153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygcotwtrovwrckflzkuoqlshugowuzzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935007.4767478-65-235229637528272/AnsiballZ_copy.py'
Dec 05 11:43:28 compute-0 sudo[78153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:28 compute-0 python3.9[78155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935007.4767478-65-235229637528272/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=00d5fa7b0776dc8691dd5500aa71eb90319347cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:28 compute-0 sudo[78153]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:29 compute-0 sudo[78305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwpnyokpuccuzvnlokicdenlqtlumwcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935008.9395278-65-231967888147611/AnsiballZ_stat.py'
Dec 05 11:43:29 compute-0 sudo[78305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:29 compute-0 python3.9[78307]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:29 compute-0 sudo[78305]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:29 compute-0 sudo[78428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruyjkpxgvshrcjkcfgzxbapcjkbyigac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935008.9395278-65-231967888147611/AnsiballZ_copy.py'
Dec 05 11:43:29 compute-0 sudo[78428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:29 compute-0 python3.9[78430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935008.9395278-65-231967888147611/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=0a4f92add1239ac937855979d4dc1394f101fc06 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:30 compute-0 sudo[78428]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:30 compute-0 sudo[78580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssjflszbxuhzxxlvhhwgcmgrevmtadon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935010.1437294-65-168593357188580/AnsiballZ_stat.py'
Dec 05 11:43:30 compute-0 sudo[78580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:30 compute-0 python3.9[78582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:30 compute-0 sudo[78580]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:31 compute-0 sudo[78703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txjyzdgvrkpgrkzzryvauqkdyxdvjgkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935010.1437294-65-168593357188580/AnsiballZ_copy.py'
Dec 05 11:43:31 compute-0 sudo[78703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:31 compute-0 python3.9[78705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935010.1437294-65-168593357188580/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=fa258e12043967341bc94c488e45ada802ba7070 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:31 compute-0 sudo[78703]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:31 compute-0 sudo[78855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqxugpwiyyizmjwbojduzcsgrvtgvpex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935011.4362833-109-266393594616864/AnsiballZ_file.py'
Dec 05 11:43:31 compute-0 sudo[78855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:31 compute-0 python3.9[78857]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:32 compute-0 sudo[78855]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:32 compute-0 sudo[79007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsirfbkslhfxrmxuzkwnprogpggzszot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935012.2107553-109-134641611301330/AnsiballZ_file.py'
Dec 05 11:43:32 compute-0 sudo[79007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:32 compute-0 python3.9[79009]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:32 compute-0 sudo[79007]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:33 compute-0 sudo[79159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egjuswosyshwafwfrxntilnyacjttfkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935012.9734132-124-167366774484041/AnsiballZ_stat.py'
Dec 05 11:43:33 compute-0 sudo[79159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:33 compute-0 python3.9[79161]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:33 compute-0 sudo[79159]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:33 compute-0 sudo[79282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqgvmkvsssctuzteeqryjrkgqbjsawrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935012.9734132-124-167366774484041/AnsiballZ_copy.py'
Dec 05 11:43:33 compute-0 sudo[79282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:34 compute-0 python3.9[79284]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935012.9734132-124-167366774484041/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=bee7f316921a290d1ed3611022ac3ada626797f9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:34 compute-0 sudo[79282]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:34 compute-0 sudo[79434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzffyafpuawoyzssqwvbhevitbltjfyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935014.3690202-124-244652408961771/AnsiballZ_stat.py'
Dec 05 11:43:34 compute-0 sudo[79434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:34 compute-0 python3.9[79436]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:34 compute-0 sudo[79434]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:35 compute-0 sudo[79557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cypaykmmaedxurkefsueyffffdlubnzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935014.3690202-124-244652408961771/AnsiballZ_copy.py'
Dec 05 11:43:35 compute-0 sudo[79557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:35 compute-0 python3.9[79559]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935014.3690202-124-244652408961771/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=97dce75f31f2e462f65a5480df34ce7f592c6afa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:35 compute-0 sudo[79557]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:35 compute-0 sudo[79709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttahicrzpffsquqbozxwhmhidltshdrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935015.5865023-124-66794612019804/AnsiballZ_stat.py'
Dec 05 11:43:35 compute-0 sudo[79709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:36 compute-0 python3.9[79711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:36 compute-0 sudo[79709]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:36 compute-0 sudo[79832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbqkazynllmljoldtprxwddqiyabkbrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935015.5865023-124-66794612019804/AnsiballZ_copy.py'
Dec 05 11:43:36 compute-0 sudo[79832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:36 compute-0 python3.9[79834]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935015.5865023-124-66794612019804/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=0a40c100a83cd94b03805d76969b458dfcb20500 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:36 compute-0 sudo[79832]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:37 compute-0 sudo[79984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afwtpkfknfyzeykfjdangxrbcnasxfcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935016.8655047-168-205213212578224/AnsiballZ_file.py'
Dec 05 11:43:37 compute-0 sudo[79984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:37 compute-0 python3.9[79986]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:37 compute-0 sudo[79984]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:37 compute-0 sudo[80136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utethsdrgylxwdrvgjhxusekqbtznqui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935017.5342429-168-66344875552772/AnsiballZ_file.py'
Dec 05 11:43:37 compute-0 sudo[80136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:38 compute-0 python3.9[80138]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:38 compute-0 sudo[80136]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:38 compute-0 sudo[80288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoegawuamoupgsxfeqfsegyjklgosjok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935018.2274585-183-235252584515357/AnsiballZ_stat.py'
Dec 05 11:43:38 compute-0 sudo[80288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:38 compute-0 python3.9[80290]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:38 compute-0 sudo[80288]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:39 compute-0 sudo[80411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skoaigttrfjsiohgofhqiiecgoyiesds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935018.2274585-183-235252584515357/AnsiballZ_copy.py'
Dec 05 11:43:39 compute-0 sudo[80411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:39 compute-0 python3.9[80413]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935018.2274585-183-235252584515357/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=9b2aaaa14407d7741dbcde8ff451c94476d44925 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:39 compute-0 sudo[80411]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:39 compute-0 sudo[80563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caxaaygtkxqbwahapzwtipbedyxalxia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935019.4720778-183-52161796388886/AnsiballZ_stat.py'
Dec 05 11:43:39 compute-0 sudo[80563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:39 compute-0 python3.9[80565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:39 compute-0 sudo[80563]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:40 compute-0 sudo[80686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikdbkpkvgrfojuqunjxxbaokmqkoeywl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935019.4720778-183-52161796388886/AnsiballZ_copy.py'
Dec 05 11:43:40 compute-0 sudo[80686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:40 compute-0 python3.9[80688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935019.4720778-183-52161796388886/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=966f06ea34331b21eb0c32c6650a16517e61449e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:40 compute-0 sudo[80686]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:40 compute-0 sudo[80838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zttsbzimtjreljsyrfbytzjfsusvzxpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935020.5904458-183-178846736807947/AnsiballZ_stat.py'
Dec 05 11:43:40 compute-0 sudo[80838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:41 compute-0 python3.9[80840]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:41 compute-0 sudo[80838]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:41 compute-0 sudo[80961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxpscskbczqqimuwvachnohnajujwitm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935020.5904458-183-178846736807947/AnsiballZ_copy.py'
Dec 05 11:43:41 compute-0 sudo[80961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:41 compute-0 python3.9[80963]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935020.5904458-183-178846736807947/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=805083cb4c57119c8e2c771d0c5426076bb5774c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:41 compute-0 sudo[80961]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:42 compute-0 sudo[81113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubatqycpjylqkyzriieacnytykgxemuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935021.8903449-227-181856143798388/AnsiballZ_file.py'
Dec 05 11:43:42 compute-0 sudo[81113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:42 compute-0 python3.9[81115]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:42 compute-0 sudo[81113]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:42 compute-0 sudo[81265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rubgmqtszbdiepqqgalwcxvvrvvdmdtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935022.5565426-227-164357110382547/AnsiballZ_file.py'
Dec 05 11:43:42 compute-0 sudo[81265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:43 compute-0 python3.9[81267]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:43 compute-0 sudo[81265]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:43 compute-0 sudo[81417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbzeqhjrdzjfjvsuhwdcjzyyehnzdwfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935023.2261763-242-41129542772871/AnsiballZ_stat.py'
Dec 05 11:43:43 compute-0 sudo[81417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:43 compute-0 python3.9[81419]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:43 compute-0 sudo[81417]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:44 compute-0 sudo[81540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwkasnjfkbjtpvewnkpvhikpoptrsuxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935023.2261763-242-41129542772871/AnsiballZ_copy.py'
Dec 05 11:43:44 compute-0 sudo[81540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:44 compute-0 python3.9[81542]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935023.2261763-242-41129542772871/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=57d2215fb64641d772369ce08602177c050bbe39 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:44 compute-0 sudo[81540]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:44 compute-0 sudo[81692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofrwauceeggmlusvaodprplcvxgbwnou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935024.3339856-242-59466079919143/AnsiballZ_stat.py'
Dec 05 11:43:44 compute-0 sudo[81692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:44 compute-0 python3.9[81694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:44 compute-0 sudo[81692]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:45 compute-0 sudo[81815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqiipchchkkbtzpzxlsaxyaqsgfnnmso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935024.3339856-242-59466079919143/AnsiballZ_copy.py'
Dec 05 11:43:45 compute-0 sudo[81815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:45 compute-0 python3.9[81817]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935024.3339856-242-59466079919143/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=966f06ea34331b21eb0c32c6650a16517e61449e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:45 compute-0 sudo[81815]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:45 compute-0 sudo[81967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlemiufsytmqbawiypjnsenvntkatubw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935025.4169595-242-167977443450319/AnsiballZ_stat.py'
Dec 05 11:43:45 compute-0 sudo[81967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:45 compute-0 python3.9[81969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:45 compute-0 sudo[81967]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:46 compute-0 sudo[82090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeaqnansqkznpltlfjngerbgkcjmuibv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935025.4169595-242-167977443450319/AnsiballZ_copy.py'
Dec 05 11:43:46 compute-0 sudo[82090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:46 compute-0 python3.9[82092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935025.4169595-242-167977443450319/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=8a3533d254239bf4ebc081bbaae4dd07fea0d059 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:46 compute-0 sudo[82090]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:47 compute-0 sudo[82244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nstqljtrnifhlegoqojwotoyzzhxhbfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935027.2210963-302-205734158441826/AnsiballZ_file.py'
Dec 05 11:43:47 compute-0 sudo[82244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:47 compute-0 python3.9[82246]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:47 compute-0 sudo[82244]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:48 compute-0 sudo[82396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phrivlomuicmlbquyrkqivfqcnbxrkia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935027.8516166-310-240948883901478/AnsiballZ_stat.py'
Dec 05 11:43:48 compute-0 sudo[82396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:48 compute-0 python3.9[82398]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:48 compute-0 sudo[82396]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:48 compute-0 sudo[82519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilnrkjxopkreohfvqdvmzsffbjncicgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935027.8516166-310-240948883901478/AnsiballZ_copy.py'
Dec 05 11:43:48 compute-0 sudo[82519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:48 compute-0 python3.9[82521]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935027.8516166-310-240948883901478/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:48 compute-0 sudo[82519]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:49 compute-0 sshd-session[82117]: Connection reset by authenticating user root 45.135.232.92 port 24196 [preauth]
Dec 05 11:43:49 compute-0 sudo[82672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffvgfenudxnbhkihmgtqqwdgaknzqrgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935029.1155257-326-156645433397783/AnsiballZ_file.py'
Dec 05 11:43:49 compute-0 sudo[82672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:49 compute-0 python3.9[82674]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:49 compute-0 sudo[82672]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:50 compute-0 sudo[82825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnheiwtkylmvezrrlqahyaierpcyajcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935029.7569757-334-115794623404925/AnsiballZ_stat.py'
Dec 05 11:43:50 compute-0 sudo[82825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:50 compute-0 python3.9[82827]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:50 compute-0 sudo[82825]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:50 compute-0 sudo[82948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynmsanzlcpkncaphdyxpsuwuwcdmknuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935029.7569757-334-115794623404925/AnsiballZ_copy.py'
Dec 05 11:43:50 compute-0 sudo[82948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:50 compute-0 python3.9[82950]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935029.7569757-334-115794623404925/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:50 compute-0 sudo[82948]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:51 compute-0 sshd-session[82661]: Invalid user admin from 45.135.232.92 port 24200
Dec 05 11:43:51 compute-0 sudo[83100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyzytwjyfeovdwlgognchqxmdzkbxyhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935031.0570502-350-160391442863701/AnsiballZ_file.py'
Dec 05 11:43:51 compute-0 sudo[83100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:51 compute-0 sshd-session[82661]: Connection reset by invalid user admin 45.135.232.92 port 24200 [preauth]
Dec 05 11:43:51 compute-0 python3.9[83102]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:51 compute-0 sudo[83100]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:52 compute-0 sudo[83254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyvscaxqxcilxhglglqdtyhkxunruios ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935031.7262492-358-46087594749467/AnsiballZ_stat.py'
Dec 05 11:43:52 compute-0 sudo[83254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:52 compute-0 python3.9[83256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:52 compute-0 sudo[83254]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:52 compute-0 sudo[83377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywkjlzwzucukdadgonndufmxnyovxevu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935031.7262492-358-46087594749467/AnsiballZ_copy.py'
Dec 05 11:43:52 compute-0 sudo[83377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:52 compute-0 python3.9[83379]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935031.7262492-358-46087594749467/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:52 compute-0 sudo[83377]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:53 compute-0 sudo[83529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dflcjfbxkalcoogcwdrzaabjpwowmtua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935032.9511714-374-19135961956166/AnsiballZ_file.py'
Dec 05 11:43:53 compute-0 sudo[83529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:53 compute-0 sshd-session[83103]: Connection reset by authenticating user root 45.135.232.92 port 24212 [preauth]
Dec 05 11:43:53 compute-0 python3.9[83531]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:53 compute-0 sudo[83529]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:53 compute-0 sudo[83683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugkwxqcvypurhemecpxffmujfzmkrvfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935033.6556215-382-141298749513856/AnsiballZ_stat.py'
Dec 05 11:43:53 compute-0 sudo[83683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:54 compute-0 python3.9[83685]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:54 compute-0 sudo[83683]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:54 compute-0 sudo[83806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eefiefkjjcawrfevdzspjmqruyxvvebg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935033.6556215-382-141298749513856/AnsiballZ_copy.py'
Dec 05 11:43:54 compute-0 sudo[83806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:54 compute-0 python3.9[83808]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935033.6556215-382-141298749513856/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:54 compute-0 sudo[83806]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:55 compute-0 sudo[83958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxaykgherbwdbxsoocgmojpjbctuxqgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935035.0078936-398-114033867420216/AnsiballZ_file.py'
Dec 05 11:43:55 compute-0 sudo[83958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:55 compute-0 sshd-session[83532]: Invalid user user from 45.135.232.92 port 24222
Dec 05 11:43:55 compute-0 python3.9[83960]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:55 compute-0 sudo[83958]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:55 compute-0 chronyd[65133]: Selected source 138.197.135.239 (pool.ntp.org)
Dec 05 11:43:55 compute-0 sshd-session[83532]: Connection reset by invalid user user 45.135.232.92 port 24222 [preauth]
Dec 05 11:43:56 compute-0 sudo[84110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbavwaqxekkqwuhzaulavwffjmgieivf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935035.7788403-406-177490227712518/AnsiballZ_stat.py'
Dec 05 11:43:56 compute-0 sudo[84110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:56 compute-0 python3.9[84112]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:56 compute-0 sudo[84110]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:56 compute-0 sudo[84235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezwwrmevterebvbutqnactgdzhqojeuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935035.7788403-406-177490227712518/AnsiballZ_copy.py'
Dec 05 11:43:56 compute-0 sudo[84235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:56 compute-0 python3.9[84237]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935035.7788403-406-177490227712518/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:56 compute-0 sudo[84235]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:57 compute-0 sudo[84387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vthkxfowrdpjgiilspzsejrfmcbsytld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935037.143141-422-274444310953755/AnsiballZ_file.py'
Dec 05 11:43:57 compute-0 sudo[84387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:57 compute-0 python3.9[84389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:57 compute-0 sudo[84387]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:58 compute-0 sudo[84539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkicblfewkkdxyjmdokencvznxsinrja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935037.797954-430-32997905733313/AnsiballZ_stat.py'
Dec 05 11:43:58 compute-0 sudo[84539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:58 compute-0 python3.9[84541]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:43:58 compute-0 sudo[84539]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:58 compute-0 sshd-session[84113]: Connection reset by authenticating user root 45.135.232.92 port 42876 [preauth]
Dec 05 11:43:58 compute-0 sudo[84662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogcruqujgfmyhphhvwfvhkuxmtovjmel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935037.797954-430-32997905733313/AnsiballZ_copy.py'
Dec 05 11:43:58 compute-0 sudo[84662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:58 compute-0 python3.9[84664]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935037.797954-430-32997905733313/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:43:59 compute-0 sudo[84662]: pam_unix(sudo:session): session closed for user root
Dec 05 11:43:59 compute-0 sudo[84814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxlqsgviejqmjxcyyyxnmxgqqrqdybyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935039.2005978-446-198050307070600/AnsiballZ_file.py'
Dec 05 11:43:59 compute-0 sudo[84814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:43:59 compute-0 python3.9[84816]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:43:59 compute-0 sudo[84814]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:00 compute-0 sudo[84966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdfqwywvqljbsrcendzhicsofrwocmzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935039.968499-454-281143613470509/AnsiballZ_stat.py'
Dec 05 11:44:00 compute-0 sudo[84966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:00 compute-0 python3.9[84968]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:00 compute-0 sudo[84966]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:00 compute-0 sudo[85089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjvfrlfaqqbxnefkhkuivemtvpicaafk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935039.968499-454-281143613470509/AnsiballZ_copy.py'
Dec 05 11:44:00 compute-0 sudo[85089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:01 compute-0 python3.9[85091]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935039.968499-454-281143613470509/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:01 compute-0 sudo[85089]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:01 compute-0 sshd-session[77422]: Connection closed by 192.168.122.30 port 33422
Dec 05 11:44:01 compute-0 sshd-session[77419]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:44:01 compute-0 systemd-logind[792]: Session 18 logged out. Waiting for processes to exit.
Dec 05 11:44:01 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Dec 05 11:44:01 compute-0 systemd[1]: session-18.scope: Consumed 29.093s CPU time.
Dec 05 11:44:01 compute-0 systemd-logind[792]: Removed session 18.
Dec 05 11:44:06 compute-0 sshd-session[85116]: Accepted publickey for zuul from 192.168.122.30 port 46464 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:44:06 compute-0 systemd-logind[792]: New session 19 of user zuul.
Dec 05 11:44:06 compute-0 systemd[1]: Started Session 19 of User zuul.
Dec 05 11:44:06 compute-0 sshd-session[85116]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:44:07 compute-0 python3.9[85269]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:44:08 compute-0 sudo[85423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzgwowvbspkejsypmtsqdgkrioshilks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935048.1963387-34-238294338850160/AnsiballZ_file.py'
Dec 05 11:44:08 compute-0 sudo[85423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:08 compute-0 python3.9[85425]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:44:08 compute-0 sudo[85423]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:09 compute-0 sudo[85575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufinodavbyoeolndvtkesejukfmsatrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935049.0244942-34-146332554205172/AnsiballZ_file.py'
Dec 05 11:44:09 compute-0 sudo[85575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:09 compute-0 python3.9[85577]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:44:09 compute-0 sudo[85575]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:10 compute-0 python3.9[85727]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:44:10 compute-0 sudo[85877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzwumcgtfblmgtkkfcozmysdzjgfyhnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935050.566823-57-139531741977763/AnsiballZ_seboolean.py'
Dec 05 11:44:10 compute-0 sudo[85877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:11 compute-0 python3.9[85879]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 05 11:44:12 compute-0 sudo[85877]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:13 compute-0 sudo[86033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zekosxuiuzvillsjpogaeqjgrcffvfpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935052.8820953-67-232116379039315/AnsiballZ_setup.py'
Dec 05 11:44:13 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 05 11:44:13 compute-0 sudo[86033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:13 compute-0 python3.9[86035]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:44:13 compute-0 sudo[86033]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:14 compute-0 sudo[86117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccoivhtsaexgnkxdectuzcdlcvbsuyaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935052.8820953-67-232116379039315/AnsiballZ_dnf.py'
Dec 05 11:44:14 compute-0 sudo[86117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:14 compute-0 python3.9[86119]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:44:15 compute-0 sudo[86117]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:16 compute-0 sudo[86270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdcmwvxyjmwjgbcapvkodlllpdxupqgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935055.696496-79-66144203956718/AnsiballZ_systemd.py'
Dec 05 11:44:16 compute-0 sudo[86270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:16 compute-0 python3.9[86272]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 11:44:16 compute-0 sudo[86270]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:17 compute-0 sudo[86425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmtdonhqzcdarqnnuoufcuiwkblmhakn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935056.8261516-87-80032399592509/AnsiballZ_edpm_nftables_snippet.py'
Dec 05 11:44:17 compute-0 sudo[86425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:17 compute-0 python3[86427]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 05 11:44:17 compute-0 sudo[86425]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:17 compute-0 sudo[86577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytveyxtgmkbftdayrxyqhumbbhhnsnqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935057.7350833-96-164372199270902/AnsiballZ_file.py'
Dec 05 11:44:17 compute-0 sudo[86577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:18 compute-0 python3.9[86579]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:18 compute-0 sudo[86577]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:18 compute-0 sudo[86729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfiblqmqsnwgqzjnokmjxwizdrunqgpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935058.3390257-104-184924595214542/AnsiballZ_stat.py'
Dec 05 11:44:18 compute-0 sudo[86729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:18 compute-0 python3.9[86731]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:18 compute-0 sudo[86729]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:19 compute-0 sudo[86807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omdatyqbsuxryyvakhikkvwfqygcduvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935058.3390257-104-184924595214542/AnsiballZ_file.py'
Dec 05 11:44:19 compute-0 sudo[86807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:19 compute-0 python3.9[86809]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:19 compute-0 sudo[86807]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:19 compute-0 sudo[86959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjkocmsulpbwyjautxngtujbkcumlsmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935059.5865605-116-68969065658895/AnsiballZ_stat.py'
Dec 05 11:44:19 compute-0 sudo[86959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:20 compute-0 python3.9[86961]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:20 compute-0 sudo[86959]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:20 compute-0 sudo[87037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqilndsiijelsmfponzkmohpdjlwmnjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935059.5865605-116-68969065658895/AnsiballZ_file.py'
Dec 05 11:44:20 compute-0 sudo[87037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:20 compute-0 python3.9[87039]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.g67b295g recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:20 compute-0 sudo[87037]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:21 compute-0 sudo[87189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhrzpycnnvnmkshecufcgwewvydpbfdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935060.7553334-128-229528167595569/AnsiballZ_stat.py'
Dec 05 11:44:21 compute-0 sudo[87189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:21 compute-0 python3.9[87191]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:21 compute-0 sudo[87189]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:21 compute-0 sudo[87267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnhvauoohkgjubknihnqcnnaytetguzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935060.7553334-128-229528167595569/AnsiballZ_file.py'
Dec 05 11:44:21 compute-0 sudo[87267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:21 compute-0 python3.9[87269]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:21 compute-0 sudo[87267]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:22 compute-0 sudo[87419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jumrleqigaqekqbwsgwxeuloeuwjgxnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935061.84193-141-254094237071930/AnsiballZ_command.py'
Dec 05 11:44:22 compute-0 sudo[87419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:22 compute-0 python3.9[87421]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:44:22 compute-0 sudo[87419]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:23 compute-0 sudo[87572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-queeoqpdikfeatqtbauljwgstzdouwwl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935062.6859608-149-69175229128987/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 11:44:23 compute-0 sudo[87572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:23 compute-0 python3[87574]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 11:44:23 compute-0 sudo[87572]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:23 compute-0 sudo[87724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaqhhwcibntldbmoqoiczivkgeyrjwck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935063.4905515-157-124107120358979/AnsiballZ_stat.py'
Dec 05 11:44:23 compute-0 sudo[87724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:24 compute-0 python3.9[87726]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:24 compute-0 sudo[87724]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:24 compute-0 sudo[87849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkulqgfylohtmjwawipbpfkynomhbkid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935063.4905515-157-124107120358979/AnsiballZ_copy.py'
Dec 05 11:44:24 compute-0 sudo[87849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:24 compute-0 python3.9[87851]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935063.4905515-157-124107120358979/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:24 compute-0 sudo[87849]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:25 compute-0 sudo[88001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whkoctlsmtoddoqogsrcfkcvjzraayhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935064.9747233-172-142704256555088/AnsiballZ_stat.py'
Dec 05 11:44:25 compute-0 sudo[88001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:25 compute-0 python3.9[88003]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:25 compute-0 sudo[88001]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:25 compute-0 sudo[88126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psvfdbavhsczewhregxuatbnyefaynzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935064.9747233-172-142704256555088/AnsiballZ_copy.py'
Dec 05 11:44:25 compute-0 sudo[88126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:26 compute-0 python3.9[88128]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935064.9747233-172-142704256555088/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:26 compute-0 sudo[88126]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:26 compute-0 sudo[88278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogaykrxplvzofhahnhyibjdznkjyjihh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935066.3137581-187-279013907552183/AnsiballZ_stat.py'
Dec 05 11:44:26 compute-0 sudo[88278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:26 compute-0 python3.9[88280]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:26 compute-0 sudo[88278]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:27 compute-0 sudo[88403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvgolqfexbkgyqdizpakvkzonqbakyip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935066.3137581-187-279013907552183/AnsiballZ_copy.py'
Dec 05 11:44:27 compute-0 sudo[88403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:27 compute-0 python3.9[88405]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935066.3137581-187-279013907552183/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:27 compute-0 sudo[88403]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:27 compute-0 sudo[88555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otqnynzmxvfxwpqxbnvfgmopxcbawnmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935067.5185306-202-69376180708121/AnsiballZ_stat.py'
Dec 05 11:44:27 compute-0 sudo[88555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:28 compute-0 python3.9[88557]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:28 compute-0 sudo[88555]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:28 compute-0 sudo[88680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qphyajauxtjfcsdmwydhqotnwfxhavyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935067.5185306-202-69376180708121/AnsiballZ_copy.py'
Dec 05 11:44:28 compute-0 sudo[88680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:28 compute-0 python3.9[88682]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935067.5185306-202-69376180708121/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:28 compute-0 sudo[88680]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:29 compute-0 sudo[88832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdlgzqjjqouxquasvmqmyhvcweszvnoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935068.7542396-217-212900420981031/AnsiballZ_stat.py'
Dec 05 11:44:29 compute-0 sudo[88832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:29 compute-0 python3.9[88834]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:29 compute-0 sudo[88832]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:29 compute-0 sudo[88957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvskabygjgjauybcumpjcvludfupwwij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935068.7542396-217-212900420981031/AnsiballZ_copy.py'
Dec 05 11:44:29 compute-0 sudo[88957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:30 compute-0 python3.9[88959]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935068.7542396-217-212900420981031/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:30 compute-0 sudo[88957]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:30 compute-0 sudo[89109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwosftcztgsxmvgwszazpotgqlkxghnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935070.424365-232-202272300141770/AnsiballZ_file.py'
Dec 05 11:44:30 compute-0 sudo[89109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:30 compute-0 python3.9[89111]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:30 compute-0 sudo[89109]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:31 compute-0 sudo[89261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsrwbjmsvdcshjliibkwszkopjkaiome ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935071.0371668-240-197246082512207/AnsiballZ_command.py'
Dec 05 11:44:31 compute-0 sudo[89261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:31 compute-0 python3.9[89263]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:44:31 compute-0 sudo[89261]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:32 compute-0 sudo[89416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsfpzdrcacnlcghudbgzcmsclcrqfras ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935071.7021313-248-74496384732219/AnsiballZ_blockinfile.py'
Dec 05 11:44:32 compute-0 sudo[89416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:32 compute-0 python3.9[89418]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:32 compute-0 sudo[89416]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:32 compute-0 sudo[89568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejgylrhuzqxjtkkkmhypfmchoiqpkwcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935072.615094-257-214669194327753/AnsiballZ_command.py'
Dec 05 11:44:32 compute-0 sudo[89568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:33 compute-0 python3.9[89570]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:44:33 compute-0 sudo[89568]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:33 compute-0 sudo[89721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzcbvuwgiwvisklzvexmmjcxahmcwpoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935073.2765706-265-221303412426021/AnsiballZ_stat.py'
Dec 05 11:44:33 compute-0 sudo[89721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:33 compute-0 python3.9[89723]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:44:33 compute-0 sudo[89721]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:34 compute-0 sudo[89875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sexanfqezdflygtkfwogbjpgoxvzcjcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935073.9348233-273-48027198086965/AnsiballZ_command.py'
Dec 05 11:44:34 compute-0 sudo[89875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:34 compute-0 python3.9[89877]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:44:34 compute-0 sudo[89875]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:34 compute-0 sudo[90030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqlrpgqpudchkjwktwlyjnbhadljrksk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935074.6217442-281-148026511523173/AnsiballZ_file.py'
Dec 05 11:44:34 compute-0 sudo[90030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:35 compute-0 python3.9[90032]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:35 compute-0 sudo[90030]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:36 compute-0 python3.9[90182]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:44:37 compute-0 sudo[90333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otqqketimujvbmdtzbbnknczmbvkfxde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935076.7027874-321-205157769922306/AnsiballZ_command.py'
Dec 05 11:44:37 compute-0 sudo[90333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:37 compute-0 python3.9[90335]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:44:37 compute-0 ovs-vsctl[90336]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 05 11:44:37 compute-0 sudo[90333]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:37 compute-0 sudo[90486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hepqsseqflbswyqabgblhmuadtrdpqrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935077.5461247-330-271990472993532/AnsiballZ_command.py'
Dec 05 11:44:37 compute-0 sudo[90486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:38 compute-0 python3.9[90488]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:44:38 compute-0 sudo[90486]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:38 compute-0 sudo[90641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amhviujyatjmusejmhmffyuuhapvnahi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935078.1952608-338-241611597238261/AnsiballZ_command.py'
Dec 05 11:44:38 compute-0 sudo[90641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:38 compute-0 python3.9[90643]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:44:38 compute-0 ovs-vsctl[90644]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 05 11:44:38 compute-0 sudo[90641]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:39 compute-0 python3.9[90794]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:44:39 compute-0 sudo[90946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfyfejqdbprhchhiaudxkgflywaqjluw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935079.6822782-355-17418062147876/AnsiballZ_file.py'
Dec 05 11:44:39 compute-0 sudo[90946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:40 compute-0 python3.9[90948]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:44:40 compute-0 sudo[90946]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:40 compute-0 sudo[91098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzaidsnmnxuxqfizruiyqhsqhefukywh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935080.4030104-363-16918494668465/AnsiballZ_stat.py'
Dec 05 11:44:40 compute-0 sudo[91098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:40 compute-0 python3.9[91100]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:40 compute-0 sudo[91098]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:41 compute-0 sudo[91176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvzeogbvathshtaisyndierikwigzhtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935080.4030104-363-16918494668465/AnsiballZ_file.py'
Dec 05 11:44:41 compute-0 sudo[91176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:41 compute-0 python3.9[91178]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:44:41 compute-0 sudo[91176]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:41 compute-0 sudo[91329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmxepfcmrfprvxbjdakjzflbodvsawbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935081.532445-363-22520415259578/AnsiballZ_stat.py'
Dec 05 11:44:41 compute-0 sudo[91329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:41 compute-0 sshd-session[91302]: Connection closed by 43.225.159.111 port 52892
Dec 05 11:44:42 compute-0 python3.9[91331]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:42 compute-0 sudo[91329]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:42 compute-0 sudo[91407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuenidomhwlutrbewloxrcbaaitsohnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935081.532445-363-22520415259578/AnsiballZ_file.py'
Dec 05 11:44:42 compute-0 sudo[91407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:42 compute-0 python3.9[91409]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:44:42 compute-0 sudo[91407]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:42 compute-0 sudo[91559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofhtpvdrzgmwfoqwffgdmixfbowzwico ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935082.706282-386-29170403104987/AnsiballZ_file.py'
Dec 05 11:44:43 compute-0 sudo[91559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:43 compute-0 python3.9[91561]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:43 compute-0 sudo[91559]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:43 compute-0 sudo[91711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnzpxrbxhsntvajlteqfecdzyvwbuciu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935083.3657184-394-74860955572919/AnsiballZ_stat.py'
Dec 05 11:44:43 compute-0 sudo[91711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:43 compute-0 python3.9[91713]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:43 compute-0 sudo[91711]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:44 compute-0 sudo[91789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tohmyfmmaxqdqgfeftxrnfzaqazfkpdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935083.3657184-394-74860955572919/AnsiballZ_file.py'
Dec 05 11:44:44 compute-0 sudo[91789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:44 compute-0 python3.9[91791]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:44 compute-0 sudo[91789]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:44 compute-0 sudo[91941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lojwmzdfchodupxgfamyfuqffjvdbosf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935084.4581656-406-218618187963567/AnsiballZ_stat.py'
Dec 05 11:44:44 compute-0 sudo[91941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:44 compute-0 python3.9[91943]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:44 compute-0 sudo[91941]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:45 compute-0 sudo[92019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hflcfbamhtlqllhugaaupvnfuosfzoma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935084.4581656-406-218618187963567/AnsiballZ_file.py'
Dec 05 11:44:45 compute-0 sudo[92019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:45 compute-0 python3.9[92021]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:45 compute-0 sudo[92019]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:45 compute-0 sudo[92171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzpiesesppqbpgxwfkyeuedzwuctwhea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935085.5386307-418-162600123814613/AnsiballZ_systemd.py'
Dec 05 11:44:45 compute-0 sudo[92171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:46 compute-0 python3.9[92173]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:44:46 compute-0 systemd[1]: Reloading.
Dec 05 11:44:46 compute-0 systemd-rc-local-generator[92199]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:44:46 compute-0 systemd-sysv-generator[92203]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:44:46 compute-0 sudo[92171]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:46 compute-0 sudo[92359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqerzhqdmponuwzekgacbgmeitkyfawe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935086.6902502-426-78346777656470/AnsiballZ_stat.py'
Dec 05 11:44:46 compute-0 sudo[92359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:47 compute-0 python3.9[92361]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:47 compute-0 sudo[92359]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:47 compute-0 sudo[92437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbpblfmnejxpesnwlxrwtzoldsztyggd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935086.6902502-426-78346777656470/AnsiballZ_file.py'
Dec 05 11:44:47 compute-0 sudo[92437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:47 compute-0 python3.9[92439]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:47 compute-0 sudo[92437]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:48 compute-0 sudo[92589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwolzsoarjoutxeyfxlhcrhlspzmjqsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935088.167063-438-190298281830765/AnsiballZ_stat.py'
Dec 05 11:44:48 compute-0 sudo[92589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:48 compute-0 python3.9[92591]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:48 compute-0 sudo[92589]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:48 compute-0 sudo[92667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiwxqvytglqevvlldnxfohafysadhvnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935088.167063-438-190298281830765/AnsiballZ_file.py'
Dec 05 11:44:48 compute-0 sudo[92667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:49 compute-0 python3.9[92669]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:49 compute-0 sudo[92667]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:49 compute-0 sudo[92819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jppqldfhnxtjewcmofiydtuqdyjcpqsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935089.3639967-450-172984691691700/AnsiballZ_systemd.py'
Dec 05 11:44:49 compute-0 sudo[92819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:50 compute-0 python3.9[92821]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:44:50 compute-0 systemd[1]: Reloading.
Dec 05 11:44:50 compute-0 systemd-rc-local-generator[92848]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:44:50 compute-0 systemd-sysv-generator[92852]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:44:50 compute-0 systemd[1]: Starting Create netns directory...
Dec 05 11:44:50 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 11:44:50 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 11:44:50 compute-0 systemd[1]: Finished Create netns directory.
Dec 05 11:44:50 compute-0 sudo[92819]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:50 compute-0 sudo[93012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrkwrmpbbltfpnazjagujwpqtflmlgro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935090.5832155-460-47916251093998/AnsiballZ_file.py'
Dec 05 11:44:50 compute-0 sudo[93012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:51 compute-0 python3.9[93014]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:44:51 compute-0 sudo[93012]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:51 compute-0 sudo[93164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtkihpueegbytjuqchzfxkkqyrlfrgue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935091.2254436-468-69485999255828/AnsiballZ_stat.py'
Dec 05 11:44:51 compute-0 sudo[93164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:51 compute-0 python3.9[93166]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:51 compute-0 sudo[93164]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:52 compute-0 sudo[93287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyvamvvzgguunqjdmmmlxjcpqbmjgjbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935091.2254436-468-69485999255828/AnsiballZ_copy.py'
Dec 05 11:44:52 compute-0 sudo[93287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:52 compute-0 python3.9[93289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935091.2254436-468-69485999255828/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:44:52 compute-0 sudo[93287]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:52 compute-0 sudo[93439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olwkyqdeqkquxcegxxdnwldewcquukdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935092.6082544-485-234866127724825/AnsiballZ_file.py'
Dec 05 11:44:52 compute-0 sudo[93439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:53 compute-0 python3.9[93441]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:44:53 compute-0 sudo[93439]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:53 compute-0 sudo[93591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bitqciirqsaiyciitxehcvizfzyveoaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935093.3306-493-210654007584453/AnsiballZ_stat.py'
Dec 05 11:44:53 compute-0 sudo[93591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:53 compute-0 python3.9[93593]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:44:53 compute-0 sudo[93591]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:54 compute-0 sudo[93714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geaqgzcxydtoypyyjebniwjgnrfbdmqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935093.3306-493-210654007584453/AnsiballZ_copy.py'
Dec 05 11:44:54 compute-0 sudo[93714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:54 compute-0 python3.9[93716]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935093.3306-493-210654007584453/.source.json _original_basename=.oqg518va follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:54 compute-0 sudo[93714]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:54 compute-0 sudo[93866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egncossozxqhecebegkabltzxwnsgyks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935094.4851234-508-10978982487978/AnsiballZ_file.py'
Dec 05 11:44:54 compute-0 sudo[93866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:55 compute-0 python3.9[93868]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:44:55 compute-0 sudo[93866]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:55 compute-0 sudo[94018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmvrnqnowqknlezjzogyqaqoxqvjducm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935095.2207046-516-69368011349845/AnsiballZ_stat.py'
Dec 05 11:44:55 compute-0 sudo[94018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:55 compute-0 sudo[94018]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:56 compute-0 sudo[94141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqyddexfptglfkspfsetjysgaeadswmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935095.2207046-516-69368011349845/AnsiballZ_copy.py'
Dec 05 11:44:56 compute-0 sudo[94141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:56 compute-0 sudo[94141]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:57 compute-0 sudo[94293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkwmhyqczwfdbkzynbafvrstktkhgrkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935096.62153-533-64799315491973/AnsiballZ_container_config_data.py'
Dec 05 11:44:57 compute-0 sudo[94293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:57 compute-0 python3.9[94295]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 05 11:44:57 compute-0 sudo[94293]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:57 compute-0 sudo[94445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlevzthmlqbrgnrhxqnkjapbguuhflxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935097.5290954-542-161140297068394/AnsiballZ_container_config_hash.py'
Dec 05 11:44:57 compute-0 sudo[94445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:58 compute-0 python3.9[94447]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 11:44:58 compute-0 sudo[94445]: pam_unix(sudo:session): session closed for user root
Dec 05 11:44:58 compute-0 sudo[94597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdgsvwakehnvdiikazgmdmtvyanyifne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935098.4813554-551-41932080640277/AnsiballZ_podman_container_info.py'
Dec 05 11:44:58 compute-0 sudo[94597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:44:59 compute-0 python3.9[94599]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 11:44:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:44:59 compute-0 sudo[94597]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:00 compute-0 sudo[94759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avxjsspihncckgdjorysrnjrgdzfamtx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935099.717721-564-157538230071508/AnsiballZ_edpm_container_manage.py'
Dec 05 11:45:00 compute-0 sudo[94759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:00 compute-0 python3[94761]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 11:45:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:45:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:45:00 compute-0 podman[94796]: 2025-12-05 11:45:00.694457544 +0000 UTC m=+0.054561807 container create 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 05 11:45:00 compute-0 podman[94796]: 2025-12-05 11:45:00.667372357 +0000 UTC m=+0.027476600 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 05 11:45:00 compute-0 python3[94761]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 05 11:45:00 compute-0 sudo[94759]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:01 compute-0 sudo[94984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouwqkryncehbxsbacisvmfugfinixsdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935100.960529-572-255435814279161/AnsiballZ_stat.py'
Dec 05 11:45:01 compute-0 sudo[94984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:01 compute-0 python3.9[94986]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:45:01 compute-0 sudo[94984]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 11:45:02 compute-0 sudo[95138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htxqwugdotceqfhjppavgersnozhgudx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935101.6767263-581-249809704932346/AnsiballZ_file.py'
Dec 05 11:45:02 compute-0 sudo[95138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:02 compute-0 python3.9[95140]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:02 compute-0 sudo[95138]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:02 compute-0 sudo[95214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbmujtleocwfigqbshhmonsqurzvvrrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935101.6767263-581-249809704932346/AnsiballZ_stat.py'
Dec 05 11:45:02 compute-0 sudo[95214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:02 compute-0 python3.9[95216]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:45:02 compute-0 sudo[95214]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:03 compute-0 sudo[95365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeozsmahwyhcobpxjqhpbgvujgzlhvfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935102.7339227-581-30475193930869/AnsiballZ_copy.py'
Dec 05 11:45:03 compute-0 sudo[95365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:03 compute-0 python3.9[95367]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935102.7339227-581-30475193930869/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:03 compute-0 sudo[95365]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:03 compute-0 sudo[95441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-calxzztkxzeyjljglwinngskluzlckcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935102.7339227-581-30475193930869/AnsiballZ_systemd.py'
Dec 05 11:45:03 compute-0 sudo[95441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:03 compute-0 python3.9[95443]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:45:03 compute-0 systemd[1]: Reloading.
Dec 05 11:45:04 compute-0 systemd-rc-local-generator[95471]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:45:04 compute-0 systemd-sysv-generator[95474]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:45:04 compute-0 sudo[95441]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:04 compute-0 sudo[95552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrwqgyqrmckjwgkblvxfvuaxcbykenfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935102.7339227-581-30475193930869/AnsiballZ_systemd.py'
Dec 05 11:45:04 compute-0 sudo[95552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:04 compute-0 python3.9[95554]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:45:04 compute-0 systemd[1]: Reloading.
Dec 05 11:45:04 compute-0 systemd-rc-local-generator[95584]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:45:04 compute-0 systemd-sysv-generator[95587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:45:05 compute-0 systemd[1]: Starting ovn_controller container...
Dec 05 11:45:05 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 05 11:45:05 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:45:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10948cb30dd006528842e9fd92fe4b809ecc45dabd60697bd859f375e9a51494/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 05 11:45:05 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.
Dec 05 11:45:05 compute-0 podman[95595]: 2025-12-05 11:45:05.702916139 +0000 UTC m=+0.631561113 container init 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 11:45:05 compute-0 ovn_controller[95610]: + sudo -E kolla_set_configs
Dec 05 11:45:05 compute-0 podman[95595]: 2025-12-05 11:45:05.736542984 +0000 UTC m=+0.665187918 container start 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:45:05 compute-0 edpm-start-podman-container[95595]: ovn_controller
Dec 05 11:45:05 compute-0 systemd[1]: Created slice User Slice of UID 0.
Dec 05 11:45:05 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 05 11:45:05 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 05 11:45:05 compute-0 systemd[1]: Starting User Manager for UID 0...
Dec 05 11:45:05 compute-0 edpm-start-podman-container[95594]: Creating additional drop-in dependency for "ovn_controller" (6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698)
Dec 05 11:45:05 compute-0 systemd[95652]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 05 11:45:05 compute-0 podman[95616]: 2025-12-05 11:45:05.818176387 +0000 UTC m=+0.070562106 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 05 11:45:05 compute-0 systemd[1]: Reloading.
Dec 05 11:45:05 compute-0 systemd-sysv-generator[95700]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:45:05 compute-0 systemd[95652]: Queued start job for default target Main User Target.
Dec 05 11:45:05 compute-0 systemd-rc-local-generator[95697]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:45:05 compute-0 systemd[95652]: Created slice User Application Slice.
Dec 05 11:45:05 compute-0 systemd[95652]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 05 11:45:05 compute-0 systemd[95652]: Started Daily Cleanup of User's Temporary Directories.
Dec 05 11:45:05 compute-0 systemd[95652]: Reached target Paths.
Dec 05 11:45:05 compute-0 systemd[95652]: Reached target Timers.
Dec 05 11:45:05 compute-0 systemd[95652]: Starting D-Bus User Message Bus Socket...
Dec 05 11:45:05 compute-0 systemd[95652]: Starting Create User's Volatile Files and Directories...
Dec 05 11:45:05 compute-0 systemd[95652]: Finished Create User's Volatile Files and Directories.
Dec 05 11:45:05 compute-0 systemd[95652]: Listening on D-Bus User Message Bus Socket.
Dec 05 11:45:05 compute-0 systemd[95652]: Reached target Sockets.
Dec 05 11:45:05 compute-0 systemd[95652]: Reached target Basic System.
Dec 05 11:45:05 compute-0 systemd[95652]: Reached target Main User Target.
Dec 05 11:45:05 compute-0 systemd[95652]: Startup finished in 129ms.
Dec 05 11:45:06 compute-0 systemd[1]: Started User Manager for UID 0.
Dec 05 11:45:06 compute-0 systemd[1]: Started ovn_controller container.
Dec 05 11:45:06 compute-0 systemd[1]: 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698-6ce87739bdca08f5.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 11:45:06 compute-0 systemd[1]: 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698-6ce87739bdca08f5.service: Failed with result 'exit-code'.
Dec 05 11:45:06 compute-0 systemd[1]: Started Session c1 of User root.
Dec 05 11:45:06 compute-0 sudo[95552]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:06 compute-0 ovn_controller[95610]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 11:45:06 compute-0 ovn_controller[95610]: INFO:__main__:Validating config file
Dec 05 11:45:06 compute-0 ovn_controller[95610]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 11:45:06 compute-0 ovn_controller[95610]: INFO:__main__:Writing out command to execute
Dec 05 11:45:06 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 05 11:45:06 compute-0 ovn_controller[95610]: ++ cat /run_command
Dec 05 11:45:06 compute-0 ovn_controller[95610]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 05 11:45:06 compute-0 ovn_controller[95610]: + ARGS=
Dec 05 11:45:06 compute-0 ovn_controller[95610]: + sudo kolla_copy_cacerts
Dec 05 11:45:06 compute-0 systemd[1]: Started Session c2 of User root.
Dec 05 11:45:06 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 05 11:45:06 compute-0 ovn_controller[95610]: + [[ ! -n '' ]]
Dec 05 11:45:06 compute-0 ovn_controller[95610]: + . kolla_extend_start
Dec 05 11:45:06 compute-0 ovn_controller[95610]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 05 11:45:06 compute-0 ovn_controller[95610]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 05 11:45:06 compute-0 ovn_controller[95610]: + umask 0022
Dec 05 11:45:06 compute-0 ovn_controller[95610]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00010|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00011|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00013|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec 05 11:45:06 compute-0 NetworkManager[55691]: <info>  [1764935106.2628] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec 05 11:45:06 compute-0 NetworkManager[55691]: <info>  [1764935106.2636] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 05 11:45:06 compute-0 NetworkManager[55691]: <info>  [1764935106.2644] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec 05 11:45:06 compute-0 NetworkManager[55691]: <info>  [1764935106.2648] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec 05 11:45:06 compute-0 NetworkManager[55691]: <info>  [1764935106.2651] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00018|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00019|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 05 11:45:06 compute-0 kernel: br-int: entered promiscuous mode
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00020|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00021|features|INFO|OVS Feature: ct_flush, state: supported
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00022|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 05 11:45:06 compute-0 ovn_controller[95610]: 2025-12-05T11:45:06Z|00023|main|INFO|OVS feature set changed, force recompute.
Dec 05 11:45:06 compute-0 systemd-udevd[95799]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:45:06 compute-0 sudo[95875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjxpxqmkgieoqorrzdcptxwjmkqpeyyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935106.235834-609-222814743673733/AnsiballZ_command.py'
Dec 05 11:45:06 compute-0 sudo[95875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:06 compute-0 python3.9[95877]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:45:06 compute-0 ovs-vsctl[95878]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 05 11:45:06 compute-0 sudo[95875]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:07 compute-0 sudo[96028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptotitnljclrjiqnhdntxyydogxejqha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935106.975413-617-236800695928443/AnsiballZ_command.py'
Dec 05 11:45:07 compute-0 sudo[96028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00024|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00025|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00026|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00028|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00029|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00030|main|INFO|OVS feature set changed, force recompute.
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 11:45:07 compute-0 ovn_controller[95610]: 2025-12-05T11:45:07Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 11:45:07 compute-0 NetworkManager[55691]: <info>  [1764935107.2687] manager: (ovn-1cd8e1-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 05 11:45:07 compute-0 systemd-udevd[95801]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:45:07 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Dec 05 11:45:07 compute-0 NetworkManager[55691]: <info>  [1764935107.2847] device (genev_sys_6081): carrier: link connected
Dec 05 11:45:07 compute-0 NetworkManager[55691]: <info>  [1764935107.2850] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 05 11:45:07 compute-0 python3.9[96030]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:45:07 compute-0 ovs-vsctl[96035]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 05 11:45:07 compute-0 sudo[96028]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:08 compute-0 sudo[96186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvrruyrvtuxxaejdlvjrihcuxrgsvzfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935107.8870642-631-82413483244959/AnsiballZ_command.py'
Dec 05 11:45:08 compute-0 sudo[96186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:08 compute-0 python3.9[96188]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:45:08 compute-0 ovs-vsctl[96189]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 05 11:45:08 compute-0 sudo[96186]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:08 compute-0 sshd-session[85119]: Connection closed by 192.168.122.30 port 46464
Dec 05 11:45:08 compute-0 sshd-session[85116]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:45:08 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Dec 05 11:45:08 compute-0 systemd[1]: session-19.scope: Consumed 45.349s CPU time.
Dec 05 11:45:08 compute-0 systemd-logind[792]: Session 19 logged out. Waiting for processes to exit.
Dec 05 11:45:08 compute-0 systemd-logind[792]: Removed session 19.
Dec 05 11:45:14 compute-0 sshd-session[96214]: Accepted publickey for zuul from 192.168.122.30 port 53570 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:45:14 compute-0 systemd-logind[792]: New session 21 of user zuul.
Dec 05 11:45:14 compute-0 systemd[1]: Started Session 21 of User zuul.
Dec 05 11:45:14 compute-0 sshd-session[96214]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:45:15 compute-0 python3.9[96367]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:45:16 compute-0 systemd[1]: Stopping User Manager for UID 0...
Dec 05 11:45:16 compute-0 systemd[95652]: Activating special unit Exit the Session...
Dec 05 11:45:16 compute-0 systemd[95652]: Stopped target Main User Target.
Dec 05 11:45:16 compute-0 systemd[95652]: Stopped target Basic System.
Dec 05 11:45:16 compute-0 systemd[95652]: Stopped target Paths.
Dec 05 11:45:16 compute-0 systemd[95652]: Stopped target Sockets.
Dec 05 11:45:16 compute-0 systemd[95652]: Stopped target Timers.
Dec 05 11:45:16 compute-0 systemd[95652]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 05 11:45:16 compute-0 systemd[95652]: Closed D-Bus User Message Bus Socket.
Dec 05 11:45:16 compute-0 systemd[95652]: Stopped Create User's Volatile Files and Directories.
Dec 05 11:45:16 compute-0 systemd[95652]: Removed slice User Application Slice.
Dec 05 11:45:16 compute-0 systemd[95652]: Reached target Shutdown.
Dec 05 11:45:16 compute-0 systemd[95652]: Finished Exit the Session.
Dec 05 11:45:16 compute-0 systemd[95652]: Reached target Exit the Session.
Dec 05 11:45:16 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Dec 05 11:45:16 compute-0 systemd[1]: Stopped User Manager for UID 0.
Dec 05 11:45:16 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 05 11:45:16 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 05 11:45:16 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 05 11:45:16 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 05 11:45:16 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Dec 05 11:45:16 compute-0 sudo[96523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrzadapmerlxmcnvynkuwhelqtrvtddu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935115.928844-34-156381672720676/AnsiballZ_file.py'
Dec 05 11:45:16 compute-0 sudo[96523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:16 compute-0 python3.9[96525]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:16 compute-0 sudo[96523]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:17 compute-0 sudo[96675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xecooggamapmyssqapsmyejxutzyflhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935116.9221525-34-104768639141525/AnsiballZ_file.py'
Dec 05 11:45:17 compute-0 sudo[96675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:17 compute-0 python3.9[96677]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:17 compute-0 sudo[96675]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:18 compute-0 sudo[96827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhflfrxhzzckyeyrffzfxhgitwkmldlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935117.7294476-34-79557022193084/AnsiballZ_file.py'
Dec 05 11:45:18 compute-0 sudo[96827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:18 compute-0 python3.9[96829]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:18 compute-0 sudo[96827]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:18 compute-0 sudo[96979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keiqhlhnopssrpckdhzowxtlfqbpixmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935118.414995-34-201444772935386/AnsiballZ_file.py'
Dec 05 11:45:18 compute-0 sudo[96979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:18 compute-0 python3.9[96981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:18 compute-0 sudo[96979]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:19 compute-0 sudo[97131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxsspvqfokotmcyroyzpgntjxueuirzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935119.0604267-34-103978113067041/AnsiballZ_file.py'
Dec 05 11:45:19 compute-0 sudo[97131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:19 compute-0 python3.9[97133]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:19 compute-0 sudo[97131]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:20 compute-0 python3.9[97283]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:45:20 compute-0 sudo[97433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmzwmhmtmbmyqfoajkcsdsrgcptiqjhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935120.5002573-78-143913837894400/AnsiballZ_seboolean.py'
Dec 05 11:45:20 compute-0 sudo[97433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:21 compute-0 python3.9[97435]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 05 11:45:21 compute-0 sudo[97433]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:22 compute-0 python3.9[97585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:23 compute-0 python3.9[97706]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935121.9069054-86-229035617534459/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:23 compute-0 python3.9[97856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:24 compute-0 python3.9[97977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935123.3868709-101-215872605882899/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:25 compute-0 sudo[98127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpehlmnijehdckpimdurrlqcbcrxpqxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935124.7776577-118-262303245737264/AnsiballZ_setup.py'
Dec 05 11:45:25 compute-0 sudo[98127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:25 compute-0 python3.9[98129]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:45:25 compute-0 sudo[98127]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:26 compute-0 sudo[98212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhlprmnbucwxefdtyxffoqlwtpbewkap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935124.7776577-118-262303245737264/AnsiballZ_dnf.py'
Dec 05 11:45:26 compute-0 sudo[98212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:26 compute-0 python3.9[98214]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:45:27 compute-0 sudo[98212]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:28 compute-0 sudo[98365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nikuoiwhrmcxlcdypfjxrnbcvelvdrsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935127.6987672-130-137775785059056/AnsiballZ_systemd.py'
Dec 05 11:45:28 compute-0 sudo[98365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:28 compute-0 python3.9[98367]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 11:45:28 compute-0 sudo[98365]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:29 compute-0 python3.9[98520]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:29 compute-0 python3.9[98641]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935128.8890836-138-85324542870571/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:30 compute-0 python3.9[98791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:31 compute-0 python3.9[98912]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935130.0398328-138-46926094524425/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:32 compute-0 python3.9[99062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:32 compute-0 python3.9[99183]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935131.7976916-182-216134272145339/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:33 compute-0 python3.9[99333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:33 compute-0 python3.9[99454]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935132.9427874-182-70459949224904/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:34 compute-0 python3.9[99604]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:45:35 compute-0 sudo[99756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkjglyulypcwbftfjtmwmkcpswqmgaqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935134.8070397-220-194633444757937/AnsiballZ_file.py'
Dec 05 11:45:35 compute-0 sudo[99756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:35 compute-0 python3.9[99758]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:35 compute-0 sudo[99756]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:35 compute-0 sudo[99908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jycgduiwiwupdfrysnhijwjxjvkqqkkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935135.4672773-228-187932474439564/AnsiballZ_stat.py'
Dec 05 11:45:35 compute-0 sudo[99908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:35 compute-0 python3.9[99910]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:35 compute-0 sudo[99908]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:36 compute-0 sudo[99999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrdwrojldmxinwgeauwicrrjnckwwylu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935135.4672773-228-187932474439564/AnsiballZ_file.py'
Dec 05 11:45:36 compute-0 sudo[99999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:36 compute-0 ovn_controller[95610]: 2025-12-05T11:45:36Z|00031|memory|INFO|16000 kB peak resident set size after 30.0 seconds
Dec 05 11:45:36 compute-0 ovn_controller[95610]: 2025-12-05T11:45:36Z|00032|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Dec 05 11:45:36 compute-0 podman[99960]: 2025-12-05 11:45:36.246268129 +0000 UTC m=+0.104535092 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Dec 05 11:45:36 compute-0 python3.9[100005]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:36 compute-0 sudo[99999]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:36 compute-0 sudo[100162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slobviyfrsdmlzeyjjauanrscbtfypfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935136.550125-228-19776218604304/AnsiballZ_stat.py'
Dec 05 11:45:36 compute-0 sudo[100162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:37 compute-0 python3.9[100164]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:37 compute-0 sudo[100162]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:37 compute-0 sudo[100240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrghszrzdcpkwfqzdhgxvbholoercgnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935136.550125-228-19776218604304/AnsiballZ_file.py'
Dec 05 11:45:37 compute-0 sudo[100240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:37 compute-0 python3.9[100242]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:37 compute-0 sudo[100240]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:38 compute-0 sudo[100392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvitwgkesaagkvofqonmkcaodprlhtfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935137.7153409-251-225996498861492/AnsiballZ_file.py'
Dec 05 11:45:38 compute-0 sudo[100392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:38 compute-0 python3.9[100394]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:38 compute-0 sudo[100392]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:38 compute-0 sudo[100544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiriuqcztxirinjiariakfsidnriwanf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935138.3905513-259-126836118438579/AnsiballZ_stat.py'
Dec 05 11:45:38 compute-0 sudo[100544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:38 compute-0 python3.9[100546]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:38 compute-0 sudo[100544]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:39 compute-0 sudo[100622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akrqauzgonwiikkumrnesbisckvkbatd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935138.3905513-259-126836118438579/AnsiballZ_file.py'
Dec 05 11:45:39 compute-0 sudo[100622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:39 compute-0 python3.9[100624]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:39 compute-0 sudo[100622]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:39 compute-0 sudo[100774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrbwcatddshkuxczoabljjxoeavuswyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935139.5899184-271-176058438145367/AnsiballZ_stat.py'
Dec 05 11:45:39 compute-0 sudo[100774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:40 compute-0 python3.9[100776]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:40 compute-0 sudo[100774]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:40 compute-0 sudo[100852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bafkgbyiholwlbeqsblfoheiqhsaphfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935139.5899184-271-176058438145367/AnsiballZ_file.py'
Dec 05 11:45:40 compute-0 sudo[100852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:40 compute-0 python3.9[100854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:40 compute-0 sudo[100852]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:40 compute-0 sudo[101004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kttesdmhvrjwypvwmnwemvaiqbgyrjkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935140.6771293-283-193430423207802/AnsiballZ_systemd.py'
Dec 05 11:45:40 compute-0 sudo[101004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:41 compute-0 python3.9[101006]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:45:41 compute-0 systemd[1]: Reloading.
Dec 05 11:45:41 compute-0 systemd-rc-local-generator[101031]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:45:41 compute-0 systemd-sysv-generator[101034]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:45:41 compute-0 sudo[101004]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:42 compute-0 sudo[101194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynzfuhbakgrarzldemnpqdcleeaempsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935141.7047648-291-142980576081986/AnsiballZ_stat.py'
Dec 05 11:45:42 compute-0 sudo[101194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:42 compute-0 python3.9[101196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:42 compute-0 sudo[101194]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:42 compute-0 sudo[101272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omtsgogidvijmobdzwzietldmlhtatux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935141.7047648-291-142980576081986/AnsiballZ_file.py'
Dec 05 11:45:42 compute-0 sudo[101272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:42 compute-0 python3.9[101274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:42 compute-0 sudo[101272]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:43 compute-0 sudo[101424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abzfjmwcylwsfnnexmhvszlvuohzzvfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935142.8739016-303-38682656181876/AnsiballZ_stat.py'
Dec 05 11:45:43 compute-0 sudo[101424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:43 compute-0 python3.9[101426]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:43 compute-0 sudo[101424]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:43 compute-0 sudo[101502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muzliiphagnawuraeajerszzfvicdsos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935142.8739016-303-38682656181876/AnsiballZ_file.py'
Dec 05 11:45:43 compute-0 sudo[101502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:43 compute-0 python3.9[101504]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:43 compute-0 sudo[101502]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:44 compute-0 sudo[101654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjflavxxlobhjgkejzvgrkmsqplhjhvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935144.0059578-315-179949088049245/AnsiballZ_systemd.py'
Dec 05 11:45:44 compute-0 sudo[101654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:44 compute-0 python3.9[101656]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:45:44 compute-0 systemd[1]: Reloading.
Dec 05 11:45:44 compute-0 systemd-rc-local-generator[101681]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:45:44 compute-0 systemd-sysv-generator[101685]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:45:44 compute-0 systemd[1]: Starting Create netns directory...
Dec 05 11:45:44 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 11:45:44 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 11:45:44 compute-0 systemd[1]: Finished Create netns directory.
Dec 05 11:45:45 compute-0 sudo[101654]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:45 compute-0 sudo[101847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoovynggzsczpmxhuyfbniqfwkkdyypr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935145.292718-325-56003730452844/AnsiballZ_file.py'
Dec 05 11:45:45 compute-0 sudo[101847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:45 compute-0 python3.9[101849]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:45 compute-0 sudo[101847]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:46 compute-0 sudo[101999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvymytcyoowucvpszbqatvmvofozufyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935146.007944-333-192737364057845/AnsiballZ_stat.py'
Dec 05 11:45:46 compute-0 sudo[101999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:46 compute-0 python3.9[102001]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:46 compute-0 sudo[101999]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:46 compute-0 sudo[102122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjtkawxnpeeehulzyqlaepvhecjckvzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935146.007944-333-192737364057845/AnsiballZ_copy.py'
Dec 05 11:45:46 compute-0 sudo[102122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:47 compute-0 python3.9[102124]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935146.007944-333-192737364057845/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:47 compute-0 sudo[102122]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:47 compute-0 sudo[102274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzeuipawxnrmifclqpnsksnigrwgiczi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935147.5509057-350-172563675027270/AnsiballZ_file.py'
Dec 05 11:45:47 compute-0 sudo[102274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:48 compute-0 python3.9[102276]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:45:48 compute-0 sudo[102274]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:49 compute-0 sudo[102427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtqwwlrcjkxfgkzkdwbwgbdqlwiwvdoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935148.3624232-358-257335484211806/AnsiballZ_stat.py'
Dec 05 11:45:49 compute-0 sudo[102427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:49 compute-0 python3.9[102429]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:45:49 compute-0 sudo[102427]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:49 compute-0 sudo[102550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngxwvytvnjvfzdgigxuhgbuuwaoouicj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935148.3624232-358-257335484211806/AnsiballZ_copy.py'
Dec 05 11:45:49 compute-0 sudo[102550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:49 compute-0 python3.9[102552]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935148.3624232-358-257335484211806/.source.json _original_basename=.one4b8ki follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:49 compute-0 sudo[102550]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:50 compute-0 sudo[102702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmcpfxmxxrgrmprihcagmjatjsfhxdrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935150.1228766-373-223457596524949/AnsiballZ_file.py'
Dec 05 11:45:50 compute-0 sudo[102702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:50 compute-0 python3.9[102704]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:50 compute-0 sudo[102702]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:51 compute-0 sudo[102854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozginzwhykojspfyovtqsutcqhivcrpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935150.8978834-381-46132211444713/AnsiballZ_stat.py'
Dec 05 11:45:51 compute-0 sudo[102854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:51 compute-0 sudo[102854]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:51 compute-0 sudo[102977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fljcyacpttpsxlpapguzaqjcbqyzovbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935150.8978834-381-46132211444713/AnsiballZ_copy.py'
Dec 05 11:45:51 compute-0 sudo[102977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:51 compute-0 sudo[102977]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:52 compute-0 sudo[103129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mltiajxzbvfihfoupizlnplihrumxpvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935152.233067-398-83575070665756/AnsiballZ_container_config_data.py'
Dec 05 11:45:52 compute-0 sudo[103129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:52 compute-0 python3.9[103131]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 05 11:45:52 compute-0 sudo[103129]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:53 compute-0 sudo[103281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxzfqgwogkhovtdugeknkkxxluhuztdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935153.1742394-407-30994881195839/AnsiballZ_container_config_hash.py'
Dec 05 11:45:53 compute-0 sudo[103281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:53 compute-0 python3.9[103283]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 11:45:53 compute-0 sudo[103281]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:54 compute-0 sudo[103433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dypwgptdbbbfqkqczbsbpbrwhygocypy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935154.1782117-416-83092795421856/AnsiballZ_podman_container_info.py'
Dec 05 11:45:54 compute-0 sudo[103433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:54 compute-0 python3.9[103435]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 11:45:54 compute-0 sudo[103433]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:55 compute-0 sudo[103611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsxebkegtydusroogxfsxjlfidaivmcm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935155.4967015-429-76572551315743/AnsiballZ_edpm_container_manage.py'
Dec 05 11:45:55 compute-0 sudo[103611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:56 compute-0 python3[103613]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 11:45:56 compute-0 podman[103650]: 2025-12-05 11:45:56.413740844 +0000 UTC m=+0.048166884 container create de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Dec 05 11:45:56 compute-0 podman[103650]: 2025-12-05 11:45:56.390098605 +0000 UTC m=+0.024524675 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 11:45:56 compute-0 python3[103613]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 11:45:56 compute-0 sudo[103611]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:56 compute-0 sudo[103838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzmehnhwdbiauifaeqbdtzlduaigztjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935156.7119865-437-159633213057106/AnsiballZ_stat.py'
Dec 05 11:45:56 compute-0 sudo[103838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:57 compute-0 python3.9[103840]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:45:57 compute-0 sudo[103838]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:57 compute-0 sudo[103992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhtzzlozzizefwcymaixurvrlzjwkxrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935157.4007187-446-90178842482541/AnsiballZ_file.py'
Dec 05 11:45:57 compute-0 sudo[103992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:57 compute-0 python3.9[103994]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:57 compute-0 sudo[103992]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:58 compute-0 sudo[104068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhbbjqghasbuaolrgwvuxewlzvyoixiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935157.4007187-446-90178842482541/AnsiballZ_stat.py'
Dec 05 11:45:58 compute-0 sudo[104068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:58 compute-0 python3.9[104070]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:45:58 compute-0 sudo[104068]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:58 compute-0 sudo[104219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgsfsvkorfuvqowjihlprsbhxarqrvdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935158.3779027-446-33475902378353/AnsiballZ_copy.py'
Dec 05 11:45:58 compute-0 sudo[104219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:59 compute-0 python3.9[104221]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935158.3779027-446-33475902378353/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:45:59 compute-0 sudo[104219]: pam_unix(sudo:session): session closed for user root
Dec 05 11:45:59 compute-0 sudo[104295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpnpozncziwhwbtckcqjmjdnunufzpks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935158.3779027-446-33475902378353/AnsiballZ_systemd.py'
Dec 05 11:45:59 compute-0 sudo[104295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:45:59 compute-0 python3.9[104297]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:45:59 compute-0 systemd[1]: Reloading.
Dec 05 11:45:59 compute-0 systemd-rc-local-generator[104323]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:45:59 compute-0 systemd-sysv-generator[104329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:45:59 compute-0 sudo[104295]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:00 compute-0 sudo[104405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmljuodhbnqqtgpbjjtqxqtooibibnxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935158.3779027-446-33475902378353/AnsiballZ_systemd.py'
Dec 05 11:46:00 compute-0 sudo[104405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:00 compute-0 python3.9[104407]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:46:00 compute-0 systemd[1]: Reloading.
Dec 05 11:46:00 compute-0 systemd-sysv-generator[104442]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:46:00 compute-0 systemd-rc-local-generator[104439]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:46:00 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Dec 05 11:46:01 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:46:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/436161f6e5b01ff201ecbc4abee31e21a170e03160630140dbf046e60a098b2e/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 05 11:46:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/436161f6e5b01ff201ecbc4abee31e21a170e03160630140dbf046e60a098b2e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 11:46:01 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.
Dec 05 11:46:01 compute-0 podman[104450]: 2025-12-05 11:46:01.083328901 +0000 UTC m=+0.177925469 container init de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: + sudo -E kolla_set_configs
Dec 05 11:46:01 compute-0 podman[104450]: 2025-12-05 11:46:01.112791567 +0000 UTC m=+0.207388095 container start de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 05 11:46:01 compute-0 edpm-start-podman-container[104450]: ovn_metadata_agent
Dec 05 11:46:01 compute-0 edpm-start-podman-container[104449]: Creating additional drop-in dependency for "ovn_metadata_agent" (de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc)
Dec 05 11:46:01 compute-0 podman[104472]: 2025-12-05 11:46:01.178331588 +0000 UTC m=+0.052582890 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 11:46:01 compute-0 systemd[1]: Reloading.
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Validating config file
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Copying service configuration files
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Writing out command to execute
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: ++ cat /run_command
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: + CMD=neutron-ovn-metadata-agent
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: + ARGS=
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: + sudo kolla_copy_cacerts
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: + [[ ! -n '' ]]
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: + . kolla_extend_start
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: Running command: 'neutron-ovn-metadata-agent'
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: + umask 0022
Dec 05 11:46:01 compute-0 ovn_metadata_agent[104466]: + exec neutron-ovn-metadata-agent
Dec 05 11:46:01 compute-0 systemd-rc-local-generator[104545]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:46:01 compute-0 systemd-sysv-generator[104548]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:46:01 compute-0 systemd[1]: Started ovn_metadata_agent container.
Dec 05 11:46:01 compute-0 sudo[104405]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:01 compute-0 sshd-session[96217]: Connection closed by 192.168.122.30 port 53570
Dec 05 11:46:01 compute-0 sshd-session[96214]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:46:01 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Dec 05 11:46:01 compute-0 systemd[1]: session-21.scope: Consumed 33.945s CPU time.
Dec 05 11:46:01 compute-0 systemd-logind[792]: Session 21 logged out. Waiting for processes to exit.
Dec 05 11:46:01 compute-0 systemd-logind[792]: Removed session 21.
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.942 104471 INFO neutron.common.config [-] Logging enabled!
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.942 104471 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.943 104471 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.943 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.943 104471 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.943 104471 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.986 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.986 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.994 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.995 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.995 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.995 104471 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 05 11:46:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.995 104471 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.006 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2686fa45-e88c-4058-8865-e810ceb89d95 (UUID: 2686fa45-e88c-4058-8865-e810ceb89d95) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.044 104471 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.045 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.045 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.045 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.049 104471 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.057 104471 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.065 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2686fa45-e88c-4058-8865-e810ceb89d95'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], external_ids={}, name=2686fa45-e88c-4058-8865-e810ceb89d95, nb_cfg_timestamp=1764935115276, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.066 104471 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f189ffbadc0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.067 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.068 104471 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.068 104471 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.068 104471 INFO oslo_service.service [-] Starting 1 workers
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.073 104471 DEBUG oslo_service.service [-] Started child 104579 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.076 104579 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-66947572'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.077 104471 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp3e9pio4y/privsep.sock']
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.100 104579 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.101 104579 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.101 104579 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.104 104579 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.111 104579 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.119 104579 INFO eventlet.wsgi.server [-] (104579) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 05 11:46:03 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.720 104471 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.721 104471 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp3e9pio4y/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.585 104584 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.589 104584 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.592 104584 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.592 104584 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104584
Dec 05 11:46:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.726 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b5586c-ea9a-4246-bbad-5a51e446d47b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.202 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.202 104584 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.203 104584 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.726 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc972f7-2f9e-4c5a-90b6-a1bdad82a09e]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.729 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, column=external_ids, values=({'neutron:ovn-metadata-id': '2017f5d6-7c32-5b30-92fd-9f8ba19f80f8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.741 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.750 104471 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.751 104471 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.751 104471 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.752 104471 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.752 104471 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.752 104471 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.753 104471 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.753 104471 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.754 104471 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.754 104471 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.755 104471 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.755 104471 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.756 104471 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.756 104471 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.757 104471 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.757 104471 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.758 104471 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.758 104471 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.759 104471 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.759 104471 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.759 104471 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.760 104471 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.761 104471 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.761 104471 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.761 104471 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.761 104471 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.762 104471 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.762 104471 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.762 104471 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.762 104471 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.763 104471 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.763 104471 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.763 104471 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.763 104471 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.764 104471 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.764 104471 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.764 104471 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.764 104471 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.765 104471 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.765 104471 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.765 104471 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.765 104471 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:46:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 11:46:07 compute-0 sshd-session[104589]: Accepted publickey for zuul from 192.168.122.30 port 48946 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:46:07 compute-0 systemd-logind[792]: New session 22 of user zuul.
Dec 05 11:46:07 compute-0 systemd[1]: Started Session 22 of User zuul.
Dec 05 11:46:07 compute-0 sshd-session[104589]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:46:07 compute-0 podman[104591]: 2025-12-05 11:46:07.23169546 +0000 UTC m=+0.094690934 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:46:08 compute-0 python3.9[104769]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:46:09 compute-0 sudo[104923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tezlthhyjkskzcictsdkvaybeygolxml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935168.6990647-34-25454645187679/AnsiballZ_command.py'
Dec 05 11:46:09 compute-0 sudo[104923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:09 compute-0 python3.9[104925]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:46:09 compute-0 sudo[104923]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:10 compute-0 sudo[105088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmdmuawyemzhjholeppudeayempzlxqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935169.7931633-45-105996868142335/AnsiballZ_systemd_service.py'
Dec 05 11:46:10 compute-0 sudo[105088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:10 compute-0 python3.9[105090]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:46:10 compute-0 systemd[1]: Reloading.
Dec 05 11:46:10 compute-0 systemd-rc-local-generator[105118]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:46:10 compute-0 systemd-sysv-generator[105121]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:46:11 compute-0 sudo[105088]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:11 compute-0 python3.9[105275]: ansible-ansible.builtin.service_facts Invoked
Dec 05 11:46:12 compute-0 network[105292]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 11:46:12 compute-0 network[105293]: 'network-scripts' will be removed from distribution in near future.
Dec 05 11:46:12 compute-0 network[105294]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 11:46:18 compute-0 sudo[105553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnumlrnykohbvmavrcvvvqnkcugqrvrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935178.2565598-64-83957721545291/AnsiballZ_systemd_service.py'
Dec 05 11:46:18 compute-0 sudo[105553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:18 compute-0 python3.9[105555]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:46:18 compute-0 sudo[105553]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:19 compute-0 sudo[105706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfoyesnjmuqnvlctptggelyjrsdydgft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935179.0696003-64-280305513923559/AnsiballZ_systemd_service.py'
Dec 05 11:46:19 compute-0 sudo[105706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:19 compute-0 python3.9[105708]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:46:19 compute-0 sudo[105706]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:20 compute-0 sudo[105859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwhvmbuqyrsjufclnouahwhlybtwazbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935179.8790119-64-11886972831132/AnsiballZ_systemd_service.py'
Dec 05 11:46:20 compute-0 sudo[105859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:20 compute-0 python3.9[105861]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:46:20 compute-0 sudo[105859]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:21 compute-0 sudo[106012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuopiupmmothjixgpnlfczuxriffabna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935180.680641-64-201051667289840/AnsiballZ_systemd_service.py'
Dec 05 11:46:21 compute-0 sudo[106012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:21 compute-0 python3.9[106014]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:46:21 compute-0 sudo[106012]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:21 compute-0 sudo[106165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elrmlgxivtomeubdsrabdwxalrnzbovu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935181.5012991-64-70748753710422/AnsiballZ_systemd_service.py'
Dec 05 11:46:21 compute-0 sudo[106165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:22 compute-0 python3.9[106167]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:46:22 compute-0 sudo[106165]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:22 compute-0 sudo[106318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feheytkhlghrgvcotikxikofjlvdynie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935182.3081567-64-32381003284803/AnsiballZ_systemd_service.py'
Dec 05 11:46:22 compute-0 sudo[106318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:22 compute-0 python3.9[106320]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:46:22 compute-0 sudo[106318]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:23 compute-0 sudo[106471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icfzayulsltuznpffocmgckwbeldqrrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935183.11896-64-214839932350819/AnsiballZ_systemd_service.py'
Dec 05 11:46:23 compute-0 sudo[106471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:23 compute-0 python3.9[106473]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:46:23 compute-0 sudo[106471]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:24 compute-0 sudo[106624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqzbkswnkhxdowtqfvycquvmesfedydr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935184.0421505-116-209060764924844/AnsiballZ_file.py'
Dec 05 11:46:24 compute-0 sudo[106624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:24 compute-0 python3.9[106626]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:24 compute-0 sudo[106624]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:25 compute-0 sudo[106776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwoluqsfxnsrzuihrdyiepsopezahldd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935184.853995-116-30293775518486/AnsiballZ_file.py'
Dec 05 11:46:25 compute-0 sudo[106776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:25 compute-0 python3.9[106778]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:25 compute-0 sudo[106776]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:25 compute-0 sudo[106928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwbmmubahvlugrqwrsjfotijudkrlzyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935185.5269554-116-144652084265982/AnsiballZ_file.py'
Dec 05 11:46:25 compute-0 sudo[106928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:25 compute-0 python3.9[106930]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:25 compute-0 sudo[106928]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:26 compute-0 sudo[107080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsqnotpdjbqqspltzybrmxqlczjcuarh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935186.1023145-116-145168374423617/AnsiballZ_file.py'
Dec 05 11:46:26 compute-0 sudo[107080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:26 compute-0 python3.9[107082]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:26 compute-0 sudo[107080]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:26 compute-0 sudo[107232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwpxaexvbelspdmesnvpakcjayikpazo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935186.6404645-116-129298789200533/AnsiballZ_file.py'
Dec 05 11:46:26 compute-0 sudo[107232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:27 compute-0 python3.9[107234]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:27 compute-0 sudo[107232]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:27 compute-0 sudo[107384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilylqdbkcmugrpshzwlxlajqzcytfywo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935187.2215664-116-92287972211400/AnsiballZ_file.py'
Dec 05 11:46:27 compute-0 sudo[107384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:27 compute-0 python3.9[107386]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:27 compute-0 sudo[107384]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:28 compute-0 sudo[107536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwtdsdowjglboofasczonvyhrgyggrzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935187.9243462-116-195111481842318/AnsiballZ_file.py'
Dec 05 11:46:28 compute-0 sudo[107536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:28 compute-0 python3.9[107538]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:28 compute-0 sudo[107536]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:28 compute-0 sudo[107688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyjriujoinuvldhcohcwodrgeskwnsyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935188.5761464-166-79105273572448/AnsiballZ_file.py'
Dec 05 11:46:28 compute-0 sudo[107688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:29 compute-0 python3.9[107690]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:29 compute-0 sudo[107688]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:29 compute-0 sudo[107840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebetpowdrqijnngmiiyvyoqlfxsytqon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935189.170326-166-49834857933820/AnsiballZ_file.py'
Dec 05 11:46:29 compute-0 sudo[107840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:29 compute-0 python3.9[107842]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:29 compute-0 sudo[107840]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:30 compute-0 sudo[107992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkzkfpwlerjcjfbfzvuhhqsccwyoleac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935189.8245904-166-100792053387059/AnsiballZ_file.py'
Dec 05 11:46:30 compute-0 sudo[107992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:30 compute-0 python3.9[107994]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:30 compute-0 sudo[107992]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:30 compute-0 sudo[108144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roklweifzxxkkyokacwnotpatnmzbogd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935190.4888108-166-167805404892584/AnsiballZ_file.py'
Dec 05 11:46:30 compute-0 sudo[108144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:30 compute-0 python3.9[108146]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:30 compute-0 sudo[108144]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:31 compute-0 sudo[108308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syvgprnhtuiubnvhjragdbzlhtdpivbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935191.1314466-166-89005308311795/AnsiballZ_file.py'
Dec 05 11:46:31 compute-0 sudo[108308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:31 compute-0 podman[108270]: 2025-12-05 11:46:31.479866508 +0000 UTC m=+0.053112460 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 11:46:31 compute-0 python3.9[108316]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:31 compute-0 sudo[108308]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:32 compute-0 sudo[108467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozsnfldmybfhhpiyjcxuolnjjrlhmcuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935191.8041975-166-196814822043900/AnsiballZ_file.py'
Dec 05 11:46:32 compute-0 sudo[108467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:32 compute-0 python3.9[108469]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:32 compute-0 sudo[108467]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:32 compute-0 sudo[108619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqtzpzhtqgklvxsfkfgtbveyfnugrkbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935192.4018795-166-125038824093363/AnsiballZ_file.py'
Dec 05 11:46:32 compute-0 sudo[108619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:32 compute-0 python3.9[108621]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:46:32 compute-0 sudo[108619]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:33 compute-0 sudo[108771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylqyayphjjfmcmbsfppnlercwuceorqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935193.2146583-217-132544589363848/AnsiballZ_command.py'
Dec 05 11:46:33 compute-0 sudo[108771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:33 compute-0 python3.9[108773]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:46:33 compute-0 sudo[108771]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:34 compute-0 python3.9[108925]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 11:46:35 compute-0 sudo[109075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdezpgxygqndeabtpwgrcjkctdqcxgvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935194.758287-235-243772429971034/AnsiballZ_systemd_service.py'
Dec 05 11:46:35 compute-0 sudo[109075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:35 compute-0 python3.9[109077]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:46:35 compute-0 systemd[1]: Reloading.
Dec 05 11:46:35 compute-0 systemd-rc-local-generator[109103]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:46:35 compute-0 systemd-sysv-generator[109109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:46:35 compute-0 sudo[109075]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:36 compute-0 sudo[109262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jecpbexoasmbmgtfbeogrtzjrlpjxfxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935195.7635193-243-122347684323442/AnsiballZ_command.py'
Dec 05 11:46:36 compute-0 sudo[109262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:36 compute-0 python3.9[109264]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:46:36 compute-0 sudo[109262]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:36 compute-0 sudo[109415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utxwqyqtbdhononjtnunsebcyxnmedhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935196.4752939-243-137262560025871/AnsiballZ_command.py'
Dec 05 11:46:36 compute-0 sudo[109415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:36 compute-0 python3.9[109417]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:46:36 compute-0 sudo[109415]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:37 compute-0 sudo[109579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqtjtogcbdrqauuccvqqwbbzswastpcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935197.1071196-243-97550962656587/AnsiballZ_command.py'
Dec 05 11:46:37 compute-0 sudo[109579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:37 compute-0 podman[109542]: 2025-12-05 11:46:37.535967112 +0000 UTC m=+0.149789326 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 11:46:37 compute-0 python3.9[109581]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:46:37 compute-0 sudo[109579]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:38 compute-0 sudo[109745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efrmkzvyhxcxvkporedjsvzvxluozsdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935197.7878413-243-121953087628090/AnsiballZ_command.py'
Dec 05 11:46:38 compute-0 sudo[109745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:38 compute-0 python3.9[109747]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:46:38 compute-0 sudo[109745]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:38 compute-0 sudo[109898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvmgfnsltldxvggqsbfcdattuojaabdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935198.4029682-243-77062117490919/AnsiballZ_command.py'
Dec 05 11:46:38 compute-0 sudo[109898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:38 compute-0 python3.9[109900]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:46:38 compute-0 sudo[109898]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:39 compute-0 sudo[110051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgbkevsflpjbdtwthhgecqgkxhzgjnrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935199.178462-243-231651389632721/AnsiballZ_command.py'
Dec 05 11:46:39 compute-0 sudo[110051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:39 compute-0 python3.9[110053]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:46:39 compute-0 sudo[110051]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:40 compute-0 sudo[110204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldwzlvndglqnsqcmjapjxghmcuyizyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935199.826528-243-259109936066023/AnsiballZ_command.py'
Dec 05 11:46:40 compute-0 sudo[110204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:40 compute-0 python3.9[110206]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:46:40 compute-0 sudo[110204]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:41 compute-0 sudo[110357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lubaggtnfmkmaehdtitgmzxfmpkgrxqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935200.7417603-297-85363943765227/AnsiballZ_getent.py'
Dec 05 11:46:41 compute-0 sudo[110357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:41 compute-0 python3.9[110359]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 05 11:46:41 compute-0 sudo[110357]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:41 compute-0 sudo[110510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytrzsuqxkhoursymsltukkuxnhmldomy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935201.5781393-305-220287189280547/AnsiballZ_group.py'
Dec 05 11:46:41 compute-0 sudo[110510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:42 compute-0 python3.9[110512]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 11:46:42 compute-0 groupadd[110513]: group added to /etc/group: name=libvirt, GID=42473
Dec 05 11:46:42 compute-0 groupadd[110513]: group added to /etc/gshadow: name=libvirt
Dec 05 11:46:42 compute-0 groupadd[110513]: new group: name=libvirt, GID=42473
Dec 05 11:46:42 compute-0 sudo[110510]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:42 compute-0 sudo[110668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lspogpzloazcjbkbsaufqbihieapjudh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935202.442098-313-192633438354840/AnsiballZ_user.py'
Dec 05 11:46:42 compute-0 sudo[110668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:43 compute-0 python3.9[110670]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 11:46:43 compute-0 useradd[110672]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 11:46:43 compute-0 sudo[110668]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:43 compute-0 sudo[110828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niczdtwytwvggczsfylvcygegfinwxii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935203.5252383-324-11836823188561/AnsiballZ_setup.py'
Dec 05 11:46:43 compute-0 sudo[110828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:44 compute-0 python3.9[110830]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:46:44 compute-0 sudo[110828]: pam_unix(sudo:session): session closed for user root
Dec 05 11:46:44 compute-0 sudo[110912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byasmvmahfdpoqtlckpvyljoddcfdqej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935203.5252383-324-11836823188561/AnsiballZ_dnf.py'
Dec 05 11:46:44 compute-0 sudo[110912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:46:44 compute-0 python3.9[110914]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:47:01 compute-0 podman[111099]: 2025-12-05 11:47:01.790515582 +0000 UTC m=+0.087539571 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 11:47:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:47:02.988 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:47:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:47:02.988 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:47:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:47:02.989 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:47:08 compute-0 podman[111124]: 2025-12-05 11:47:08.246007172 +0000 UTC m=+0.095043453 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 11:47:13 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Dec 05 11:47:13 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:47:13 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 11:47:13 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:47:13 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:47:13 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:47:13 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:47:13 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:47:22 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Dec 05 11:47:22 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:47:22 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 11:47:22 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:47:22 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:47:22 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:47:22 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:47:22 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:47:32 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 05 11:47:32 compute-0 podman[111167]: 2025-12-05 11:47:32.239758116 +0000 UTC m=+0.077208780 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 05 11:47:39 compute-0 podman[114096]: 2025-12-05 11:47:39.250868383 +0000 UTC m=+0.093691341 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 11:48:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:48:02.989 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:48:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:48:02.989 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:48:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:48:02.990 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:48:03 compute-0 podman[128002]: 2025-12-05 11:48:03.205568674 +0000 UTC m=+0.057632373 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 11:48:10 compute-0 podman[128037]: 2025-12-05 11:48:10.22988219 +0000 UTC m=+0.082069776 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 05 11:48:15 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Dec 05 11:48:15 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 11:48:15 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 11:48:15 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 11:48:15 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 11:48:15 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 11:48:15 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 11:48:15 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 11:48:16 compute-0 groupadd[128076]: group added to /etc/group: name=dnsmasq, GID=992
Dec 05 11:48:16 compute-0 groupadd[128076]: group added to /etc/gshadow: name=dnsmasq
Dec 05 11:48:16 compute-0 groupadd[128076]: new group: name=dnsmasq, GID=992
Dec 05 11:48:16 compute-0 useradd[128083]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 05 11:48:16 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec 05 11:48:16 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 05 11:48:16 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec 05 11:48:17 compute-0 groupadd[128096]: group added to /etc/group: name=clevis, GID=991
Dec 05 11:48:17 compute-0 groupadd[128096]: group added to /etc/gshadow: name=clevis
Dec 05 11:48:17 compute-0 groupadd[128096]: new group: name=clevis, GID=991
Dec 05 11:48:17 compute-0 useradd[128103]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 05 11:48:17 compute-0 usermod[128113]: add 'clevis' to group 'tss'
Dec 05 11:48:17 compute-0 usermod[128113]: add 'clevis' to shadow group 'tss'
Dec 05 11:48:19 compute-0 polkitd[43701]: Reloading rules
Dec 05 11:48:19 compute-0 polkitd[43701]: Collecting garbage unconditionally...
Dec 05 11:48:19 compute-0 polkitd[43701]: Loading rules from directory /etc/polkit-1/rules.d
Dec 05 11:48:19 compute-0 polkitd[43701]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 05 11:48:19 compute-0 polkitd[43701]: Finished loading, compiling and executing 3 rules
Dec 05 11:48:19 compute-0 polkitd[43701]: Reloading rules
Dec 05 11:48:19 compute-0 polkitd[43701]: Collecting garbage unconditionally...
Dec 05 11:48:19 compute-0 polkitd[43701]: Loading rules from directory /etc/polkit-1/rules.d
Dec 05 11:48:19 compute-0 polkitd[43701]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 05 11:48:19 compute-0 polkitd[43701]: Finished loading, compiling and executing 3 rules
Dec 05 11:48:21 compute-0 groupadd[128300]: group added to /etc/group: name=ceph, GID=167
Dec 05 11:48:21 compute-0 groupadd[128300]: group added to /etc/gshadow: name=ceph
Dec 05 11:48:21 compute-0 groupadd[128300]: new group: name=ceph, GID=167
Dec 05 11:48:21 compute-0 useradd[128306]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 05 11:48:23 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Dec 05 11:48:23 compute-0 sshd[1005]: Received signal 15; terminating.
Dec 05 11:48:23 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Dec 05 11:48:23 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Dec 05 11:48:23 compute-0 systemd[1]: sshd.service: Consumed 2.960s CPU time, read 32.0K from disk, written 4.0K to disk.
Dec 05 11:48:23 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Dec 05 11:48:23 compute-0 systemd[1]: Stopping sshd-keygen.target...
Dec 05 11:48:23 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 11:48:23 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 11:48:23 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 11:48:23 compute-0 systemd[1]: Reached target sshd-keygen.target.
Dec 05 11:48:23 compute-0 systemd[1]: Starting OpenSSH server daemon...
Dec 05 11:48:23 compute-0 sshd[128825]: Server listening on 0.0.0.0 port 22.
Dec 05 11:48:23 compute-0 sshd[128825]: Server listening on :: port 22.
Dec 05 11:48:23 compute-0 systemd[1]: Started OpenSSH server daemon.
Dec 05 11:48:25 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 11:48:25 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 11:48:26 compute-0 systemd[1]: Reloading.
Dec 05 11:48:26 compute-0 systemd-sysv-generator[129083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:26 compute-0 systemd-rc-local-generator[129080]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:26 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 11:48:29 compute-0 sudo[110912]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:30 compute-0 sudo[134056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zygwuikznedazhmbubmfbgnidbmcomcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935309.8533316-336-247464737005546/AnsiballZ_systemd.py'
Dec 05 11:48:30 compute-0 sudo[134056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:30 compute-0 python3.9[134081]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 11:48:30 compute-0 systemd[1]: Reloading.
Dec 05 11:48:30 compute-0 systemd-rc-local-generator[134499]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:30 compute-0 systemd-sysv-generator[134507]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:31 compute-0 sudo[134056]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:31 compute-0 sudo[135300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htbtmqbuqhzsdzdaljllbuvyjeuhsftn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935311.1898322-336-46860721472885/AnsiballZ_systemd.py'
Dec 05 11:48:31 compute-0 sudo[135300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:31 compute-0 python3.9[135327]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 11:48:31 compute-0 systemd[1]: Reloading.
Dec 05 11:48:31 compute-0 systemd-rc-local-generator[135698]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:31 compute-0 systemd-sysv-generator[135701]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:32 compute-0 sudo[135300]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:32 compute-0 sudo[136521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsfyemxtmogibdeyqkzcyizgbzcbaslp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935312.3142073-336-32315967897751/AnsiballZ_systemd.py'
Dec 05 11:48:32 compute-0 sudo[136521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:32 compute-0 python3.9[136548]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 11:48:32 compute-0 systemd[1]: Reloading.
Dec 05 11:48:32 compute-0 systemd-rc-local-generator[137038]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:33 compute-0 systemd-sysv-generator[137042]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:33 compute-0 sudo[136521]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:33 compute-0 sudo[137865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmszyguvvfpksruxribjhmlnacmsghpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935313.327153-336-180441464970904/AnsiballZ_systemd.py'
Dec 05 11:48:33 compute-0 sudo[137865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:33 compute-0 podman[137771]: 2025-12-05 11:48:33.645977819 +0000 UTC m=+0.060523163 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 05 11:48:33 compute-0 python3.9[137895]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 11:48:34 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 11:48:34 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 11:48:34 compute-0 systemd[1]: man-db-cache-update.service: Consumed 10.293s CPU time.
Dec 05 11:48:34 compute-0 systemd[1]: run-r11ad2af578eb459cb50d0e03494c13de.service: Deactivated successfully.
Dec 05 11:48:34 compute-0 systemd[1]: Reloading.
Dec 05 11:48:34 compute-0 systemd-sysv-generator[138235]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:34 compute-0 systemd-rc-local-generator[138230]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:34 compute-0 sudo[137865]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:34 compute-0 sudo[138389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibbbjtukobuethpcbihofcglqchljgyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935314.5542393-365-46055610197072/AnsiballZ_systemd.py'
Dec 05 11:48:34 compute-0 sudo[138389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:35 compute-0 python3.9[138391]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:35 compute-0 systemd[1]: Reloading.
Dec 05 11:48:35 compute-0 systemd-rc-local-generator[138422]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:35 compute-0 systemd-sysv-generator[138427]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:35 compute-0 sudo[138389]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:35 compute-0 sudo[138580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olimgfyhwyyrcgjlwyemrcdjaxnrhbxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935315.645247-365-250044146345450/AnsiballZ_systemd.py'
Dec 05 11:48:35 compute-0 sudo[138580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:36 compute-0 python3.9[138582]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:36 compute-0 systemd[1]: Reloading.
Dec 05 11:48:36 compute-0 systemd-rc-local-generator[138609]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:36 compute-0 systemd-sysv-generator[138614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:36 compute-0 sudo[138580]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:37 compute-0 sudo[138770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfjxzfmoanzkrctpnnmbynhwypmnpiah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935316.8009105-365-209514388533594/AnsiballZ_systemd.py'
Dec 05 11:48:37 compute-0 sudo[138770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:37 compute-0 python3.9[138772]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:37 compute-0 systemd[1]: Reloading.
Dec 05 11:48:37 compute-0 systemd-rc-local-generator[138799]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:37 compute-0 systemd-sysv-generator[138806]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:37 compute-0 sudo[138770]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:38 compute-0 sudo[138960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hacdlomvxvdcbqgwopcijxphpgdvqpzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935317.9314857-365-111588379667587/AnsiballZ_systemd.py'
Dec 05 11:48:38 compute-0 sudo[138960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:38 compute-0 python3.9[138962]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:38 compute-0 sudo[138960]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:39 compute-0 sudo[139115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltatzqdtngbeclelomjzxfbwavhctcgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935318.848588-365-163420196875868/AnsiballZ_systemd.py'
Dec 05 11:48:39 compute-0 sudo[139115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:39 compute-0 python3.9[139117]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:39 compute-0 systemd[1]: Reloading.
Dec 05 11:48:39 compute-0 systemd-sysv-generator[139151]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:39 compute-0 systemd-rc-local-generator[139147]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:39 compute-0 sudo[139115]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:40 compute-0 sudo[139317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrlfaknfipmnmqtjkysuejwsnhcsvalo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935320.064885-401-69293626507931/AnsiballZ_systemd.py'
Dec 05 11:48:40 compute-0 sudo[139317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:40 compute-0 podman[139278]: 2025-12-05 11:48:40.417111205 +0000 UTC m=+0.096780448 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 11:48:40 compute-0 python3.9[139325]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 11:48:40 compute-0 systemd[1]: Reloading.
Dec 05 11:48:40 compute-0 systemd-rc-local-generator[139362]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:48:40 compute-0 systemd-sysv-generator[139365]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:48:41 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 05 11:48:41 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 05 11:48:41 compute-0 sudo[139317]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:41 compute-0 sudo[139524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlxjuazlkawgwmqrqznakeuwvoiwdgmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935321.222515-409-262664025302336/AnsiballZ_systemd.py'
Dec 05 11:48:41 compute-0 sudo[139524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:41 compute-0 python3.9[139526]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:41 compute-0 sudo[139524]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:42 compute-0 sudo[139679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcirdzidpmddoqhpjczhrenyhgaqiias ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935322.0629804-409-266635614707244/AnsiballZ_systemd.py'
Dec 05 11:48:42 compute-0 sudo[139679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:42 compute-0 python3.9[139681]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:42 compute-0 sudo[139679]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:43 compute-0 sudo[139834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsghzpcvvbevpppvabgqyovlmfselesd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935322.9712386-409-219395628704954/AnsiballZ_systemd.py'
Dec 05 11:48:43 compute-0 sudo[139834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:43 compute-0 python3.9[139836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:43 compute-0 sudo[139834]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:44 compute-0 sudo[139989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjggddiqeoxsjhstcbppkkdsqsvbvcqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935323.8945866-409-147342444698363/AnsiballZ_systemd.py'
Dec 05 11:48:44 compute-0 sudo[139989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:44 compute-0 python3.9[139991]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:44 compute-0 sudo[139989]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:45 compute-0 sudo[140144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tufvezvyswzokajmiochzdzqmfjmbywk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935324.8141968-409-133654814759134/AnsiballZ_systemd.py'
Dec 05 11:48:45 compute-0 sudo[140144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:45 compute-0 python3.9[140146]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:45 compute-0 sudo[140144]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:45 compute-0 sudo[140299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fheqizsbqbrraqddyggygeubyprxtoei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935325.6945078-409-217852962585255/AnsiballZ_systemd.py'
Dec 05 11:48:45 compute-0 sudo[140299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:46 compute-0 python3.9[140301]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:46 compute-0 sudo[140299]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:46 compute-0 sudo[140454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kulegywnhiywbmgrpvrmsyimkegudgoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935326.4946983-409-165566836740840/AnsiballZ_systemd.py'
Dec 05 11:48:46 compute-0 sudo[140454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:47 compute-0 python3.9[140456]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:47 compute-0 sudo[140454]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:47 compute-0 sudo[140609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnaygtxliqyqfyudyohootqicdwyinfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935327.3635705-409-159046763313590/AnsiballZ_systemd.py'
Dec 05 11:48:47 compute-0 sudo[140609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:47 compute-0 python3.9[140611]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:49 compute-0 sudo[140609]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:49 compute-0 sudo[140764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwoeqqztatupnbbwrqahqoxtdrtssila ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935329.447728-409-275372634164107/AnsiballZ_systemd.py'
Dec 05 11:48:49 compute-0 sudo[140764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:50 compute-0 python3.9[140766]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:50 compute-0 sudo[140764]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:50 compute-0 sudo[140919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twzvplalbdghtuogfieyhrazjvfstcba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935330.3560886-409-174166720036723/AnsiballZ_systemd.py'
Dec 05 11:48:50 compute-0 sudo[140919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:50 compute-0 python3.9[140921]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:51 compute-0 sudo[140919]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:51 compute-0 sudo[141074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewaexrxkgnzbpjjgqjaucfwmeejgulle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935331.2279289-409-226888729651365/AnsiballZ_systemd.py'
Dec 05 11:48:51 compute-0 sudo[141074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:51 compute-0 python3.9[141076]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:51 compute-0 sudo[141074]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:52 compute-0 sudo[141229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbefxokrrqneobtduqsihpxhscanjayp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935332.052123-409-172996953059223/AnsiballZ_systemd.py'
Dec 05 11:48:52 compute-0 sudo[141229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:52 compute-0 python3.9[141231]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:52 compute-0 sudo[141229]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:53 compute-0 sudo[141384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwxhkmdzvwjmbhskfugntknctkgqbnqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935332.849337-409-190532227334789/AnsiballZ_systemd.py'
Dec 05 11:48:53 compute-0 sudo[141384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:53 compute-0 python3.9[141386]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:53 compute-0 sudo[141384]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:53 compute-0 sudo[141539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orxrfcqxprnwlvxfxlojzrevikglpikg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935333.6310651-409-86959790977602/AnsiballZ_systemd.py'
Dec 05 11:48:53 compute-0 sudo[141539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:54 compute-0 python3.9[141541]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 11:48:54 compute-0 sudo[141539]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:55 compute-0 sudo[141694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzxpizhnmrzhirjdjkfbvhjmhnnkaoaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935334.7803402-511-271126188605355/AnsiballZ_file.py'
Dec 05 11:48:55 compute-0 sudo[141694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:55 compute-0 python3.9[141696]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:48:55 compute-0 sudo[141694]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:55 compute-0 sudo[141846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrvknoukcwlswgputuwjghrrnjklfxbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935335.4285378-511-24436284783494/AnsiballZ_file.py'
Dec 05 11:48:55 compute-0 sudo[141846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:55 compute-0 python3.9[141848]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:48:55 compute-0 sudo[141846]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:56 compute-0 sudo[141998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdgfjzgcckqvolvquwtuotgjlqzlyawn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935336.1442478-511-161385680120052/AnsiballZ_file.py'
Dec 05 11:48:56 compute-0 sudo[141998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:56 compute-0 python3.9[142000]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:48:56 compute-0 sudo[141998]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:57 compute-0 sudo[142150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzepiymckzrfytvpzmzgvnndbkaxehbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935336.846328-511-184979235434881/AnsiballZ_file.py'
Dec 05 11:48:57 compute-0 sudo[142150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:57 compute-0 python3.9[142152]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:48:57 compute-0 sudo[142150]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:57 compute-0 sudo[142302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvqlwgnivlfglwradszuhurdgbtseine ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935337.4931169-511-270615071975870/AnsiballZ_file.py'
Dec 05 11:48:57 compute-0 sudo[142302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:57 compute-0 python3.9[142304]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:48:58 compute-0 sudo[142302]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:58 compute-0 sudo[142454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eewbwjmohgvobnbxrgqwwhmtdvnnwnbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935338.154464-511-215061764297018/AnsiballZ_file.py'
Dec 05 11:48:58 compute-0 sudo[142454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:58 compute-0 python3.9[142456]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:48:58 compute-0 sudo[142454]: pam_unix(sudo:session): session closed for user root
Dec 05 11:48:59 compute-0 sudo[142606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggfzaijuoerpieptzpmyfmnibkbxponl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935338.8307583-554-171782957020572/AnsiballZ_stat.py'
Dec 05 11:48:59 compute-0 sudo[142606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:48:59 compute-0 python3.9[142608]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:48:59 compute-0 sudo[142606]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:00 compute-0 sudo[142731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpmlrnolbhxgefjrunjtzmzgscfwdyke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935338.8307583-554-171782957020572/AnsiballZ_copy.py'
Dec 05 11:49:00 compute-0 sudo[142731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:00 compute-0 python3.9[142733]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935338.8307583-554-171782957020572/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:00 compute-0 sudo[142731]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:00 compute-0 sudo[142883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwtuvdnqgyvwyhafziuoehyjswtjsqkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935340.380457-554-254657543180213/AnsiballZ_stat.py'
Dec 05 11:49:00 compute-0 sudo[142883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:00 compute-0 python3.9[142885]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:00 compute-0 sudo[142883]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:01 compute-0 sudo[143008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crojhndtwegafjllsjqsoobnqcgntjpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935340.380457-554-254657543180213/AnsiballZ_copy.py'
Dec 05 11:49:01 compute-0 sudo[143008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:01 compute-0 python3.9[143010]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935340.380457-554-254657543180213/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:01 compute-0 sudo[143008]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:01 compute-0 sudo[143160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgjcmlmntdyzornsceyrzjfcejirgeun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935341.596943-554-11611920050830/AnsiballZ_stat.py'
Dec 05 11:49:01 compute-0 sudo[143160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:02 compute-0 python3.9[143162]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:02 compute-0 sudo[143160]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:02 compute-0 sudo[143285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otcxeojlwdtmktxvbzufoxuuoyfwoihj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935341.596943-554-11611920050830/AnsiballZ_copy.py'
Dec 05 11:49:02 compute-0 sudo[143285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:02 compute-0 python3.9[143287]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935341.596943-554-11611920050830/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:02 compute-0 sudo[143285]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:49:02.990 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:49:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:49:02.992 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:49:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:49:02.992 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:49:03 compute-0 sudo[143437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfwigwmtttdzdwqprnqwdqrveacbukon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935342.9292514-554-219466482027564/AnsiballZ_stat.py'
Dec 05 11:49:03 compute-0 sudo[143437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:03 compute-0 python3.9[143439]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:03 compute-0 sudo[143437]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:03 compute-0 sudo[143572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkitubfgsgmnskanekzxxplqczoccxnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935342.9292514-554-219466482027564/AnsiballZ_copy.py'
Dec 05 11:49:03 compute-0 sudo[143572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:03 compute-0 podman[143536]: 2025-12-05 11:49:03.846405463 +0000 UTC m=+0.070176913 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 11:49:04 compute-0 python3.9[143579]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935342.9292514-554-219466482027564/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:04 compute-0 sudo[143572]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:04 compute-0 sudo[143731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujmnhfxoukuljihibxjxzfnjroskfszc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935344.2016222-554-201719294833372/AnsiballZ_stat.py'
Dec 05 11:49:04 compute-0 sudo[143731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:04 compute-0 python3.9[143733]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:04 compute-0 sudo[143731]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:05 compute-0 sudo[143856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdshfbgnrsqrnrsqcqmecrpozvffiulm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935344.2016222-554-201719294833372/AnsiballZ_copy.py'
Dec 05 11:49:05 compute-0 sudo[143856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:05 compute-0 python3.9[143858]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935344.2016222-554-201719294833372/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:05 compute-0 sudo[143856]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:05 compute-0 sudo[144008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etjicsvldzrxnwqsoynxbzuaexlrsxvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935345.437748-554-213687459831911/AnsiballZ_stat.py'
Dec 05 11:49:05 compute-0 sudo[144008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:05 compute-0 python3.9[144010]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:05 compute-0 sudo[144008]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:06 compute-0 sudo[144133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efxdsyfeohvewxveqmodxobrxspotuhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935345.437748-554-213687459831911/AnsiballZ_copy.py'
Dec 05 11:49:06 compute-0 sudo[144133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:06 compute-0 python3.9[144135]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935345.437748-554-213687459831911/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:06 compute-0 sudo[144133]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:06 compute-0 sudo[144285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smwkrmpsbkqeuorecynsrugsucqtdsol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935346.65228-554-195734451696297/AnsiballZ_stat.py'
Dec 05 11:49:06 compute-0 sudo[144285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:07 compute-0 python3.9[144287]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:07 compute-0 sudo[144285]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:07 compute-0 sudo[144408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojeoxpbrkwhmrylbzhuhoiipgxcngaxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935346.65228-554-195734451696297/AnsiballZ_copy.py'
Dec 05 11:49:07 compute-0 sudo[144408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:07 compute-0 python3.9[144410]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935346.65228-554-195734451696297/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:07 compute-0 sudo[144408]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:08 compute-0 sudo[144560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tstlkemsejungqqdpepvtdqnrnxhvlms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935347.8445086-554-126216714884499/AnsiballZ_stat.py'
Dec 05 11:49:08 compute-0 sudo[144560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:08 compute-0 python3.9[144562]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:08 compute-0 sudo[144560]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:08 compute-0 sudo[144685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxjwpsnigvamkrozwtlqhkmymhbqnwrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935347.8445086-554-126216714884499/AnsiballZ_copy.py'
Dec 05 11:49:08 compute-0 sudo[144685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:08 compute-0 python3.9[144687]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935347.8445086-554-126216714884499/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:08 compute-0 sudo[144685]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:09 compute-0 sudo[144837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azrrkgfdigtrymtdzisthcpwpqdkuefl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935349.1442294-667-40917016869745/AnsiballZ_command.py'
Dec 05 11:49:09 compute-0 sudo[144837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:09 compute-0 python3.9[144839]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 05 11:49:09 compute-0 sudo[144837]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:10 compute-0 sudo[144990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-milvcperwrzqytvttzhysrepydjxhgia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935349.916363-676-169167670226349/AnsiballZ_file.py'
Dec 05 11:49:10 compute-0 sudo[144990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:10 compute-0 python3.9[144992]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:10 compute-0 sudo[144990]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:10 compute-0 sudo[145153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmyhztjycvbpnfxvsziozwdqmazjejeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935350.5718937-676-209719826283898/AnsiballZ_file.py'
Dec 05 11:49:10 compute-0 sudo[145153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:10 compute-0 podman[145116]: 2025-12-05 11:49:10.912222149 +0000 UTC m=+0.087933080 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 11:49:11 compute-0 python3.9[145161]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:11 compute-0 sudo[145153]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:11 compute-0 sudo[145320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpptbwntwxblbuffnuhydudyqsjfwafo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935351.221027-676-39363985516465/AnsiballZ_file.py'
Dec 05 11:49:11 compute-0 sudo[145320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:11 compute-0 python3.9[145322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:11 compute-0 sudo[145320]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:12 compute-0 sudo[145472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krbxbaycrwobxmsckpyowozwuonvldjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935351.8799088-676-175238578278066/AnsiballZ_file.py'
Dec 05 11:49:12 compute-0 sudo[145472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:12 compute-0 python3.9[145474]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:12 compute-0 sudo[145472]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:12 compute-0 sudo[145624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lirbamewmxlltzassbafsgzouwrmcklm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935352.559064-676-236281275267842/AnsiballZ_file.py'
Dec 05 11:49:12 compute-0 sudo[145624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:13 compute-0 python3.9[145626]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:13 compute-0 sudo[145624]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:13 compute-0 sudo[145776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stpptujlrhdnmvmjzrlnzkzuidsyrugq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935353.2623544-676-83694091261705/AnsiballZ_file.py'
Dec 05 11:49:13 compute-0 sudo[145776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:13 compute-0 python3.9[145778]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:13 compute-0 sudo[145776]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:14 compute-0 sudo[145928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyodzvivaskatystunyttidoqevsmdts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935353.91329-676-98966059837488/AnsiballZ_file.py'
Dec 05 11:49:14 compute-0 sudo[145928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:14 compute-0 python3.9[145930]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:14 compute-0 sudo[145928]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:14 compute-0 sudo[146080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umgbjzojeoxgibsyoshydihmogowdvhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935354.5982656-676-557484484107/AnsiballZ_file.py'
Dec 05 11:49:14 compute-0 sudo[146080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:15 compute-0 python3.9[146082]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:15 compute-0 sudo[146080]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:15 compute-0 sudo[146232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruasuttpgfuodtrmcsanpybpnnvvzljo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935355.2415555-676-134633964011378/AnsiballZ_file.py'
Dec 05 11:49:15 compute-0 sudo[146232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:15 compute-0 python3.9[146234]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:15 compute-0 sudo[146232]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:16 compute-0 sudo[146384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sihktzzicsfsdyhvxuvdayfndmzdrfif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935355.933863-676-1115339002130/AnsiballZ_file.py'
Dec 05 11:49:16 compute-0 sudo[146384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:16 compute-0 python3.9[146386]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:16 compute-0 sudo[146384]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:16 compute-0 sudo[146536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfbmwwimsrpvcmmmrmujaqyfmhounerd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935356.6253314-676-61948094489196/AnsiballZ_file.py'
Dec 05 11:49:16 compute-0 sudo[146536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:17 compute-0 python3.9[146538]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:17 compute-0 sudo[146536]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:17 compute-0 sudo[146688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgpzthtvujuznakfpyacvmsueqdalmff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935357.3302922-676-207343005727653/AnsiballZ_file.py'
Dec 05 11:49:17 compute-0 sudo[146688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:17 compute-0 python3.9[146690]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:17 compute-0 sudo[146688]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:18 compute-0 sudo[146840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajfrkdhvdcauuepvcfsxfvcwbyhtrort ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935358.0243828-676-64093555740399/AnsiballZ_file.py'
Dec 05 11:49:18 compute-0 sudo[146840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:18 compute-0 python3.9[146842]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:18 compute-0 sudo[146840]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:19 compute-0 sudo[146992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmfpyigorscqjmkhxojondzspdsfntgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935358.7649384-676-272032827330816/AnsiballZ_file.py'
Dec 05 11:49:19 compute-0 sudo[146992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:19 compute-0 python3.9[146994]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:19 compute-0 sudo[146992]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:19 compute-0 sudo[147144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwklbytwdqzicctvytyfetkxxzmntatl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935359.5988936-775-263359113316731/AnsiballZ_stat.py'
Dec 05 11:49:19 compute-0 sudo[147144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:20 compute-0 python3.9[147146]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:20 compute-0 sudo[147144]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:20 compute-0 sudo[147267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhhuwenqtqigrxvwdfpuuhnrtwvvxicx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935359.5988936-775-263359113316731/AnsiballZ_copy.py'
Dec 05 11:49:20 compute-0 sudo[147267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:20 compute-0 python3.9[147269]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935359.5988936-775-263359113316731/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:20 compute-0 sudo[147267]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:21 compute-0 sudo[147419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hccpejvjqvhfgfsolpghbunwldfaslss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935360.8178632-775-215581831471210/AnsiballZ_stat.py'
Dec 05 11:49:21 compute-0 sudo[147419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:21 compute-0 python3.9[147421]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:21 compute-0 sudo[147419]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:21 compute-0 sudo[147542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmxlunojflbfnsylelerivxrsxgtypof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935360.8178632-775-215581831471210/AnsiballZ_copy.py'
Dec 05 11:49:21 compute-0 sudo[147542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:22 compute-0 python3.9[147544]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935360.8178632-775-215581831471210/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:22 compute-0 sudo[147542]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:22 compute-0 sudo[147694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igejpyyhtcwmanraobqvazlolkanosdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935362.2192955-775-105121960303728/AnsiballZ_stat.py'
Dec 05 11:49:22 compute-0 sudo[147694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:22 compute-0 python3.9[147696]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:22 compute-0 sudo[147694]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:23 compute-0 sudo[147817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcypyqkaoacyrixqeijsvnvseibwkmkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935362.2192955-775-105121960303728/AnsiballZ_copy.py'
Dec 05 11:49:23 compute-0 sudo[147817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:23 compute-0 python3.9[147819]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935362.2192955-775-105121960303728/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:23 compute-0 sudo[147817]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:23 compute-0 sudo[147969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnbomtefducczkutpkhuehhlzxbxgido ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935363.5355566-775-129591414834584/AnsiballZ_stat.py'
Dec 05 11:49:23 compute-0 sudo[147969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:24 compute-0 python3.9[147971]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:24 compute-0 sudo[147969]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:24 compute-0 sudo[148092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzprhnwffsunbiomcxgzkkvdtylpxfrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935363.5355566-775-129591414834584/AnsiballZ_copy.py'
Dec 05 11:49:24 compute-0 sudo[148092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:24 compute-0 python3.9[148094]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935363.5355566-775-129591414834584/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:24 compute-0 sudo[148092]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:25 compute-0 sudo[148244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxhyepzdpzbmnabngybzbzaobndwqtwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935364.9294007-775-108819657828384/AnsiballZ_stat.py'
Dec 05 11:49:25 compute-0 sudo[148244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:25 compute-0 python3.9[148246]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:25 compute-0 sudo[148244]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:25 compute-0 sudo[148367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzidgevzepvetyblfqbspwjtxrvcsmrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935364.9294007-775-108819657828384/AnsiballZ_copy.py'
Dec 05 11:49:25 compute-0 sudo[148367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:26 compute-0 python3.9[148369]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935364.9294007-775-108819657828384/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:26 compute-0 sudo[148367]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:26 compute-0 sudo[148519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajhqjzkewyopvwzimfpkrxnmolkhcgrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935366.3013575-775-238325479842852/AnsiballZ_stat.py'
Dec 05 11:49:26 compute-0 sudo[148519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:26 compute-0 python3.9[148521]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:26 compute-0 sudo[148519]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:27 compute-0 sudo[148642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtolsaqlwzvfldkgjcotzkhpfdpkvkep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935366.3013575-775-238325479842852/AnsiballZ_copy.py'
Dec 05 11:49:27 compute-0 sudo[148642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:27 compute-0 python3.9[148644]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935366.3013575-775-238325479842852/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:27 compute-0 sudo[148642]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:27 compute-0 sudo[148794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stpphnliemmumksubpdvltmczbqetzvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935367.5300677-775-130882944250722/AnsiballZ_stat.py'
Dec 05 11:49:27 compute-0 sudo[148794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:28 compute-0 python3.9[148796]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:28 compute-0 sudo[148794]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:28 compute-0 sudo[148917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmjoqicvmqbufeirzhjzjikgzgeelwmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935367.5300677-775-130882944250722/AnsiballZ_copy.py'
Dec 05 11:49:28 compute-0 sudo[148917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:28 compute-0 python3.9[148919]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935367.5300677-775-130882944250722/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:28 compute-0 sudo[148917]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:29 compute-0 sudo[149069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyogdeybngwrqonhdputtgzrdfpwtpab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935368.8490222-775-118927576611400/AnsiballZ_stat.py'
Dec 05 11:49:29 compute-0 sudo[149069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:29 compute-0 python3.9[149071]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:29 compute-0 sudo[149069]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:29 compute-0 sudo[149192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykbjbpxczntdwplfwdyqfbdpgylvpcyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935368.8490222-775-118927576611400/AnsiballZ_copy.py'
Dec 05 11:49:29 compute-0 sudo[149192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:30 compute-0 python3.9[149194]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935368.8490222-775-118927576611400/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:30 compute-0 sudo[149192]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:30 compute-0 sudo[149344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnapqamkdhkgdnwrhbsafbzpdingoitu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935370.1985319-775-56974481441264/AnsiballZ_stat.py'
Dec 05 11:49:30 compute-0 sudo[149344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:30 compute-0 python3.9[149346]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:30 compute-0 sudo[149344]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:31 compute-0 sudo[149467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxqhkbfwgzadoqtzmedzpnnvwgohkmmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935370.1985319-775-56974481441264/AnsiballZ_copy.py'
Dec 05 11:49:31 compute-0 sudo[149467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:31 compute-0 python3.9[149469]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935370.1985319-775-56974481441264/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:31 compute-0 sudo[149467]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:31 compute-0 sudo[149619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eityakwhiqkqgdpetrppwmdthjdeayrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935371.3998506-775-180221470410513/AnsiballZ_stat.py'
Dec 05 11:49:31 compute-0 sudo[149619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:31 compute-0 python3.9[149621]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:31 compute-0 sudo[149619]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:32 compute-0 sudo[149742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afzffbhxgnmpunqzzdwnbmksjoyfayqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935371.3998506-775-180221470410513/AnsiballZ_copy.py'
Dec 05 11:49:32 compute-0 sudo[149742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:32 compute-0 python3.9[149744]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935371.3998506-775-180221470410513/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:32 compute-0 sudo[149742]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:33 compute-0 sudo[149894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exgkqyyjryttzukqfqflcvqtueaywpdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935372.7705736-775-250998152773812/AnsiballZ_stat.py'
Dec 05 11:49:33 compute-0 sudo[149894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:33 compute-0 python3.9[149896]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:33 compute-0 sudo[149894]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:33 compute-0 sudo[150017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfuuoeuvhsnuwltqjulgnghacqtjdnar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935372.7705736-775-250998152773812/AnsiballZ_copy.py'
Dec 05 11:49:33 compute-0 sudo[150017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:33 compute-0 python3.9[150019]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935372.7705736-775-250998152773812/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:33 compute-0 sudo[150017]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:34 compute-0 podman[150090]: 2025-12-05 11:49:34.234655074 +0000 UTC m=+0.083845293 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 11:49:34 compute-0 sudo[150189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vunhjysyxnehdhxzafsdtlzdmjltcsog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935374.038747-775-177884112333369/AnsiballZ_stat.py'
Dec 05 11:49:34 compute-0 sudo[150189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:34 compute-0 python3.9[150191]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:34 compute-0 sudo[150189]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:35 compute-0 sudo[150312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hetfclypjporylpnaakryalfizopmhzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935374.038747-775-177884112333369/AnsiballZ_copy.py'
Dec 05 11:49:35 compute-0 sudo[150312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:35 compute-0 python3.9[150314]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935374.038747-775-177884112333369/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:35 compute-0 sudo[150312]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:35 compute-0 sudo[150464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syfaknintvjfxmvjrodhcaxdryyiuaku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935375.4071891-775-198862746481053/AnsiballZ_stat.py'
Dec 05 11:49:35 compute-0 sudo[150464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:35 compute-0 python3.9[150466]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:35 compute-0 sudo[150464]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:36 compute-0 sudo[150587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbdokgahwimfgugcovtjtartukfbqnjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935375.4071891-775-198862746481053/AnsiballZ_copy.py'
Dec 05 11:49:36 compute-0 sudo[150587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:36 compute-0 python3.9[150589]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935375.4071891-775-198862746481053/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:36 compute-0 sudo[150587]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:37 compute-0 sudo[150739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjcdfidkgnizzgmgrodqkbrtcwztchto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935376.6753912-775-193406779632992/AnsiballZ_stat.py'
Dec 05 11:49:37 compute-0 sudo[150739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:37 compute-0 python3.9[150741]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:37 compute-0 sudo[150739]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:37 compute-0 sudo[150862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whnnkhlzsidkvnfqyooqbwlfjpthvvsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935376.6753912-775-193406779632992/AnsiballZ_copy.py'
Dec 05 11:49:37 compute-0 sudo[150862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:37 compute-0 python3.9[150864]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935376.6753912-775-193406779632992/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:37 compute-0 sudo[150862]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:38 compute-0 python3.9[151014]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:49:39 compute-0 sudo[151167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uygglmlixkhzwdlnxirywottljnflour ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935378.8766243-981-188789233029590/AnsiballZ_seboolean.py'
Dec 05 11:49:39 compute-0 sudo[151167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:39 compute-0 python3.9[151169]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 05 11:49:40 compute-0 sudo[151167]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:41 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 05 11:49:41 compute-0 sudo[151343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsqlmawvjwdymkayqbmcnbcxlespzlmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935380.9286015-989-100240824541589/AnsiballZ_copy.py'
Dec 05 11:49:41 compute-0 sudo[151343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:41 compute-0 podman[151282]: 2025-12-05 11:49:41.296132401 +0000 UTC m=+0.124650186 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 11:49:41 compute-0 python3.9[151351]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:41 compute-0 sudo[151343]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:41 compute-0 sudo[151501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-devsbnxxmapijbokedhjwkthoyqjgzls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935381.6442053-989-81757664705160/AnsiballZ_copy.py'
Dec 05 11:49:41 compute-0 sudo[151501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:42 compute-0 python3.9[151503]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:42 compute-0 sudo[151501]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:42 compute-0 sudo[151653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhoaggjtbmktpugwxrdktczdgursnfnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935382.2832205-989-258528303173276/AnsiballZ_copy.py'
Dec 05 11:49:42 compute-0 sudo[151653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:42 compute-0 python3.9[151655]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:42 compute-0 sudo[151653]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:43 compute-0 sudo[151805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cucwmzecciisvhnpokxfwmtaletczrwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935382.9560702-989-218473310228853/AnsiballZ_copy.py'
Dec 05 11:49:43 compute-0 sudo[151805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:43 compute-0 python3.9[151807]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:43 compute-0 sudo[151805]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:43 compute-0 sudo[151957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzdxxguwjzebbsdbbghaudwfxqlplgsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935383.6374824-989-35644651622221/AnsiballZ_copy.py'
Dec 05 11:49:43 compute-0 sudo[151957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:44 compute-0 python3.9[151959]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:44 compute-0 sudo[151957]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:44 compute-0 sudo[152109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iprhvahorxkvglamgoexiuglkjtqvgxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935384.3887334-1025-132201450577645/AnsiballZ_copy.py'
Dec 05 11:49:44 compute-0 sudo[152109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:44 compute-0 python3.9[152111]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:44 compute-0 sudo[152109]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:45 compute-0 sudo[152261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cujuwlxlncpebfdltgtxhyxzhmmssllw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935385.106231-1025-131649922340612/AnsiballZ_copy.py'
Dec 05 11:49:45 compute-0 sudo[152261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:45 compute-0 python3.9[152263]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:45 compute-0 sudo[152261]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:46 compute-0 sudo[152413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deizrzqywdnbykdxaugcsakbaatcnfab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935385.745262-1025-150233891299887/AnsiballZ_copy.py'
Dec 05 11:49:46 compute-0 sudo[152413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:46 compute-0 python3.9[152415]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:46 compute-0 sudo[152413]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:46 compute-0 sudo[152565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cngwyrhvdhncotyznvslrcgeyueytpoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935386.438268-1025-15216923378868/AnsiballZ_copy.py'
Dec 05 11:49:46 compute-0 sudo[152565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:47 compute-0 python3.9[152567]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:47 compute-0 sudo[152565]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:47 compute-0 sudo[152717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auhhzjnqzbkrbqqgjazhbzilghhhsogo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935387.2482934-1025-201225074173044/AnsiballZ_copy.py'
Dec 05 11:49:47 compute-0 sudo[152717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:47 compute-0 python3.9[152719]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:47 compute-0 sudo[152717]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:48 compute-0 sudo[152869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbaydxqpvvpoxyldvdzvpjbczlrpjmwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935387.9206662-1061-149591299146564/AnsiballZ_systemd.py'
Dec 05 11:49:48 compute-0 sudo[152869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:49 compute-0 python3.9[152871]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:49:49 compute-0 systemd[1]: Reloading.
Dec 05 11:49:49 compute-0 systemd-rc-local-generator[152898]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:49:49 compute-0 systemd-sysv-generator[152902]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:49:49 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Dec 05 11:49:49 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Dec 05 11:49:49 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 05 11:49:49 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 05 11:49:49 compute-0 systemd[1]: Starting libvirt logging daemon...
Dec 05 11:49:49 compute-0 systemd[1]: Started libvirt logging daemon.
Dec 05 11:49:49 compute-0 sudo[152869]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:50 compute-0 sudo[153062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvrsvpjgghytzkvehlzvtzfbcveueilh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935390.0890527-1061-236750325767468/AnsiballZ_systemd.py'
Dec 05 11:49:50 compute-0 sudo[153062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:50 compute-0 python3.9[153064]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:49:50 compute-0 systemd[1]: Reloading.
Dec 05 11:49:50 compute-0 systemd-sysv-generator[153097]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:49:50 compute-0 systemd-rc-local-generator[153092]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:49:51 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 05 11:49:51 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 05 11:49:51 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 05 11:49:51 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 05 11:49:51 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 05 11:49:51 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 05 11:49:51 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 05 11:49:51 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 05 11:49:51 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 05 11:49:51 compute-0 sudo[153062]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:51 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 05 11:49:51 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 05 11:49:51 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 05 11:49:51 compute-0 sudo[153287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnogpgvstodizygqofaquxkgpxeyqzad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935391.368926-1061-26229655192016/AnsiballZ_systemd.py'
Dec 05 11:49:51 compute-0 sudo[153287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:51 compute-0 python3.9[153289]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:49:51 compute-0 systemd[1]: Reloading.
Dec 05 11:49:52 compute-0 systemd-rc-local-generator[153316]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:49:52 compute-0 systemd-sysv-generator[153319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:49:52 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 05 11:49:52 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 05 11:49:52 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 05 11:49:52 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 05 11:49:52 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 05 11:49:52 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 05 11:49:52 compute-0 sudo[153287]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:52 compute-0 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 2c1840a9-6925-41dc-8e3a-a6a8d8978d2f
Dec 05 11:49:52 compute-0 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 05 11:49:52 compute-0 sudo[153501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqmybemypdxgxrrkkcfumugkdexdfjvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935392.5386083-1061-71818927608268/AnsiballZ_systemd.py'
Dec 05 11:49:52 compute-0 sudo[153501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:53 compute-0 python3.9[153503]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:49:53 compute-0 systemd[1]: Reloading.
Dec 05 11:49:53 compute-0 systemd-rc-local-generator[153530]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:49:53 compute-0 systemd-sysv-generator[153534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:49:53 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Dec 05 11:49:53 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 05 11:49:53 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 05 11:49:53 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 05 11:49:53 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 05 11:49:53 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 05 11:49:53 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 05 11:49:53 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 05 11:49:53 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 05 11:49:53 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 05 11:49:53 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 05 11:49:53 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 05 11:49:53 compute-0 sudo[153501]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:54 compute-0 sudo[153716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulqbpafpkbhmlmoucasporhiwcrswguw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935393.7508821-1061-179375441075933/AnsiballZ_systemd.py'
Dec 05 11:49:54 compute-0 sudo[153716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:54 compute-0 python3.9[153718]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:49:54 compute-0 systemd[1]: Reloading.
Dec 05 11:49:54 compute-0 systemd-rc-local-generator[153746]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:49:54 compute-0 systemd-sysv-generator[153749]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:49:54 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Dec 05 11:49:54 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Dec 05 11:49:54 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 05 11:49:54 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 05 11:49:54 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 05 11:49:54 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 05 11:49:55 compute-0 systemd[1]: Starting libvirt secret daemon...
Dec 05 11:49:55 compute-0 systemd[1]: Started libvirt secret daemon.
Dec 05 11:49:55 compute-0 sudo[153716]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:55 compute-0 sudo[153928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqaoelibyrrmvfyewiagciqrjepkvmfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935395.3318324-1098-189115137080340/AnsiballZ_file.py'
Dec 05 11:49:55 compute-0 sudo[153928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:55 compute-0 python3.9[153930]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:55 compute-0 sudo[153928]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:56 compute-0 sudo[154080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siduvxfshthyslpsafcpavwbyytpndqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935396.0788748-1106-64677729924735/AnsiballZ_find.py'
Dec 05 11:49:56 compute-0 sudo[154080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:56 compute-0 python3.9[154082]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 11:49:56 compute-0 sudo[154080]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:57 compute-0 sudo[154232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhzidenvzqympmtfvfhjcftubdyppjhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935396.9463062-1120-92344443357151/AnsiballZ_stat.py'
Dec 05 11:49:57 compute-0 sudo[154232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:57 compute-0 python3.9[154234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:57 compute-0 sudo[154232]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:57 compute-0 sudo[154355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrxzaxcpycrlfgwneqhyctsuawrmcsza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935396.9463062-1120-92344443357151/AnsiballZ_copy.py'
Dec 05 11:49:57 compute-0 sudo[154355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:58 compute-0 python3.9[154357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935396.9463062-1120-92344443357151/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:58 compute-0 sudo[154355]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:58 compute-0 sudo[154507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clclqvqbauoklgkrswndmulylmnynhgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935398.4865649-1136-81214751782188/AnsiballZ_file.py'
Dec 05 11:49:58 compute-0 sudo[154507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:59 compute-0 python3.9[154509]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:49:59 compute-0 sudo[154507]: pam_unix(sudo:session): session closed for user root
Dec 05 11:49:59 compute-0 sudo[154659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqckpynqndjkftksngqflgtfmtnyqste ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935399.2465534-1144-206486186962089/AnsiballZ_stat.py'
Dec 05 11:49:59 compute-0 sudo[154659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:49:59 compute-0 python3.9[154661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:49:59 compute-0 sudo[154659]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:00 compute-0 sudo[154737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxlumvnxivrlfdzjjrsjlrtonqxiwbwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935399.2465534-1144-206486186962089/AnsiballZ_file.py'
Dec 05 11:50:00 compute-0 sudo[154737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:00 compute-0 python3.9[154739]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:00 compute-0 sudo[154737]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:00 compute-0 sudo[154889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpegnienolbwrcflhvyppcuiqcaubgve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935400.5909498-1156-33053994151018/AnsiballZ_stat.py'
Dec 05 11:50:00 compute-0 sudo[154889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:01 compute-0 python3.9[154891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:01 compute-0 sudo[154889]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:01 compute-0 sudo[154967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjnbwryhdfgdotpjdkwwepbvxexacmjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935400.5909498-1156-33053994151018/AnsiballZ_file.py'
Dec 05 11:50:01 compute-0 sudo[154967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:01 compute-0 python3.9[154969]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zuvd0a23 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:01 compute-0 sudo[154967]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:02 compute-0 sudo[155119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruxankoldxxxtzmohcpgtenzsjebhrvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935401.8295598-1168-259557410289534/AnsiballZ_stat.py'
Dec 05 11:50:02 compute-0 sudo[155119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:02 compute-0 python3.9[155121]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:02 compute-0 sudo[155119]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:02 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 05 11:50:02 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 05 11:50:02 compute-0 sudo[155197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjujfoaednaudedofpjryykcokedreky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935401.8295598-1168-259557410289534/AnsiballZ_file.py'
Dec 05 11:50:02 compute-0 sudo[155197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:02 compute-0 python3.9[155199]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:02 compute-0 sudo[155197]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:50:02.992 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:50:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:50:02.993 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:50:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:50:02.993 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:50:03 compute-0 sudo[155349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxdrlkiyyjkwzkuzhgqziehobznabfee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935403.1065295-1181-14014051610735/AnsiballZ_command.py'
Dec 05 11:50:03 compute-0 sudo[155349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:03 compute-0 python3.9[155351]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:50:03 compute-0 sudo[155349]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:04 compute-0 sudo[155511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqkbihhapqoorsyrwsxbkcrjyuvvaht ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935403.872084-1189-61810644083462/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 11:50:04 compute-0 podman[155476]: 2025-12-05 11:50:04.393872397 +0000 UTC m=+0.059061578 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 11:50:04 compute-0 sudo[155511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:04 compute-0 python3[155520]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 11:50:04 compute-0 sudo[155511]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:05 compute-0 sudo[155670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plxaqzpzhbjrcytcliiqrwnmbajejwdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935404.8104732-1197-140185624926615/AnsiballZ_stat.py'
Dec 05 11:50:05 compute-0 sudo[155670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:05 compute-0 python3.9[155672]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:05 compute-0 sudo[155670]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:05 compute-0 sudo[155748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnbgrpaptdhqgfonmwcotqzbjjandglr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935404.8104732-1197-140185624926615/AnsiballZ_file.py'
Dec 05 11:50:05 compute-0 sudo[155748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:05 compute-0 python3.9[155750]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:05 compute-0 sudo[155748]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:06 compute-0 sudo[155900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swyetwthdyzglweqancnslkvgoizzvyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935406.1187215-1209-129285854688857/AnsiballZ_stat.py'
Dec 05 11:50:06 compute-0 sudo[155900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:06 compute-0 python3.9[155902]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:06 compute-0 sudo[155900]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:06 compute-0 sudo[155978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxivjvcswmgdcbaddsoysatyghqruilk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935406.1187215-1209-129285854688857/AnsiballZ_file.py'
Dec 05 11:50:06 compute-0 sudo[155978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:07 compute-0 python3.9[155980]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:07 compute-0 sudo[155978]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:07 compute-0 sudo[156130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hppxutjfizaowekpayaflpmgrnlcarud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935407.369359-1221-264207022277858/AnsiballZ_stat.py'
Dec 05 11:50:07 compute-0 sudo[156130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:07 compute-0 python3.9[156132]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:07 compute-0 sudo[156130]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:08 compute-0 sudo[156208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teuezxnmuenoxabnxcsxnrhqdpxkkeyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935407.369359-1221-264207022277858/AnsiballZ_file.py'
Dec 05 11:50:08 compute-0 sudo[156208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:08 compute-0 python3.9[156210]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:08 compute-0 sudo[156208]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:09 compute-0 sudo[156360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgkrhmredlirgmqlurtnjydpgybynsot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935408.7272549-1233-13997710501314/AnsiballZ_stat.py'
Dec 05 11:50:09 compute-0 sudo[156360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:09 compute-0 python3.9[156362]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:09 compute-0 sudo[156360]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:09 compute-0 sudo[156438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kepqsxzqkkhyvaqtjrryiqyjdilaqxul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935408.7272549-1233-13997710501314/AnsiballZ_file.py'
Dec 05 11:50:09 compute-0 sudo[156438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:09 compute-0 python3.9[156440]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:09 compute-0 sudo[156438]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:10 compute-0 sudo[156590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aijhduahkeugwyymybxdhdykwdzzkcmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935409.9812627-1245-171749698121015/AnsiballZ_stat.py'
Dec 05 11:50:10 compute-0 sudo[156590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:10 compute-0 python3.9[156592]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:10 compute-0 sudo[156590]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:10 compute-0 sudo[156715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntngvcjmlbfvzovmlfhtjrgjihmyulky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935409.9812627-1245-171749698121015/AnsiballZ_copy.py'
Dec 05 11:50:10 compute-0 sudo[156715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:11 compute-0 python3.9[156717]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935409.9812627-1245-171749698121015/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:11 compute-0 sudo[156715]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:11 compute-0 sudo[156879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbfctrzlustxaeakuztayqowxkxjsbnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935411.395108-1260-184717681635377/AnsiballZ_file.py'
Dec 05 11:50:11 compute-0 sudo[156879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:11 compute-0 podman[156841]: 2025-12-05 11:50:11.801459497 +0000 UTC m=+0.115112908 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 11:50:11 compute-0 python3.9[156886]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:11 compute-0 sudo[156879]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:12 compute-0 sudo[157045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbkqpcqlqzgltdnccgifckhtzzsofwpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935412.1503756-1268-71568290412665/AnsiballZ_command.py'
Dec 05 11:50:12 compute-0 sudo[157045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:12 compute-0 python3.9[157047]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:50:12 compute-0 sudo[157045]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:13 compute-0 sudo[157200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmqmpwyvknngkuxrokxaqiybpqmstphs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935412.985059-1276-94741697084422/AnsiballZ_blockinfile.py'
Dec 05 11:50:13 compute-0 sudo[157200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:13 compute-0 python3.9[157202]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:13 compute-0 sudo[157200]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:14 compute-0 sudo[157352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imvlhdrxxsyrluwezbteldxofnglskmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935414.1560218-1285-155836997922359/AnsiballZ_command.py'
Dec 05 11:50:14 compute-0 sudo[157352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:14 compute-0 python3.9[157354]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:50:14 compute-0 sudo[157352]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:15 compute-0 sudo[157505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkalqzjxmozudypzxzjbanxbxrwfzzub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935415.016178-1293-239993834875456/AnsiballZ_stat.py'
Dec 05 11:50:15 compute-0 sudo[157505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:15 compute-0 python3.9[157507]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:50:15 compute-0 sudo[157505]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:16 compute-0 sudo[157659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxukivxmurkuurfxtgrvttriktzfsejx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935415.8485966-1301-142876624986134/AnsiballZ_command.py'
Dec 05 11:50:16 compute-0 sudo[157659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:16 compute-0 python3.9[157661]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:50:16 compute-0 sudo[157659]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:17 compute-0 sudo[157814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjwhgpyzguyhtnfuhntczygwctjjjxbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935416.6551912-1309-91360292413347/AnsiballZ_file.py'
Dec 05 11:50:17 compute-0 sudo[157814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:17 compute-0 python3.9[157816]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:17 compute-0 sudo[157814]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:17 compute-0 sudo[157966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyuwasnxdlzfrtoyacpdzhpjmmjlftoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935417.4288177-1317-230268456682157/AnsiballZ_stat.py'
Dec 05 11:50:17 compute-0 sudo[157966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:18 compute-0 python3.9[157968]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:18 compute-0 sudo[157966]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:18 compute-0 sudo[158089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mymheapgvcuhfxnouejpfosmixllznvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935417.4288177-1317-230268456682157/AnsiballZ_copy.py'
Dec 05 11:50:18 compute-0 sudo[158089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:18 compute-0 python3.9[158091]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935417.4288177-1317-230268456682157/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:18 compute-0 sudo[158089]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:19 compute-0 sudo[158241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgpqqonadzfnzsfpmpsdxmfpnawhdddf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935418.8997276-1332-2597566670873/AnsiballZ_stat.py'
Dec 05 11:50:19 compute-0 sudo[158241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:19 compute-0 python3.9[158243]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:19 compute-0 sudo[158241]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:19 compute-0 sudo[158364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdgojffesywesqnlgmlucbngzidaxvaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935418.8997276-1332-2597566670873/AnsiballZ_copy.py'
Dec 05 11:50:19 compute-0 sudo[158364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:20 compute-0 python3.9[158366]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935418.8997276-1332-2597566670873/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:20 compute-0 sudo[158364]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:20 compute-0 sudo[158516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvgmevupiisemvznpoeusnzqqnckzrth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935420.2872055-1347-82813606919032/AnsiballZ_stat.py'
Dec 05 11:50:20 compute-0 sudo[158516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:20 compute-0 python3.9[158518]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:20 compute-0 sudo[158516]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:21 compute-0 sudo[158639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsbbyrjlkyosrnkdthljerqkmrihecou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935420.2872055-1347-82813606919032/AnsiballZ_copy.py'
Dec 05 11:50:21 compute-0 sudo[158639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:21 compute-0 python3.9[158641]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935420.2872055-1347-82813606919032/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:21 compute-0 sudo[158639]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:22 compute-0 sudo[158791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrdbguitcgbohujvnxcrwgnkfxzrrsql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935421.6623516-1362-214315474065225/AnsiballZ_systemd.py'
Dec 05 11:50:22 compute-0 sudo[158791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:22 compute-0 python3.9[158793]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:50:22 compute-0 systemd[1]: Reloading.
Dec 05 11:50:22 compute-0 systemd-sysv-generator[158823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:50:22 compute-0 systemd-rc-local-generator[158819]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:50:22 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Dec 05 11:50:22 compute-0 sudo[158791]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:23 compute-0 sudo[158981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfiwtsvcuikopsphncuawqabjmlxglff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935423.018638-1370-2779146709837/AnsiballZ_systemd.py'
Dec 05 11:50:23 compute-0 sudo[158981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:23 compute-0 python3.9[158983]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 05 11:50:23 compute-0 systemd[1]: Reloading.
Dec 05 11:50:23 compute-0 systemd-rc-local-generator[159007]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:50:23 compute-0 systemd-sysv-generator[159011]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:50:24 compute-0 systemd[1]: Reloading.
Dec 05 11:50:24 compute-0 systemd-rc-local-generator[159047]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:50:24 compute-0 systemd-sysv-generator[159051]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:50:24 compute-0 sudo[158981]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:24 compute-0 sshd-session[104598]: Connection closed by 192.168.122.30 port 48946
Dec 05 11:50:24 compute-0 sshd-session[104589]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:50:24 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Dec 05 11:50:24 compute-0 systemd[1]: session-22.scope: Consumed 3min 31.057s CPU time.
Dec 05 11:50:24 compute-0 systemd-logind[792]: Session 22 logged out. Waiting for processes to exit.
Dec 05 11:50:24 compute-0 systemd-logind[792]: Removed session 22.
Dec 05 11:50:30 compute-0 sshd-session[159081]: Accepted publickey for zuul from 192.168.122.30 port 33578 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:50:30 compute-0 systemd-logind[792]: New session 23 of user zuul.
Dec 05 11:50:30 compute-0 systemd[1]: Started Session 23 of User zuul.
Dec 05 11:50:30 compute-0 sshd-session[159081]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:50:31 compute-0 python3.9[159234]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:50:33 compute-0 python3.9[159388]: ansible-ansible.builtin.service_facts Invoked
Dec 05 11:50:33 compute-0 network[159405]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 11:50:33 compute-0 network[159406]: 'network-scripts' will be removed from distribution in near future.
Dec 05 11:50:33 compute-0 network[159407]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 11:50:34 compute-0 podman[159420]: 2025-12-05 11:50:34.554647339 +0000 UTC m=+0.076887720 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 05 11:50:37 compute-0 sudo[159695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngugqpyclbwtzgwdijradehxelnfvvyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935437.523349-47-62102804802584/AnsiballZ_setup.py'
Dec 05 11:50:37 compute-0 sudo[159695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:38 compute-0 python3.9[159697]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 11:50:38 compute-0 sudo[159695]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:38 compute-0 sudo[159779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcvgsytbfqljwwakpmbmengqiyaicbfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935437.523349-47-62102804802584/AnsiballZ_dnf.py'
Dec 05 11:50:38 compute-0 sudo[159779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:39 compute-0 python3.9[159781]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:50:42 compute-0 podman[159783]: 2025-12-05 11:50:42.303816617 +0000 UTC m=+0.142825970 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 11:50:44 compute-0 sudo[159779]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:44 compute-0 sudo[159958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydskfbepjmnghdcrdpgqxdekligqyiik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935444.448861-59-240032578161402/AnsiballZ_stat.py'
Dec 05 11:50:44 compute-0 sudo[159958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:45 compute-0 python3.9[159960]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:50:45 compute-0 sudo[159958]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:45 compute-0 sudo[160110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcuoedlumfnpphviudqxalvzodjsrsvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935445.4237845-69-34847918296842/AnsiballZ_command.py'
Dec 05 11:50:45 compute-0 sudo[160110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:46 compute-0 python3.9[160112]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:50:46 compute-0 sudo[160110]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:46 compute-0 sudo[160263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgxfmlwdxsrfcnocdrujlzchflscikrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935446.419062-79-45172961182968/AnsiballZ_stat.py'
Dec 05 11:50:46 compute-0 sudo[160263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:47 compute-0 python3.9[160265]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:50:47 compute-0 sudo[160263]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:47 compute-0 sudo[160415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-islxgfqbiqlsizylfulmkxjfxfvqixwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935447.180649-87-143243850281139/AnsiballZ_command.py'
Dec 05 11:50:47 compute-0 sudo[160415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:47 compute-0 python3.9[160417]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:50:47 compute-0 sudo[160415]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:48 compute-0 sudo[160568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edjzixibozxfkfflblnozwlasiemgqat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935447.8522537-95-8448953494723/AnsiballZ_stat.py'
Dec 05 11:50:48 compute-0 sudo[160568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:49 compute-0 python3.9[160570]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:50:49 compute-0 sudo[160568]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:49 compute-0 sudo[160691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzytlopeehhtdugzgzwflcgvpuughfjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935447.8522537-95-8448953494723/AnsiballZ_copy.py'
Dec 05 11:50:49 compute-0 sudo[160691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:49 compute-0 python3.9[160693]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935447.8522537-95-8448953494723/.source.iscsi _original_basename=.8bkcp8sd follow=False checksum=a829c6ed530b00b3536c4c41b581253018e4d1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:49 compute-0 sudo[160691]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:50 compute-0 sudo[160843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stgunekebwpnfjxkicqykvjdpaghdmvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935450.1394973-110-23124280725253/AnsiballZ_file.py'
Dec 05 11:50:50 compute-0 sudo[160843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:50 compute-0 python3.9[160845]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:50 compute-0 sudo[160843]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:51 compute-0 sudo[160995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asgrwkunrtrmdjqdxjjcyeewjjylcoxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935451.1080651-118-259390666230981/AnsiballZ_lineinfile.py'
Dec 05 11:50:51 compute-0 sudo[160995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:51 compute-0 python3.9[160997]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:50:51 compute-0 sudo[160995]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:51 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 11:50:51 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 11:50:52 compute-0 sudo[161148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjmkerqqcjxrmerppskixlrwjvreiuwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935452.0842757-127-25002374221967/AnsiballZ_systemd_service.py'
Dec 05 11:50:52 compute-0 sudo[161148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:53 compute-0 python3.9[161150]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:50:53 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 05 11:50:53 compute-0 sudo[161148]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:53 compute-0 sudo[161304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnqcfvlfmkjthpouoxbudyhzxwnjyuun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935453.385152-135-211280596816246/AnsiballZ_systemd_service.py'
Dec 05 11:50:53 compute-0 sudo[161304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:54 compute-0 python3.9[161306]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:50:54 compute-0 systemd[1]: Reloading.
Dec 05 11:50:54 compute-0 systemd-rc-local-generator[161333]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:50:54 compute-0 systemd-sysv-generator[161338]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:50:54 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 05 11:50:54 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 05 11:50:54 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Dec 05 11:50:54 compute-0 systemd[1]: Started Open-iSCSI.
Dec 05 11:50:54 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 05 11:50:54 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 05 11:50:54 compute-0 sudo[161304]: pam_unix(sudo:session): session closed for user root
Dec 05 11:50:55 compute-0 sudo[161505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikyazbhsbljlrmcrvycfpsfqvgmkmzvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935455.2511666-146-154414233775809/AnsiballZ_service_facts.py'
Dec 05 11:50:55 compute-0 sudo[161505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:50:55 compute-0 python3.9[161507]: ansible-ansible.builtin.service_facts Invoked
Dec 05 11:50:55 compute-0 network[161524]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 11:50:55 compute-0 network[161525]: 'network-scripts' will be removed from distribution in near future.
Dec 05 11:50:55 compute-0 network[161526]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 11:50:59 compute-0 sudo[161505]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:00 compute-0 sudo[161795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcqykphgbenezufqlibugnzvofprybya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935460.122639-156-112005987229845/AnsiballZ_file.py'
Dec 05 11:51:00 compute-0 sudo[161795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:00 compute-0 python3.9[161797]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 05 11:51:00 compute-0 sudo[161795]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:01 compute-0 sudo[161947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izfsdlcyvhelrfdownjhgpljqwqzkrce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935460.87475-164-204867240412527/AnsiballZ_modprobe.py'
Dec 05 11:51:01 compute-0 sudo[161947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:01 compute-0 python3.9[161949]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 05 11:51:01 compute-0 sudo[161947]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:02 compute-0 sudo[162103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onsrfummsqxyefggrcsixoryuhwtobsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935461.801045-172-148488548558277/AnsiballZ_stat.py'
Dec 05 11:51:02 compute-0 sudo[162103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:02 compute-0 python3.9[162105]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:02 compute-0 sudo[162103]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:02 compute-0 sudo[162226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptebpqspbtepiuzgtupsfciomrfhwgsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935461.801045-172-148488548558277/AnsiballZ_copy.py'
Dec 05 11:51:02 compute-0 sudo[162226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:02 compute-0 python3.9[162228]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935461.801045-172-148488548558277/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:02 compute-0 sudo[162226]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:51:02.994 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:51:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:51:02.996 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:51:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:51:02.996 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:51:03 compute-0 sudo[162378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lklfpqdyfmqebhmehngjpdxjefwylygp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935463.2450833-188-81960436931681/AnsiballZ_lineinfile.py'
Dec 05 11:51:03 compute-0 sudo[162378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:03 compute-0 python3.9[162380]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:03 compute-0 sudo[162378]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:04 compute-0 sudo[162543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlzznblarovfxvcdvfpynycrkowswuty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935464.0585597-196-188812158329513/AnsiballZ_systemd.py'
Dec 05 11:51:04 compute-0 sudo[162543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:04 compute-0 podman[162504]: 2025-12-05 11:51:04.965881486 +0000 UTC m=+0.071293502 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 11:51:05 compute-0 python3.9[162551]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:51:05 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 05 11:51:05 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 05 11:51:05 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 05 11:51:05 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 05 11:51:05 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 05 11:51:05 compute-0 sudo[162543]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:05 compute-0 sudo[162706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-issrbofndwnzeopatuqkrrziiwdwrxql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935465.5690145-204-245847514913027/AnsiballZ_file.py'
Dec 05 11:51:05 compute-0 sudo[162706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:06 compute-0 python3.9[162708]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:51:06 compute-0 sudo[162706]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:06 compute-0 sudo[162858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aexlzkvvexutlozhfvexeoxhgdfdswvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935466.3396664-213-224715375925206/AnsiballZ_stat.py'
Dec 05 11:51:06 compute-0 sudo[162858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:06 compute-0 python3.9[162860]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:51:06 compute-0 sudo[162858]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:07 compute-0 sudo[163010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knjyvldljcjjwkqcqsvjbooiufyjtpip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935467.02786-222-162767322345460/AnsiballZ_stat.py'
Dec 05 11:51:07 compute-0 sudo[163010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:07 compute-0 python3.9[163012]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:51:07 compute-0 sudo[163010]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:08 compute-0 sudo[163162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhxzaybfvkjtftkszmxtvsrbjgrcbpkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935467.7157578-230-117262560434775/AnsiballZ_stat.py'
Dec 05 11:51:08 compute-0 sudo[163162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:08 compute-0 python3.9[163164]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:08 compute-0 sudo[163162]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:08 compute-0 sudo[163285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shmmtfrfmdxhusewfvoyxyklgjnbexwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935467.7157578-230-117262560434775/AnsiballZ_copy.py'
Dec 05 11:51:08 compute-0 sudo[163285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:08 compute-0 python3.9[163287]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935467.7157578-230-117262560434775/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:08 compute-0 sudo[163285]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:09 compute-0 sudo[163437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egdfamqvbgbslyphczzvsoovgstccmjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935468.994326-245-144813155407013/AnsiballZ_command.py'
Dec 05 11:51:09 compute-0 sudo[163437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:09 compute-0 python3.9[163439]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:51:09 compute-0 sudo[163437]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:10 compute-0 sudo[163590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vokazwbijrfpkhbxkqzxbdyvphutvzgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935469.754057-253-249000835183190/AnsiballZ_lineinfile.py'
Dec 05 11:51:10 compute-0 sudo[163590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:10 compute-0 python3.9[163592]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:10 compute-0 sudo[163590]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:11 compute-0 sudo[163742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlpoaaqkqzhwjseximzdqzbwthfbuflr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935470.6364212-261-54955939599392/AnsiballZ_replace.py'
Dec 05 11:51:11 compute-0 sudo[163742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:11 compute-0 python3.9[163744]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:11 compute-0 sudo[163742]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:11 compute-0 sudo[163894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzeeulnkbvcjtdizxpyqhiagrytmjxyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935471.483705-269-212806769925725/AnsiballZ_replace.py'
Dec 05 11:51:11 compute-0 sudo[163894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:11 compute-0 python3.9[163896]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:11 compute-0 sudo[163894]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:12 compute-0 sudo[164063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-groqbkdtfkeztrirmlpbxxyvvruvovdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935472.189854-278-58785001778903/AnsiballZ_lineinfile.py'
Dec 05 11:51:12 compute-0 sudo[164063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:12 compute-0 podman[164020]: 2025-12-05 11:51:12.585751182 +0000 UTC m=+0.107043850 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 05 11:51:12 compute-0 python3.9[164068]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:12 compute-0 sudo[164063]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:13 compute-0 sudo[164224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqnpseoaexhtilxgjaivqnrrkuplbike ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935472.8906167-278-22680788380019/AnsiballZ_lineinfile.py'
Dec 05 11:51:13 compute-0 sudo[164224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:13 compute-0 python3.9[164226]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:13 compute-0 sudo[164224]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:13 compute-0 sudo[164376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeusarkyjpjwwbizozlpeqlujmnvgofb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935473.5528822-278-1122483988021/AnsiballZ_lineinfile.py'
Dec 05 11:51:13 compute-0 sudo[164376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:13 compute-0 python3.9[164378]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:14 compute-0 sudo[164376]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:14 compute-0 sudo[164528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abmwndrnugqlxthghklfghgmvejafjaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935474.1340094-278-249887445566241/AnsiballZ_lineinfile.py'
Dec 05 11:51:14 compute-0 sudo[164528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:14 compute-0 python3.9[164530]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:14 compute-0 sudo[164528]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:15 compute-0 sudo[164680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqwpjxesptecrkmqnzijgivwipbbfovh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935475.014827-307-122521505653768/AnsiballZ_stat.py'
Dec 05 11:51:15 compute-0 sudo[164680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:15 compute-0 python3.9[164682]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:51:15 compute-0 sudo[164680]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:16 compute-0 sudo[164836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnmuouczixqvquuzycttdthxkxnlgvsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935475.771264-315-164984216547608/AnsiballZ_file.py'
Dec 05 11:51:16 compute-0 sudo[164836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:16 compute-0 python3.9[164838]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:16 compute-0 sudo[164836]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:16 compute-0 sshd-session[164808]: Received disconnect from 193.46.255.159 port 55402:11:  [preauth]
Dec 05 11:51:16 compute-0 sshd-session[164808]: Disconnected from authenticating user root 193.46.255.159 port 55402 [preauth]
Dec 05 11:51:16 compute-0 sudo[164988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oudtnosowxolpchwrcgzivwvxhqvhcea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935476.6135318-324-25996333212746/AnsiballZ_file.py'
Dec 05 11:51:16 compute-0 sudo[164988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:17 compute-0 python3.9[164990]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:51:17 compute-0 sudo[164988]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:17 compute-0 sudo[165140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xchqbhxhtomttpucikooobnfopmxitkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935477.273655-332-142255363273193/AnsiballZ_stat.py'
Dec 05 11:51:17 compute-0 sudo[165140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:17 compute-0 python3.9[165142]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:17 compute-0 sudo[165140]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:18 compute-0 sudo[165218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znxegxnlwnfxnjopabocqpqoajuugjgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935477.273655-332-142255363273193/AnsiballZ_file.py'
Dec 05 11:51:18 compute-0 sudo[165218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:18 compute-0 python3.9[165220]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:51:18 compute-0 sudo[165218]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:18 compute-0 sudo[165370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxdpjbdojfaymjjfvutfyzmeoogezsbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935478.4053898-332-13994145941835/AnsiballZ_stat.py'
Dec 05 11:51:18 compute-0 sudo[165370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:18 compute-0 python3.9[165372]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:18 compute-0 sudo[165370]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:19 compute-0 sudo[165448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eowgqhafytzebpxjawdudsiqldwuyfez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935478.4053898-332-13994145941835/AnsiballZ_file.py'
Dec 05 11:51:19 compute-0 sudo[165448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:19 compute-0 python3.9[165450]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:51:19 compute-0 sudo[165448]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:19 compute-0 sudo[165600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzqsqcqsaxtnlfxmbahpzgkrmtwxjgih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935479.5231373-355-195629867585063/AnsiballZ_file.py'
Dec 05 11:51:19 compute-0 sudo[165600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:20 compute-0 python3.9[165602]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:20 compute-0 sudo[165600]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:20 compute-0 sudo[165752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcbwgnakivyrldglqnzmzevcccqatqmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935480.245846-363-141499985879510/AnsiballZ_stat.py'
Dec 05 11:51:20 compute-0 sudo[165752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:20 compute-0 python3.9[165754]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:20 compute-0 sudo[165752]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:21 compute-0 sudo[165830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blvxbcuatulwzsdmctbehubqidcjyglu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935480.245846-363-141499985879510/AnsiballZ_file.py'
Dec 05 11:51:21 compute-0 sudo[165830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:21 compute-0 python3.9[165832]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:21 compute-0 sudo[165830]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:21 compute-0 sudo[165982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yopvjipbbbmglicdfdfqxslzmypdanyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935481.4652882-375-17602162188254/AnsiballZ_stat.py'
Dec 05 11:51:21 compute-0 sudo[165982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:21 compute-0 python3.9[165984]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:21 compute-0 sudo[165982]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:22 compute-0 sudo[166060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdinavuzkglgmkkfncugxufphxvtjhhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935481.4652882-375-17602162188254/AnsiballZ_file.py'
Dec 05 11:51:22 compute-0 sudo[166060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:22 compute-0 python3.9[166062]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:22 compute-0 sudo[166060]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:22 compute-0 sudo[166212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnjizmxmhmkmgqtazajhsrjbyverawdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935482.6029556-387-279132051830154/AnsiballZ_systemd.py'
Dec 05 11:51:22 compute-0 sudo[166212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:23 compute-0 python3.9[166214]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:51:23 compute-0 systemd[1]: Reloading.
Dec 05 11:51:23 compute-0 systemd-rc-local-generator[166240]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:51:23 compute-0 systemd-sysv-generator[166245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:51:23 compute-0 sudo[166212]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:23 compute-0 sudo[166402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fesjmbxklrmxpbjinfbisrafvxvdvrqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935483.7060165-395-125878064901928/AnsiballZ_stat.py'
Dec 05 11:51:23 compute-0 sudo[166402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:24 compute-0 python3.9[166404]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:24 compute-0 sudo[166402]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:24 compute-0 sudo[166480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjfqbwvxjcuaaejgpejjfvsupvouaxhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935483.7060165-395-125878064901928/AnsiballZ_file.py'
Dec 05 11:51:24 compute-0 sudo[166480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:24 compute-0 python3.9[166482]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:24 compute-0 sudo[166480]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:25 compute-0 sudo[166632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doqwspoihgdecrccdqhwkmokucmdybqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935484.8203988-407-59517358974358/AnsiballZ_stat.py'
Dec 05 11:51:25 compute-0 sudo[166632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:25 compute-0 python3.9[166634]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:25 compute-0 sudo[166632]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:25 compute-0 sudo[166710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkebiysopawqbzngamuvecaldlrjszmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935484.8203988-407-59517358974358/AnsiballZ_file.py'
Dec 05 11:51:25 compute-0 sudo[166710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:25 compute-0 python3.9[166712]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:25 compute-0 sudo[166710]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:26 compute-0 sudo[166862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jspvqyptsbkmxkjbkcrrqhoetrhocyyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935485.9880753-419-262522091066907/AnsiballZ_systemd.py'
Dec 05 11:51:26 compute-0 sudo[166862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:26 compute-0 python3.9[166864]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:51:26 compute-0 systemd[1]: Reloading.
Dec 05 11:51:26 compute-0 systemd-sysv-generator[166893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:51:26 compute-0 systemd-rc-local-generator[166890]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:51:26 compute-0 systemd[1]: Starting Create netns directory...
Dec 05 11:51:26 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 11:51:26 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 11:51:26 compute-0 systemd[1]: Finished Create netns directory.
Dec 05 11:51:27 compute-0 sudo[166862]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:27 compute-0 sudo[167055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brppuuprwdryhixvussufdfvbpenxbon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935487.3181288-429-115115825568419/AnsiballZ_file.py'
Dec 05 11:51:27 compute-0 sudo[167055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:27 compute-0 python3.9[167057]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:51:27 compute-0 sudo[167055]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:28 compute-0 sudo[167207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjmoisdgbztujkghzlqqhzfscprakkal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935487.9978871-437-7665812514565/AnsiballZ_stat.py'
Dec 05 11:51:28 compute-0 sudo[167207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:28 compute-0 python3.9[167209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:28 compute-0 sudo[167207]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:28 compute-0 sudo[167330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aurzzrvgbsghwnllbqpkccxtkpkmfcsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935487.9978871-437-7665812514565/AnsiballZ_copy.py'
Dec 05 11:51:28 compute-0 sudo[167330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:29 compute-0 python3.9[167332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935487.9978871-437-7665812514565/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:51:29 compute-0 sudo[167330]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:29 compute-0 sudo[167482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cacjtuyvdxgsnsxuzrjfysquosmiyhsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935489.4659264-454-160928782048617/AnsiballZ_file.py'
Dec 05 11:51:29 compute-0 sudo[167482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:29 compute-0 python3.9[167484]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:51:29 compute-0 sudo[167482]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:30 compute-0 sudo[167634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrsoxfpcfxdadrpumexhyxavqnzcbudx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935490.22195-462-76224646519653/AnsiballZ_stat.py'
Dec 05 11:51:30 compute-0 sudo[167634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:30 compute-0 python3.9[167636]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:30 compute-0 sudo[167634]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:31 compute-0 sudo[167757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwevlojzryldimpnhnnrvpqtmhbhvipi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935490.22195-462-76224646519653/AnsiballZ_copy.py'
Dec 05 11:51:31 compute-0 sudo[167757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:31 compute-0 python3.9[167759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935490.22195-462-76224646519653/.source.json _original_basename=.2bjuzt9u follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:31 compute-0 sudo[167757]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:31 compute-0 sudo[167909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sywdhrnbyjdaijtzulqsoyawamlqblae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935491.573852-477-57421949427328/AnsiballZ_file.py'
Dec 05 11:51:31 compute-0 sudo[167909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:32 compute-0 python3.9[167911]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:32 compute-0 sudo[167909]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:32 compute-0 sudo[168061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geymvbfhetgnhyimhrvyatoreqhpyywn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935492.280691-485-49549898709731/AnsiballZ_stat.py'
Dec 05 11:51:32 compute-0 sudo[168061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:32 compute-0 sudo[168061]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:33 compute-0 sudo[168184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liwrkbrbdsfiwtwwgdzrpyezqpykhscg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935492.280691-485-49549898709731/AnsiballZ_copy.py'
Dec 05 11:51:33 compute-0 sudo[168184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:33 compute-0 sudo[168184]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:34 compute-0 sudo[168336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uszfgfpwippfydtfvrizwshdrcdecczf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935493.812633-502-132838981941629/AnsiballZ_container_config_data.py'
Dec 05 11:51:34 compute-0 sudo[168336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:34 compute-0 python3.9[168338]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 05 11:51:34 compute-0 sudo[168336]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:35 compute-0 sudo[168497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpfqoyyxtwvxjpsfczbocubygjzchrqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935494.6667602-511-47314192149466/AnsiballZ_container_config_hash.py'
Dec 05 11:51:35 compute-0 sudo[168497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:35 compute-0 podman[168462]: 2025-12-05 11:51:35.244671223 +0000 UTC m=+0.103098662 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 11:51:35 compute-0 python3.9[168505]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 11:51:35 compute-0 sudo[168497]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:36 compute-0 sudo[168658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sltammvhjqjbkujttfliuuvblpwxeehf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935495.69515-520-135178391717849/AnsiballZ_podman_container_info.py'
Dec 05 11:51:36 compute-0 sudo[168658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:36 compute-0 python3.9[168660]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 11:51:36 compute-0 sudo[168658]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:37 compute-0 sudo[168836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugrtbspmnzwbvxptknwyhekqpuuarxtf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935497.1359167-533-809920576068/AnsiballZ_edpm_container_manage.py'
Dec 05 11:51:37 compute-0 sudo[168836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:37 compute-0 python3[168838]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 11:51:38 compute-0 podman[168874]: 2025-12-05 11:51:38.015420378 +0000 UTC m=+0.020683975 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 05 11:51:38 compute-0 podman[168874]: 2025-12-05 11:51:38.112205533 +0000 UTC m=+0.117469150 container create 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 05 11:51:38 compute-0 python3[168838]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 05 11:51:38 compute-0 sudo[168836]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:38 compute-0 sudo[169062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqrdpgmwuimrmiulsekpeifsyvrqocac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935498.451666-541-264617260308181/AnsiballZ_stat.py'
Dec 05 11:51:38 compute-0 sudo[169062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:38 compute-0 python3.9[169064]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:51:38 compute-0 sudo[169062]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:39 compute-0 sudo[169216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkecsazmewglqydquxzisrgeaauvfmrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935499.301006-550-1754102153436/AnsiballZ_file.py'
Dec 05 11:51:39 compute-0 sudo[169216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:39 compute-0 python3.9[169218]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:39 compute-0 sudo[169216]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:40 compute-0 sudo[169292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khhbrtpibzssmuejojwgkqetjvmcnjmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935499.301006-550-1754102153436/AnsiballZ_stat.py'
Dec 05 11:51:40 compute-0 sudo[169292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:40 compute-0 python3.9[169294]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:51:40 compute-0 sudo[169292]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:40 compute-0 sudo[169443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfmhwzmpjwopvesecjbiqzchjnhdsrte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935500.363893-550-81140421315706/AnsiballZ_copy.py'
Dec 05 11:51:40 compute-0 sudo[169443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:41 compute-0 python3.9[169445]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935500.363893-550-81140421315706/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:41 compute-0 sudo[169443]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:41 compute-0 sudo[169519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwzzpifdfmjxnnxialvlngssnvspuctk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935500.363893-550-81140421315706/AnsiballZ_systemd.py'
Dec 05 11:51:41 compute-0 sudo[169519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:41 compute-0 python3.9[169521]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:51:41 compute-0 systemd[1]: Reloading.
Dec 05 11:51:41 compute-0 systemd-rc-local-generator[169548]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:51:41 compute-0 systemd-sysv-generator[169552]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:51:42 compute-0 sudo[169519]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:42 compute-0 sudo[169630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgfkvlmkwimnejcavkrnniefyswbsbhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935500.363893-550-81140421315706/AnsiballZ_systemd.py'
Dec 05 11:51:42 compute-0 sudo[169630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:42 compute-0 python3.9[169632]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:51:42 compute-0 systemd[1]: Reloading.
Dec 05 11:51:42 compute-0 systemd-rc-local-generator[169679]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:51:42 compute-0 systemd-sysv-generator[169683]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:51:42 compute-0 podman[169634]: 2025-12-05 11:51:42.95383642 +0000 UTC m=+0.142466687 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:51:43 compute-0 systemd[1]: Starting multipathd container...
Dec 05 11:51:43 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:51:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 11:51:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 11:51:43 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.
Dec 05 11:51:43 compute-0 podman[169698]: 2025-12-05 11:51:43.317072311 +0000 UTC m=+0.146495353 container init 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 11:51:43 compute-0 multipathd[169712]: + sudo -E kolla_set_configs
Dec 05 11:51:43 compute-0 sudo[169718]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 05 11:51:43 compute-0 podman[169698]: 2025-12-05 11:51:43.347680706 +0000 UTC m=+0.177103688 container start 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 11:51:43 compute-0 sudo[169718]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 11:51:43 compute-0 sudo[169718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 11:51:43 compute-0 podman[169698]: multipathd
Dec 05 11:51:43 compute-0 systemd[1]: Started multipathd container.
Dec 05 11:51:43 compute-0 sudo[169630]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:43 compute-0 multipathd[169712]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 11:51:43 compute-0 multipathd[169712]: INFO:__main__:Validating config file
Dec 05 11:51:43 compute-0 multipathd[169712]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 11:51:43 compute-0 multipathd[169712]: INFO:__main__:Writing out command to execute
Dec 05 11:51:43 compute-0 sudo[169718]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:43 compute-0 multipathd[169712]: ++ cat /run_command
Dec 05 11:51:43 compute-0 multipathd[169712]: + CMD='/usr/sbin/multipathd -d'
Dec 05 11:51:43 compute-0 multipathd[169712]: + ARGS=
Dec 05 11:51:43 compute-0 multipathd[169712]: + sudo kolla_copy_cacerts
Dec 05 11:51:43 compute-0 sudo[169741]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 05 11:51:43 compute-0 sudo[169741]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 11:51:43 compute-0 sudo[169741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 11:51:43 compute-0 sudo[169741]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:43 compute-0 multipathd[169712]: + [[ ! -n '' ]]
Dec 05 11:51:43 compute-0 multipathd[169712]: + . kolla_extend_start
Dec 05 11:51:43 compute-0 multipathd[169712]: Running command: '/usr/sbin/multipathd -d'
Dec 05 11:51:43 compute-0 multipathd[169712]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 05 11:51:43 compute-0 multipathd[169712]: + umask 0022
Dec 05 11:51:43 compute-0 multipathd[169712]: + exec /usr/sbin/multipathd -d
Dec 05 11:51:43 compute-0 podman[169719]: 2025-12-05 11:51:43.456572189 +0000 UTC m=+0.093840579 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec 05 11:51:43 compute-0 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-5fbcdf1bc53c9d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 11:51:43 compute-0 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-5fbcdf1bc53c9d85.service: Failed with result 'exit-code'.
Dec 05 11:51:43 compute-0 multipathd[169712]: 2884.091357 | --------start up--------
Dec 05 11:51:43 compute-0 multipathd[169712]: 2884.091385 | read /etc/multipath.conf
Dec 05 11:51:43 compute-0 multipathd[169712]: 2884.097980 | path checkers start up
Dec 05 11:51:46 compute-0 python3.9[169900]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:51:46 compute-0 sudo[170052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nujdebzidkfgrnllohxrdovhfhjhrkly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935506.2222433-586-8791840832805/AnsiballZ_command.py'
Dec 05 11:51:46 compute-0 sudo[170052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:46 compute-0 python3.9[170054]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:51:46 compute-0 sudo[170052]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:47 compute-0 sudo[170217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqfoqwpvasalcyhylzovhjylukbzkbvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935507.0568671-594-246669979208534/AnsiballZ_systemd.py'
Dec 05 11:51:47 compute-0 sudo[170217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:47 compute-0 python3.9[170219]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:51:47 compute-0 systemd[1]: Stopping multipathd container...
Dec 05 11:51:48 compute-0 multipathd[169712]: 2888.915758 | exit (signal)
Dec 05 11:51:48 compute-0 multipathd[169712]: 2888.915829 | --------shut down-------
Dec 05 11:51:48 compute-0 systemd[1]: libpod-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope: Deactivated successfully.
Dec 05 11:51:48 compute-0 podman[170223]: 2025-12-05 11:51:48.32687779 +0000 UTC m=+0.463266053 container died 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 11:51:48 compute-0 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-5fbcdf1bc53c9d85.timer: Deactivated successfully.
Dec 05 11:51:48 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.
Dec 05 11:51:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-userdata-shm.mount: Deactivated successfully.
Dec 05 11:51:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405-merged.mount: Deactivated successfully.
Dec 05 11:51:48 compute-0 podman[170223]: 2025-12-05 11:51:48.389805004 +0000 UTC m=+0.526193297 container cleanup 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:51:48 compute-0 podman[170223]: multipathd
Dec 05 11:51:48 compute-0 podman[170254]: multipathd
Dec 05 11:51:48 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 05 11:51:48 compute-0 systemd[1]: Stopped multipathd container.
Dec 05 11:51:48 compute-0 systemd[1]: Starting multipathd container...
Dec 05 11:51:48 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:51:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 11:51:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 11:51:48 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.
Dec 05 11:51:48 compute-0 podman[170267]: 2025-12-05 11:51:48.627645318 +0000 UTC m=+0.143277518 container init 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 11:51:48 compute-0 multipathd[170283]: + sudo -E kolla_set_configs
Dec 05 11:51:48 compute-0 sudo[170289]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 05 11:51:48 compute-0 sudo[170289]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 11:51:48 compute-0 podman[170267]: 2025-12-05 11:51:48.67259516 +0000 UTC m=+0.188227310 container start 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 11:51:48 compute-0 sudo[170289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 11:51:48 compute-0 podman[170267]: multipathd
Dec 05 11:51:48 compute-0 systemd[1]: Started multipathd container.
Dec 05 11:51:48 compute-0 sudo[170217]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:48 compute-0 multipathd[170283]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 11:51:48 compute-0 multipathd[170283]: INFO:__main__:Validating config file
Dec 05 11:51:48 compute-0 multipathd[170283]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 11:51:48 compute-0 multipathd[170283]: INFO:__main__:Writing out command to execute
Dec 05 11:51:48 compute-0 sudo[170289]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:48 compute-0 multipathd[170283]: ++ cat /run_command
Dec 05 11:51:48 compute-0 multipathd[170283]: + CMD='/usr/sbin/multipathd -d'
Dec 05 11:51:48 compute-0 multipathd[170283]: + ARGS=
Dec 05 11:51:48 compute-0 multipathd[170283]: + sudo kolla_copy_cacerts
Dec 05 11:51:48 compute-0 sudo[170307]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 05 11:51:48 compute-0 sudo[170307]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 11:51:48 compute-0 sudo[170307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 11:51:48 compute-0 podman[170290]: 2025-12-05 11:51:48.779553163 +0000 UTC m=+0.085442468 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 11:51:48 compute-0 sudo[170307]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:48 compute-0 multipathd[170283]: + [[ ! -n '' ]]
Dec 05 11:51:48 compute-0 multipathd[170283]: + . kolla_extend_start
Dec 05 11:51:48 compute-0 multipathd[170283]: Running command: '/usr/sbin/multipathd -d'
Dec 05 11:51:48 compute-0 multipathd[170283]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 05 11:51:48 compute-0 multipathd[170283]: + umask 0022
Dec 05 11:51:48 compute-0 multipathd[170283]: + exec /usr/sbin/multipathd -d
Dec 05 11:51:48 compute-0 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-3b598ae60d2c9b07.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 11:51:48 compute-0 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-3b598ae60d2c9b07.service: Failed with result 'exit-code'.
Dec 05 11:51:48 compute-0 multipathd[170283]: 2889.422783 | --------start up--------
Dec 05 11:51:48 compute-0 multipathd[170283]: 2889.422800 | read /etc/multipath.conf
Dec 05 11:51:48 compute-0 multipathd[170283]: 2889.429575 | path checkers start up
Dec 05 11:51:49 compute-0 sudo[170469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szhqhgiihlgznpjlzsxpwpscbwfgspkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935508.9308913-602-204511328208624/AnsiballZ_file.py'
Dec 05 11:51:49 compute-0 sudo[170469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:49 compute-0 python3.9[170471]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:49 compute-0 sudo[170469]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:50 compute-0 sudo[170622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enotmlughmgsflicesualapsabnvougm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935509.9665875-614-3336966472646/AnsiballZ_file.py'
Dec 05 11:51:50 compute-0 sudo[170622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:50 compute-0 python3.9[170624]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 05 11:51:50 compute-0 sudo[170622]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:50 compute-0 sudo[170774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmejtvootbemcwrtcnqaptanatvlvprk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935510.6687257-622-127374323655434/AnsiballZ_modprobe.py'
Dec 05 11:51:50 compute-0 sudo[170774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:51 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 05 11:51:51 compute-0 python3.9[170776]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 05 11:51:51 compute-0 kernel: Key type psk registered
Dec 05 11:51:51 compute-0 sudo[170774]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:51 compute-0 sudo[170939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnrufzqpoqiaqstauircalatqkxsdxcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935511.5218735-630-94802856244009/AnsiballZ_stat.py'
Dec 05 11:51:51 compute-0 sudo[170939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:52 compute-0 python3.9[170941]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:51:52 compute-0 sudo[170939]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:52 compute-0 sudo[171062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxuxbukvfxyqxgbedfuacqxcgomxrngf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935511.5218735-630-94802856244009/AnsiballZ_copy.py'
Dec 05 11:51:52 compute-0 sudo[171062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:52 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 05 11:51:52 compute-0 python3.9[171064]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935511.5218735-630-94802856244009/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:52 compute-0 sudo[171062]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:53 compute-0 sudo[171215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwsnamdxtcjfnapldjopeosqsldoxasj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935512.7549841-646-152626622489043/AnsiballZ_lineinfile.py'
Dec 05 11:51:53 compute-0 sudo[171215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:53 compute-0 python3.9[171217]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:51:53 compute-0 sudo[171215]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:53 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 05 11:51:53 compute-0 sudo[171368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoderepuedxhwkeyxszfbabewxnjwppg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935513.410776-654-119314959107365/AnsiballZ_systemd.py'
Dec 05 11:51:53 compute-0 sudo[171368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:54 compute-0 python3.9[171370]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:51:54 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 05 11:51:54 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 05 11:51:54 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 05 11:51:54 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 05 11:51:54 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 05 11:51:54 compute-0 sudo[171368]: pam_unix(sudo:session): session closed for user root
Dec 05 11:51:54 compute-0 sudo[171524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbhsfwuwwmurimfmgjjmcjbcwipapaqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935514.3675394-662-163530106796231/AnsiballZ_dnf.py'
Dec 05 11:51:54 compute-0 sudo[171524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:51:54 compute-0 python3.9[171526]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 11:51:55 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 05 11:51:57 compute-0 systemd[1]: Reloading.
Dec 05 11:51:58 compute-0 systemd-sysv-generator[171557]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:51:58 compute-0 systemd-rc-local-generator[171551]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:51:58 compute-0 systemd[1]: Reloading.
Dec 05 11:51:58 compute-0 systemd-rc-local-generator[171595]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:51:58 compute-0 systemd-sysv-generator[171599]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:51:58 compute-0 systemd-logind[792]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 05 11:51:58 compute-0 systemd-logind[792]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 05 11:51:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 11:51:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 11:51:59 compute-0 systemd[1]: Reloading.
Dec 05 11:51:59 compute-0 systemd-sysv-generator[171692]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:51:59 compute-0 systemd-rc-local-generator[171687]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:51:59 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 11:52:00 compute-0 sudo[171524]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:01 compute-0 sudo[172911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izqegspjwoyvqcrpnchcfymqxhaightz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935521.049954-670-76399379576057/AnsiballZ_systemd_service.py'
Dec 05 11:52:01 compute-0 sudo[172911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:01 compute-0 python3.9[172944]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:52:01 compute-0 systemd[1]: Stopping Open-iSCSI...
Dec 05 11:52:01 compute-0 iscsid[161346]: iscsid shutting down.
Dec 05 11:52:01 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Dec 05 11:52:01 compute-0 systemd[1]: Stopped Open-iSCSI.
Dec 05 11:52:01 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 05 11:52:01 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 05 11:52:01 compute-0 systemd[1]: Started Open-iSCSI.
Dec 05 11:52:01 compute-0 sudo[172911]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:01 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 11:52:01 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 11:52:01 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.778s CPU time.
Dec 05 11:52:01 compute-0 systemd[1]: run-rd075b8d777104360a5fd69052d17ebe9.service: Deactivated successfully.
Dec 05 11:52:02 compute-0 python3.9[173134]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:52:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:52:02.996 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:52:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:52:02.997 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:52:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:52:02.998 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:52:03 compute-0 sudo[173288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfnhhmcwxtpsmswvzbbrwzjeajxmlzep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935523.0424674-688-108526693775685/AnsiballZ_file.py'
Dec 05 11:52:03 compute-0 sudo[173288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:03 compute-0 python3.9[173290]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:03 compute-0 sudo[173288]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:04 compute-0 sudo[173440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltdentbpeahnriivnjvjfiwzejgbolyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935523.9106092-699-652473366021/AnsiballZ_systemd_service.py'
Dec 05 11:52:04 compute-0 sudo[173440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:04 compute-0 python3.9[173442]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:52:04 compute-0 systemd[1]: Reloading.
Dec 05 11:52:04 compute-0 systemd-rc-local-generator[173472]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:52:04 compute-0 systemd-sysv-generator[173475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:52:04 compute-0 sudo[173440]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:05 compute-0 python3.9[173629]: ansible-ansible.builtin.service_facts Invoked
Dec 05 11:52:05 compute-0 network[173646]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 11:52:05 compute-0 network[173647]: 'network-scripts' will be removed from distribution in near future.
Dec 05 11:52:05 compute-0 network[173648]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 11:52:05 compute-0 podman[173653]: 2025-12-05 11:52:05.712415288 +0000 UTC m=+0.086091535 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 05 11:52:10 compute-0 sudo[173937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txktqohfnmvvtjgtkgjbjooyrtuofyxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935530.245424-718-250956570887110/AnsiballZ_systemd_service.py'
Dec 05 11:52:10 compute-0 sudo[173937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:10 compute-0 python3.9[173939]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:52:10 compute-0 sudo[173937]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:11 compute-0 sudo[174090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eraicprlvamfzgxzofhbjbxjmdvnhviw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935530.9982584-718-178791332024464/AnsiballZ_systemd_service.py'
Dec 05 11:52:11 compute-0 sudo[174090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:11 compute-0 python3.9[174092]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:52:11 compute-0 sudo[174090]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:12 compute-0 sudo[174243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsalcklooqeivvkzxddhotlsubehlojf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935531.7721457-718-167680164767000/AnsiballZ_systemd_service.py'
Dec 05 11:52:12 compute-0 sudo[174243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:12 compute-0 python3.9[174245]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:52:12 compute-0 sudo[174243]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:12 compute-0 sudo[174396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbudzifsojodicifdgeikwjaqfoeosio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935532.5072756-718-58140980888377/AnsiballZ_systemd_service.py'
Dec 05 11:52:12 compute-0 sudo[174396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:13 compute-0 python3.9[174398]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:52:13 compute-0 sudo[174396]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:13 compute-0 podman[174399]: 2025-12-05 11:52:13.250723002 +0000 UTC m=+0.100372280 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 11:52:13 compute-0 sudo[174575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lasdsuecsoqvktyriotzrswuihtwkkuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935533.354651-718-9909641456059/AnsiballZ_systemd_service.py'
Dec 05 11:52:13 compute-0 sudo[174575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:13 compute-0 python3.9[174577]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:52:13 compute-0 sudo[174575]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:14 compute-0 sudo[174728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsaswnrbqkamickufivzcmmealojznlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935534.0985212-718-252688644435144/AnsiballZ_systemd_service.py'
Dec 05 11:52:14 compute-0 sudo[174728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:14 compute-0 python3.9[174730]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:52:14 compute-0 sudo[174728]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:15 compute-0 sudo[174881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywinfmyjvfoskbeedktibfmmzwzvtgkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935534.9606934-718-156263515075616/AnsiballZ_systemd_service.py'
Dec 05 11:52:15 compute-0 sudo[174881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:15 compute-0 python3.9[174883]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:52:15 compute-0 sudo[174881]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:16 compute-0 sudo[175034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkqwdquchxdejjbyzeadnvlyosbklzbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935535.7375548-718-64386133699956/AnsiballZ_systemd_service.py'
Dec 05 11:52:16 compute-0 sudo[175034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:16 compute-0 python3.9[175036]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:52:16 compute-0 sudo[175034]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:17 compute-0 sudo[175187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlyjqcyebyjyuzstbvckshrgcvybbmis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935536.8060784-777-205802620415838/AnsiballZ_file.py'
Dec 05 11:52:17 compute-0 sudo[175187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:17 compute-0 python3.9[175189]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:17 compute-0 sudo[175187]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:17 compute-0 sudo[175339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbbkazoaiecontonuwwgforhjiejsvzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935537.4370744-777-222802348487851/AnsiballZ_file.py'
Dec 05 11:52:17 compute-0 sudo[175339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:17 compute-0 python3.9[175341]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:17 compute-0 sudo[175339]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:18 compute-0 sudo[175491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eamoiyisxgeibwoslwrgckkkjrrigpcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935538.0116224-777-47100664377434/AnsiballZ_file.py'
Dec 05 11:52:18 compute-0 sudo[175491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:18 compute-0 python3.9[175493]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:18 compute-0 sudo[175491]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:18 compute-0 sudo[175660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edmytnzyoypdwheiinvvokcswiobmyxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935538.6219592-777-270899837052617/AnsiballZ_file.py'
Dec 05 11:52:18 compute-0 sudo[175660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:18 compute-0 podman[175617]: 2025-12-05 11:52:18.895822725 +0000 UTC m=+0.054313869 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 05 11:52:19 compute-0 python3.9[175665]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:19 compute-0 sudo[175660]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:19 compute-0 sudo[175815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ochvyjemrzrltmdtkwvxkzecuhsaoiri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935539.2356668-777-141998185664109/AnsiballZ_file.py'
Dec 05 11:52:19 compute-0 sudo[175815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:19 compute-0 python3.9[175817]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:19 compute-0 sudo[175815]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:20 compute-0 sudo[175967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dofxpofvyavnrktjlllewpchvrwkidkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935539.8841205-777-254397489704392/AnsiballZ_file.py'
Dec 05 11:52:20 compute-0 sudo[175967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:20 compute-0 python3.9[175969]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:20 compute-0 sudo[175967]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:20 compute-0 sudo[176119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejkrodmixarpstwsaujogkjydgcztznq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935540.5690074-777-259458650927001/AnsiballZ_file.py'
Dec 05 11:52:20 compute-0 sudo[176119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:21 compute-0 python3.9[176121]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:21 compute-0 sudo[176119]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:21 compute-0 sudo[176271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxwwqbhbologbsbzwvhvsnzgfgxxuzwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935541.263529-777-179668364844229/AnsiballZ_file.py'
Dec 05 11:52:21 compute-0 sudo[176271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:21 compute-0 python3.9[176273]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:21 compute-0 sudo[176271]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:22 compute-0 sudo[176423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efcqaofmialnikxqhgfokfwtzrazasgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935541.915669-834-47016836960158/AnsiballZ_file.py'
Dec 05 11:52:22 compute-0 sudo[176423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:22 compute-0 python3.9[176425]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:22 compute-0 sudo[176423]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:22 compute-0 sudo[176575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivqleedusguqljwokqpidgilocvmqbms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935542.648289-834-42253574953600/AnsiballZ_file.py'
Dec 05 11:52:22 compute-0 sudo[176575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:23 compute-0 python3.9[176577]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:23 compute-0 sudo[176575]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:23 compute-0 sudo[176727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poricedguaumejvpbuhnfmsheipvviog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935543.3717823-834-190668090128386/AnsiballZ_file.py'
Dec 05 11:52:23 compute-0 sudo[176727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:23 compute-0 python3.9[176729]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:23 compute-0 sudo[176727]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:24 compute-0 sudo[176879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkmkfactrbstghkaocqaukqrjegxczsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935544.0233817-834-249493017204961/AnsiballZ_file.py'
Dec 05 11:52:24 compute-0 sudo[176879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:24 compute-0 python3.9[176881]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:24 compute-0 sudo[176879]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:24 compute-0 sudo[177031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdtrjyfrnhqrwijismfbchxcjcrpndjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935544.655377-834-203045751708780/AnsiballZ_file.py'
Dec 05 11:52:24 compute-0 sudo[177031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:25 compute-0 python3.9[177033]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:25 compute-0 sudo[177031]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:25 compute-0 sudo[177183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqvmsslcwaapffhscprdmmiptruvphuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935545.2869413-834-199144601627969/AnsiballZ_file.py'
Dec 05 11:52:25 compute-0 sudo[177183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:25 compute-0 python3.9[177185]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:25 compute-0 sudo[177183]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:26 compute-0 sudo[177335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cixjeebyhitkdorrsrxmmvmodbypnyjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935545.9505486-834-262339276736299/AnsiballZ_file.py'
Dec 05 11:52:26 compute-0 sudo[177335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:26 compute-0 python3.9[177337]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:26 compute-0 sudo[177335]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:26 compute-0 sudo[177487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-posduvtnsxknqrmqgyydifbsaeyxjlfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935546.7630138-834-33652033652317/AnsiballZ_file.py'
Dec 05 11:52:26 compute-0 sudo[177487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:27 compute-0 python3.9[177489]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:52:27 compute-0 sudo[177487]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:27 compute-0 sudo[177639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdhunxzqdqwdpziciogftvtzhzctvtth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935547.43789-892-16303244314043/AnsiballZ_command.py'
Dec 05 11:52:27 compute-0 sudo[177639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:27 compute-0 python3.9[177641]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:52:27 compute-0 sudo[177639]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:28 compute-0 python3.9[177793]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 11:52:29 compute-0 sudo[177943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qopvwfsjsisxzwudjzsvtlgzndxlvbvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935549.0721745-910-165239778345339/AnsiballZ_systemd_service.py'
Dec 05 11:52:29 compute-0 sudo[177943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:29 compute-0 python3.9[177945]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:52:29 compute-0 systemd[1]: Reloading.
Dec 05 11:52:29 compute-0 systemd-sysv-generator[177976]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:52:29 compute-0 systemd-rc-local-generator[177972]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:52:30 compute-0 sudo[177943]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:30 compute-0 sudo[178130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtbqhtsrlvsbgfcujcskurwteczausdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935550.4106057-918-225370421549464/AnsiballZ_command.py'
Dec 05 11:52:30 compute-0 sudo[178130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:30 compute-0 python3.9[178132]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:52:30 compute-0 sudo[178130]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:31 compute-0 sudo[178283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygosmhootgtpawuyzktfrhbtjkuhhdxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935551.1152673-918-278292012968453/AnsiballZ_command.py'
Dec 05 11:52:31 compute-0 sudo[178283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:31 compute-0 python3.9[178285]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:52:31 compute-0 sudo[178283]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:32 compute-0 sudo[178436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajevfwnaapvalhsbafkfhemqansdrnbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935551.8119519-918-266887124195647/AnsiballZ_command.py'
Dec 05 11:52:32 compute-0 sudo[178436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:32 compute-0 python3.9[178438]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:52:32 compute-0 sudo[178436]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:32 compute-0 sudo[178589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyqlnfhvjratupwfqenzfrqbeyetklsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935552.4530582-918-19976414134568/AnsiballZ_command.py'
Dec 05 11:52:32 compute-0 sudo[178589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:32 compute-0 python3.9[178591]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:52:32 compute-0 sudo[178589]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:33 compute-0 sudo[178742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahstpbqerssiumvjspolebmksxdpueuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935553.1111186-918-102657583743654/AnsiballZ_command.py'
Dec 05 11:52:33 compute-0 sudo[178742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:33 compute-0 python3.9[178744]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:52:33 compute-0 sudo[178742]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:33 compute-0 sudo[178895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aljjndfbwqvlkmtrwhpspjhvcplejtjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935553.7085063-918-141189397511050/AnsiballZ_command.py'
Dec 05 11:52:33 compute-0 sudo[178895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:34 compute-0 python3.9[178897]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:52:34 compute-0 sudo[178895]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:34 compute-0 sudo[179048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uozvjgyhmslsexscqqeullwcbiqjwhyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935554.265174-918-112751414439767/AnsiballZ_command.py'
Dec 05 11:52:34 compute-0 sudo[179048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:34 compute-0 python3.9[179050]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:52:34 compute-0 sudo[179048]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:35 compute-0 sudo[179201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sregojkwtlblrempsmolmihijuqbmvja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935555.0652633-918-171913750057342/AnsiballZ_command.py'
Dec 05 11:52:35 compute-0 sudo[179201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:35 compute-0 python3.9[179203]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:52:35 compute-0 sudo[179201]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:36 compute-0 podman[179328]: 2025-12-05 11:52:36.720309051 +0000 UTC m=+0.063215673 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 05 11:52:36 compute-0 sudo[179370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbepnpmvtefjplqkdzfjfdphewuxccgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935556.3797886-997-163870230563615/AnsiballZ_file.py'
Dec 05 11:52:36 compute-0 sudo[179370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:36 compute-0 python3.9[179374]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:36 compute-0 sudo[179370]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:37 compute-0 sudo[179524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjvqkjxciiaivexhmmigrxrbgwiordzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935557.0764246-997-3178821210182/AnsiballZ_file.py'
Dec 05 11:52:37 compute-0 sudo[179524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:37 compute-0 python3.9[179526]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:37 compute-0 sudo[179524]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:38 compute-0 sudo[179676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeuentrvcqpsznclpesrgsnyynynahmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935557.763314-997-205564042496043/AnsiballZ_file.py'
Dec 05 11:52:38 compute-0 sudo[179676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:38 compute-0 python3.9[179678]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:38 compute-0 sudo[179676]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:38 compute-0 sudo[179828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlpejfmabipnxgckyldelobqgwlvwsjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935558.4608238-1019-32181048662672/AnsiballZ_file.py'
Dec 05 11:52:38 compute-0 sudo[179828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:38 compute-0 python3.9[179830]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:39 compute-0 sudo[179828]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:39 compute-0 sudo[179980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syztqsxylufjztwqjcinrmlokznnavsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935559.1530874-1019-225616617529194/AnsiballZ_file.py'
Dec 05 11:52:39 compute-0 sudo[179980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:39 compute-0 python3.9[179982]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:39 compute-0 sudo[179980]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:40 compute-0 sudo[180132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prmoiercmbpsryjygapistxzkxkbdpcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935559.767848-1019-168857262803296/AnsiballZ_file.py'
Dec 05 11:52:40 compute-0 sudo[180132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:40 compute-0 python3.9[180134]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:40 compute-0 sudo[180132]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:40 compute-0 sudo[180284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgtmgwzwqrisqikupkawhvfubutejgzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935560.399751-1019-50055543537614/AnsiballZ_file.py'
Dec 05 11:52:40 compute-0 sudo[180284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:40 compute-0 python3.9[180286]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:40 compute-0 sudo[180284]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:41 compute-0 sudo[180436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqxfggayfotqshkszqodrjtfnrzrkkcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935561.084745-1019-168520486985490/AnsiballZ_file.py'
Dec 05 11:52:41 compute-0 sudo[180436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:41 compute-0 python3.9[180438]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:41 compute-0 sudo[180436]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:42 compute-0 sudo[180588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsousqfvtjxdleermqjvnsufmfgiozgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935561.7485938-1019-273694072044009/AnsiballZ_file.py'
Dec 05 11:52:42 compute-0 sudo[180588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:42 compute-0 python3.9[180590]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:42 compute-0 sudo[180588]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:42 compute-0 sudo[180740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqaphfbbuxqcssxgjxaogrfsfkvpgsgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935562.4012365-1019-12920735971585/AnsiballZ_file.py'
Dec 05 11:52:42 compute-0 sudo[180740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:42 compute-0 python3.9[180742]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:42 compute-0 sudo[180740]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:44 compute-0 podman[180767]: 2025-12-05 11:52:44.251048544 +0000 UTC m=+0.098235752 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 11:52:47 compute-0 sudo[180918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhheqcccpvyglfhwqvyasbtzzahdcddk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935567.0851173-1188-239979345019463/AnsiballZ_getent.py'
Dec 05 11:52:47 compute-0 sudo[180918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:47 compute-0 python3.9[180920]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 05 11:52:47 compute-0 sudo[180918]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:48 compute-0 sudo[181071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqomhlrtrmaqskwnxwwrgeogfhlqnogz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935567.942032-1196-34128513726523/AnsiballZ_group.py'
Dec 05 11:52:48 compute-0 sudo[181071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:48 compute-0 python3.9[181073]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 11:52:49 compute-0 podman[181075]: 2025-12-05 11:52:49.250951353 +0000 UTC m=+0.091456017 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 11:52:49 compute-0 groupadd[181074]: group added to /etc/group: name=nova, GID=42436
Dec 05 11:52:49 compute-0 groupadd[181074]: group added to /etc/gshadow: name=nova
Dec 05 11:52:50 compute-0 groupadd[181074]: new group: name=nova, GID=42436
Dec 05 11:52:50 compute-0 sudo[181071]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:50 compute-0 sudo[181248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uskkplyjnratfmqpivscljagqssamuew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935570.2227316-1204-216263898678467/AnsiballZ_user.py'
Dec 05 11:52:50 compute-0 sudo[181248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:52:50 compute-0 python3.9[181250]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 11:52:51 compute-0 useradd[181252]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 05 11:52:51 compute-0 useradd[181252]: add 'nova' to group 'libvirt'
Dec 05 11:52:51 compute-0 useradd[181252]: add 'nova' to shadow group 'libvirt'
Dec 05 11:52:51 compute-0 sudo[181248]: pam_unix(sudo:session): session closed for user root
Dec 05 11:52:52 compute-0 sshd-session[181283]: Accepted publickey for zuul from 192.168.122.30 port 51878 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:52:52 compute-0 systemd-logind[792]: New session 24 of user zuul.
Dec 05 11:52:52 compute-0 systemd[1]: Started Session 24 of User zuul.
Dec 05 11:52:52 compute-0 sshd-session[181283]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:52:52 compute-0 sshd-session[181286]: Received disconnect from 192.168.122.30 port 51878:11: disconnected by user
Dec 05 11:52:52 compute-0 sshd-session[181286]: Disconnected from user zuul 192.168.122.30 port 51878
Dec 05 11:52:52 compute-0 sshd-session[181283]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:52:52 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Dec 05 11:52:52 compute-0 systemd-logind[792]: Session 24 logged out. Waiting for processes to exit.
Dec 05 11:52:52 compute-0 systemd-logind[792]: Removed session 24.
Dec 05 11:52:53 compute-0 python3.9[181436]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:52:54 compute-0 python3.9[181557]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935572.9950733-1229-7226660787892/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:54 compute-0 python3.9[181707]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:52:55 compute-0 python3.9[181783]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:55 compute-0 python3.9[181933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:52:56 compute-0 python3.9[182054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935575.1992695-1229-53174794787259/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:56 compute-0 python3.9[182204]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:52:57 compute-0 python3.9[182325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935576.3250377-1229-103211430038952/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:58 compute-0 python3.9[182475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:52:58 compute-0 python3.9[182596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935577.4364803-1229-122980577008084/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:52:59 compute-0 python3.9[182746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:52:59 compute-0 python3.9[182867]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935578.7208652-1229-260771065739552/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:53:00 compute-0 sudo[183017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftsiakfnxqfbaezmencfpewhjdekxooa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935579.8970375-1312-75684194196372/AnsiballZ_file.py'
Dec 05 11:53:00 compute-0 sudo[183017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:00 compute-0 python3.9[183019]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:53:00 compute-0 sudo[183017]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:00 compute-0 sudo[183169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnwfcjsenwmdawevkjnclcgagxyaowse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935580.4969773-1320-224830969564587/AnsiballZ_copy.py'
Dec 05 11:53:00 compute-0 sudo[183169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:00 compute-0 python3.9[183171]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:53:00 compute-0 sudo[183169]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:01 compute-0 sudo[183321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvvzoymhchgmmxirqiksyubldmgsnrgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935581.1134198-1328-67795812379893/AnsiballZ_stat.py'
Dec 05 11:53:01 compute-0 sudo[183321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:01 compute-0 python3.9[183323]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:53:01 compute-0 sudo[183321]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:02 compute-0 sudo[183473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inkwiehyrhmposoncbxyxvhuhjtvfxee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935581.804586-1336-249519458313554/AnsiballZ_stat.py'
Dec 05 11:53:02 compute-0 sudo[183473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:02 compute-0 python3.9[183475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:53:02 compute-0 sudo[183473]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:02 compute-0 sudo[183596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkuuaxdoqgjfrfnvrfkmcbnhitclkfmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935581.804586-1336-249519458313554/AnsiballZ_copy.py'
Dec 05 11:53:02 compute-0 sudo[183596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:53:02.998 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:53:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:53:02.999 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:53:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:53:02.999 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:53:03 compute-0 python3.9[183598]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764935581.804586-1336-249519458313554/.source _original_basename=.layqsaty follow=False checksum=f14ce4c1d82487c4ae1e4905f59d714f2109e75f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 05 11:53:03 compute-0 sudo[183596]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:03 compute-0 python3.9[183750]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:53:04 compute-0 python3.9[183902]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:53:05 compute-0 python3.9[184023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935584.0078566-1362-184769739173268/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:53:05 compute-0 python3.9[184173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:53:06 compute-0 python3.9[184294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935585.2016935-1377-32762689516596/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:53:06 compute-0 sudo[184450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogoikbopbmeynxpthdpprwxkzptdghks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935586.6408916-1394-94935364868914/AnsiballZ_container_config_data.py'
Dec 05 11:53:06 compute-0 sudo[184450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:06 compute-0 podman[184418]: 2025-12-05 11:53:06.943877587 +0000 UTC m=+0.068314259 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 05 11:53:07 compute-0 python3.9[184459]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 05 11:53:07 compute-0 sudo[184450]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:07 compute-0 sudo[184615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywudnodnskjzdqvhvtmwukbeazgotlgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935587.3512828-1403-179918825217462/AnsiballZ_container_config_hash.py'
Dec 05 11:53:07 compute-0 sudo[184615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:07 compute-0 python3.9[184617]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 11:53:07 compute-0 sudo[184615]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:08 compute-0 sudo[184767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upzvhejumcbujdlfkbvdcviwvtmzsqoz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935588.2367806-1413-34553857294941/AnsiballZ_edpm_container_manage.py'
Dec 05 11:53:08 compute-0 sudo[184767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:08 compute-0 python3[184769]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 11:53:09 compute-0 podman[184803]: 2025-12-05 11:53:08.988260274 +0000 UTC m=+0.019196615 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 05 11:53:09 compute-0 podman[184803]: 2025-12-05 11:53:09.790222131 +0000 UTC m=+0.821158442 container create 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 05 11:53:09 compute-0 python3[184769]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 05 11:53:09 compute-0 sudo[184767]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:10 compute-0 sudo[184992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvkvrzmsarjeznajnbxeouecxqkrfjca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935590.1197407-1421-184838865102694/AnsiballZ_stat.py'
Dec 05 11:53:10 compute-0 sudo[184992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:10 compute-0 python3.9[184994]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:53:10 compute-0 sudo[184992]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:11 compute-0 sudo[185146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgevfxjrywsybpvklivqjdxgdhpedsxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935591.2746222-1433-152772579889498/AnsiballZ_container_config_data.py'
Dec 05 11:53:11 compute-0 sudo[185146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:11 compute-0 python3.9[185148]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 05 11:53:11 compute-0 sudo[185146]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:12 compute-0 sudo[185298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqfmnlfzgyqhbrqyefccrrgluccpnmtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935592.0609295-1442-51089792376223/AnsiballZ_container_config_hash.py'
Dec 05 11:53:12 compute-0 sudo[185298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:12 compute-0 python3.9[185300]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 11:53:12 compute-0 sudo[185298]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:13 compute-0 sudo[185450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aogsrrfjjdqgcuepzpzpffyibhanrity ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935592.9181654-1452-278434918194144/AnsiballZ_edpm_container_manage.py'
Dec 05 11:53:13 compute-0 sudo[185450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:13 compute-0 python3[185452]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 11:53:13 compute-0 podman[185492]: 2025-12-05 11:53:13.759116852 +0000 UTC m=+0.029290095 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 05 11:53:13 compute-0 podman[185492]: 2025-12-05 11:53:13.954689067 +0000 UTC m=+0.224862290 container create 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 11:53:13 compute-0 python3[185452]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 05 11:53:14 compute-0 sudo[185450]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:14 compute-0 sudo[185691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqcbkogjhwwswkvkkjyrhuglkdlkmmjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935594.2841105-1460-9923687057758/AnsiballZ_stat.py'
Dec 05 11:53:14 compute-0 sudo[185691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:14 compute-0 podman[185650]: 2025-12-05 11:53:14.635909046 +0000 UTC m=+0.123368315 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 05 11:53:14 compute-0 python3.9[185698]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:53:14 compute-0 sudo[185691]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:15 compute-0 sudo[185856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jslmzzmjzyrnyylmxckyivbeekmehomo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935595.0490396-1469-50211185756660/AnsiballZ_file.py'
Dec 05 11:53:15 compute-0 sudo[185856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:15 compute-0 python3.9[185858]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:53:15 compute-0 sudo[185856]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:16 compute-0 sudo[186007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvebijpbhxckklumdvouxxbjxhodpuzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935595.6853445-1469-89166165564689/AnsiballZ_copy.py'
Dec 05 11:53:16 compute-0 sudo[186007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:16 compute-0 python3.9[186009]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935595.6853445-1469-89166165564689/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:53:16 compute-0 sudo[186007]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:16 compute-0 sudo[186083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlhhulonpvlgyrnvxjiqlhnlhffxnqkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935595.6853445-1469-89166165564689/AnsiballZ_systemd.py'
Dec 05 11:53:16 compute-0 sudo[186083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:16 compute-0 python3.9[186085]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:53:16 compute-0 systemd[1]: Reloading.
Dec 05 11:53:17 compute-0 systemd-rc-local-generator[186104]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:53:17 compute-0 systemd-sysv-generator[186108]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:53:17 compute-0 sudo[186083]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:17 compute-0 sudo[186193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqqivrrdfhzfkqztgbxnvneowdvbljba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935595.6853445-1469-89166165564689/AnsiballZ_systemd.py'
Dec 05 11:53:17 compute-0 sudo[186193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:17 compute-0 python3.9[186195]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:53:17 compute-0 systemd[1]: Reloading.
Dec 05 11:53:17 compute-0 systemd-rc-local-generator[186224]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:53:17 compute-0 systemd-sysv-generator[186227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:53:18 compute-0 systemd[1]: Starting nova_compute container...
Dec 05 11:53:18 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:53:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:18 compute-0 podman[186234]: 2025-12-05 11:53:18.668720276 +0000 UTC m=+0.505320859 container init 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 11:53:18 compute-0 podman[186234]: 2025-12-05 11:53:18.675285275 +0000 UTC m=+0.511885838 container start 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 11:53:18 compute-0 nova_compute[186250]: + sudo -E kolla_set_configs
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Validating config file
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Copying service configuration files
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Deleting /etc/ceph
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Creating directory /etc/ceph
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /etc/ceph
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Writing out command to execute
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 11:53:18 compute-0 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 11:53:18 compute-0 nova_compute[186250]: ++ cat /run_command
Dec 05 11:53:18 compute-0 nova_compute[186250]: + CMD=nova-compute
Dec 05 11:53:18 compute-0 nova_compute[186250]: + ARGS=
Dec 05 11:53:18 compute-0 nova_compute[186250]: + sudo kolla_copy_cacerts
Dec 05 11:53:18 compute-0 nova_compute[186250]: + [[ ! -n '' ]]
Dec 05 11:53:18 compute-0 nova_compute[186250]: + . kolla_extend_start
Dec 05 11:53:18 compute-0 nova_compute[186250]: Running command: 'nova-compute'
Dec 05 11:53:18 compute-0 nova_compute[186250]: + echo 'Running command: '\''nova-compute'\'''
Dec 05 11:53:18 compute-0 nova_compute[186250]: + umask 0022
Dec 05 11:53:18 compute-0 nova_compute[186250]: + exec nova-compute
Dec 05 11:53:18 compute-0 podman[186234]: nova_compute
Dec 05 11:53:18 compute-0 systemd[1]: Started nova_compute container.
Dec 05 11:53:18 compute-0 sudo[186193]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:19 compute-0 podman[186386]: 2025-12-05 11:53:19.652202366 +0000 UTC m=+0.055274713 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3)
Dec 05 11:53:19 compute-0 python3.9[186429]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:53:20 compute-0 python3.9[186582]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:53:20 compute-0 nova_compute[186250]: 2025-12-05 11:53:20.809 186254 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 11:53:20 compute-0 nova_compute[186250]: 2025-12-05 11:53:20.809 186254 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 11:53:20 compute-0 nova_compute[186250]: 2025-12-05 11:53:20.810 186254 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 11:53:20 compute-0 nova_compute[186250]: 2025-12-05 11:53:20.810 186254 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 05 11:53:20 compute-0 nova_compute[186250]: 2025-12-05 11:53:20.950 186254 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:53:20 compute-0 nova_compute[186250]: 2025-12-05 11:53:20.978 186254 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:53:20 compute-0 nova_compute[186250]: 2025-12-05 11:53:20.979 186254 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 05 11:53:21 compute-0 python3.9[186736]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.548 186254 INFO nova.virt.driver [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.662 186254 INFO nova.compute.provider_config [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.676 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.676 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.676 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 WARNING oslo_config.cfg [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 05 11:53:21 compute-0 nova_compute[186250]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 05 11:53:21 compute-0 nova_compute[186250]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 05 11:53:21 compute-0 nova_compute[186250]: and ``live_migration_inbound_addr`` respectively.
Dec 05 11:53:21 compute-0 nova_compute[186250]: ).  Its value may be silently ignored in the future.
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.750 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.750 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.750 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.750 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.811 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.811 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.811 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.812 186254 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.824 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.825 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.825 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.825 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 05 11:53:21 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 05 11:53:21 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.911 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2035121cd0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.915 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2035121cd0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.916 186254 INFO nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Connection event '1' reason 'None'
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.931 186254 WARNING nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 05 11:53:21 compute-0 nova_compute[186250]: 2025-12-05 11:53:21.932 186254 DEBUG nova.virt.libvirt.volume.mount [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 05 11:53:22 compute-0 sudo[186937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsoxprptixfdjpsscenhaqlkzxgwzabg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935601.4749012-1529-186261031823260/AnsiballZ_podman_container.py'
Dec 05 11:53:22 compute-0 sudo[186937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:22 compute-0 python3.9[186940]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 05 11:53:22 compute-0 sudo[186937]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:22 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 11:53:22 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 11:53:22 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.792 186254 INFO nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host capabilities <capabilities>
Dec 05 11:53:22 compute-0 nova_compute[186250]: 
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <host>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <uuid>60bd4df1-481e-4d23-9585-8528ade5c2b1</uuid>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <cpu>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <arch>x86_64</arch>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model>EPYC-Rome-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <vendor>AMD</vendor>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <microcode version='16777317'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <signature family='23' model='49' stepping='0'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='x2apic'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='tsc-deadline'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='osxsave'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='hypervisor'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='tsc_adjust'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='spec-ctrl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='stibp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='arch-capabilities'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='cmp_legacy'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='topoext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='virt-ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='lbrv'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='tsc-scale'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='vmcb-clean'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='pause-filter'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='pfthreshold'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='svme-addr-chk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='rdctl-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='skip-l1dfl-vmentry'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='mds-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature name='pschange-mc-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <pages unit='KiB' size='4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <pages unit='KiB' size='2048'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <pages unit='KiB' size='1048576'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </cpu>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <power_management>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <suspend_mem/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <suspend_disk/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <suspend_hybrid/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </power_management>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <iommu support='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <migration_features>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <live/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <uri_transports>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <uri_transport>tcp</uri_transport>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <uri_transport>rdma</uri_transport>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </uri_transports>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </migration_features>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <topology>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <cells num='1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <cell id='0'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:           <memory unit='KiB'>7864316</memory>
Dec 05 11:53:22 compute-0 nova_compute[186250]:           <pages unit='KiB' size='4'>1966079</pages>
Dec 05 11:53:22 compute-0 nova_compute[186250]:           <pages unit='KiB' size='2048'>0</pages>
Dec 05 11:53:22 compute-0 nova_compute[186250]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 05 11:53:22 compute-0 nova_compute[186250]:           <distances>
Dec 05 11:53:22 compute-0 nova_compute[186250]:             <sibling id='0' value='10'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:           </distances>
Dec 05 11:53:22 compute-0 nova_compute[186250]:           <cpus num='8'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:           </cpus>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         </cell>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </cells>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </topology>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <cache>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </cache>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <secmodel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model>selinux</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <doi>0</doi>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </secmodel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <secmodel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model>dac</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <doi>0</doi>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </secmodel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </host>
Dec 05 11:53:22 compute-0 nova_compute[186250]: 
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <guest>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <os_type>hvm</os_type>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <arch name='i686'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <wordsize>32</wordsize>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <domain type='qemu'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <domain type='kvm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </arch>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <features>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <pae/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <nonpae/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <acpi default='on' toggle='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <apic default='on' toggle='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <cpuselection/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <deviceboot/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <disksnapshot default='on' toggle='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <externalSnapshot/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </features>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </guest>
Dec 05 11:53:22 compute-0 nova_compute[186250]: 
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <guest>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <os_type>hvm</os_type>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <arch name='x86_64'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <wordsize>64</wordsize>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <domain type='qemu'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <domain type='kvm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </arch>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <features>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <acpi default='on' toggle='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <apic default='on' toggle='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <cpuselection/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <deviceboot/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <disksnapshot default='on' toggle='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <externalSnapshot/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </features>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </guest>
Dec 05 11:53:22 compute-0 nova_compute[186250]: 
Dec 05 11:53:22 compute-0 nova_compute[186250]: </capabilities>
Dec 05 11:53:22 compute-0 nova_compute[186250]: 
Dec 05 11:53:22 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.801 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 05 11:53:22 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.823 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 05 11:53:22 compute-0 nova_compute[186250]: <domainCapabilities>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <domain>kvm</domain>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <arch>i686</arch>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <vcpu max='4096'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <iothreads supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <os supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <enum name='firmware'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <loader supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>rom</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pflash</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='readonly'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>yes</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>no</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='secure'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>no</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </loader>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </os>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <cpu>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='host-passthrough' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='hostPassthroughMigratable'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>on</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>off</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='maximum' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='maximumMigratable'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>on</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>off</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='host-model' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <vendor>AMD</vendor>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='x2apic'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='hypervisor'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='stibp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='overflow-recov'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='succor'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='lbrv'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc-scale'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='flushbyasid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='pause-filter'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='pfthreshold'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='disable' name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='custom' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Dhyana-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Genoa'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='auto-ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='auto-ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10-128'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10-256'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10-512'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v6'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v7'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='KnightsMill'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512er'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512pf'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='KnightsMill-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512er'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512pf'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G4-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tbm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G5-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tbm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SierraForest'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cmpccxadd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SierraForest-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cmpccxadd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='athlon'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='athlon-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='core2duo'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='core2duo-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='coreduo'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='coreduo-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='n270'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='n270-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='phenom'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='phenom-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </cpu>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <memoryBacking supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <enum name='sourceType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>file</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>anonymous</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>memfd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </memoryBacking>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <devices>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <disk supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='diskDevice'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>disk</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>cdrom</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>floppy</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>lun</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='bus'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>fdc</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>scsi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>sata</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-non-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </disk>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <graphics supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vnc</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>egl-headless</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>dbus</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </graphics>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <video supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='modelType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vga</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>cirrus</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>none</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>bochs</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>ramfb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </video>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <hostdev supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='mode'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>subsystem</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='startupPolicy'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>default</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>mandatory</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>requisite</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>optional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='subsysType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pci</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>scsi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='capsType'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='pciBackend'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </hostdev>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <rng supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-non-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>random</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>egd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>builtin</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </rng>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <filesystem supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='driverType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>path</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>handle</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtiofs</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </filesystem>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <tpm supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tpm-tis</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tpm-crb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>emulator</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>external</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendVersion'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>2.0</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </tpm>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <redirdev supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='bus'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </redirdev>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <channel supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pty</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>unix</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </channel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <crypto supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>qemu</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>builtin</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </crypto>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <interface supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>default</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>passt</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </interface>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <panic supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>isa</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>hyperv</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </panic>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <console supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>null</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vc</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pty</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>dev</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>file</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pipe</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>stdio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>udp</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tcp</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>unix</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>qemu-vdagent</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>dbus</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </console>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </devices>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <features>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <gic supported='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <vmcoreinfo supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <genid supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <backingStoreInput supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <backup supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <async-teardown supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <ps2 supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <sev supported='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <sgx supported='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <hyperv supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='features'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>relaxed</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vapic</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>spinlocks</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vpindex</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>runtime</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>synic</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>stimer</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>reset</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vendor_id</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>frequencies</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>reenlightenment</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tlbflush</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>ipi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>avic</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>emsr_bitmap</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>xmm_input</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <defaults>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <spinlocks>4095</spinlocks>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <stimer_direct>on</stimer_direct>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </defaults>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </hyperv>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <launchSecurity supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='sectype'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tdx</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </launchSecurity>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </features>
Dec 05 11:53:22 compute-0 nova_compute[186250]: </domainCapabilities>
Dec 05 11:53:22 compute-0 nova_compute[186250]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 11:53:22 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.831 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 05 11:53:22 compute-0 nova_compute[186250]: <domainCapabilities>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <domain>kvm</domain>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <arch>i686</arch>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <vcpu max='240'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <iothreads supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <os supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <enum name='firmware'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <loader supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>rom</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pflash</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='readonly'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>yes</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>no</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='secure'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>no</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </loader>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </os>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <cpu>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='host-passthrough' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='hostPassthroughMigratable'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>on</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>off</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='maximum' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='maximumMigratable'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>on</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>off</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='host-model' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <vendor>AMD</vendor>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='x2apic'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='hypervisor'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='stibp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='overflow-recov'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='succor'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='lbrv'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc-scale'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='flushbyasid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='pause-filter'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='pfthreshold'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='disable' name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='custom' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Dhyana-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Genoa'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='auto-ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='auto-ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10-128'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10-256'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10-512'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v6'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v7'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='KnightsMill'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512er'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512pf'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='KnightsMill-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512er'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512pf'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G4-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tbm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G5-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tbm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SierraForest'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cmpccxadd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SierraForest-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cmpccxadd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='athlon'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='athlon-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='core2duo'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='core2duo-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='coreduo'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='coreduo-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='n270'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='n270-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='phenom'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='phenom-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </cpu>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <memoryBacking supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <enum name='sourceType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>file</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>anonymous</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>memfd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </memoryBacking>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <devices>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <disk supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='diskDevice'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>disk</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>cdrom</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>floppy</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>lun</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='bus'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>ide</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>fdc</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>scsi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>sata</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-non-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </disk>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <graphics supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vnc</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>egl-headless</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>dbus</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </graphics>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <video supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='modelType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vga</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>cirrus</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>none</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>bochs</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>ramfb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </video>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <hostdev supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='mode'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>subsystem</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='startupPolicy'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>default</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>mandatory</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>requisite</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>optional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='subsysType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pci</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>scsi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='capsType'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='pciBackend'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </hostdev>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <rng supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-non-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>random</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>egd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>builtin</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </rng>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <filesystem supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='driverType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>path</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>handle</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtiofs</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </filesystem>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <tpm supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tpm-tis</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tpm-crb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>emulator</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>external</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendVersion'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>2.0</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </tpm>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <redirdev supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='bus'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </redirdev>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <channel supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pty</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>unix</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </channel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <crypto supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>qemu</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>builtin</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </crypto>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <interface supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>default</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>passt</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </interface>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <panic supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>isa</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>hyperv</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </panic>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <console supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>null</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vc</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pty</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>dev</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>file</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pipe</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>stdio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>udp</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tcp</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>unix</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>qemu-vdagent</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>dbus</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </console>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </devices>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <features>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <gic supported='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <vmcoreinfo supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <genid supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <backingStoreInput supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <backup supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <async-teardown supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <ps2 supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <sev supported='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <sgx supported='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <hyperv supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='features'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>relaxed</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vapic</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>spinlocks</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vpindex</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>runtime</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>synic</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>stimer</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>reset</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vendor_id</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>frequencies</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>reenlightenment</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tlbflush</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>ipi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>avic</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>emsr_bitmap</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>xmm_input</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <defaults>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <spinlocks>4095</spinlocks>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <stimer_direct>on</stimer_direct>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </defaults>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </hyperv>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <launchSecurity supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='sectype'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tdx</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </launchSecurity>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </features>
Dec 05 11:53:22 compute-0 nova_compute[186250]: </domainCapabilities>
Dec 05 11:53:22 compute-0 nova_compute[186250]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 11:53:22 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.856 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 05 11:53:22 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.860 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 05 11:53:22 compute-0 nova_compute[186250]: <domainCapabilities>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <domain>kvm</domain>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <arch>x86_64</arch>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <vcpu max='4096'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <iothreads supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <os supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <enum name='firmware'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>efi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <loader supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>rom</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pflash</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='readonly'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>yes</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>no</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='secure'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>yes</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>no</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </loader>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </os>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <cpu>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='host-passthrough' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='hostPassthroughMigratable'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>on</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>off</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='maximum' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='maximumMigratable'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>on</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>off</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='host-model' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <vendor>AMD</vendor>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='x2apic'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='hypervisor'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='stibp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='overflow-recov'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='succor'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='lbrv'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc-scale'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='flushbyasid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='pause-filter'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='pfthreshold'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='disable' name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='custom' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 sudo[187121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqriqxnsemevdayyngjcnbbikykulmfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935602.6781876-1537-23971173724214/AnsiballZ_systemd.py'
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 sudo[187121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Denverton-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Dhyana-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Genoa'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='auto-ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='auto-ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='EPYC-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10-128'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10-256'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx10-512'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Haswell-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v6'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v7'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='KnightsMill'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512er'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512pf'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='KnightsMill-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512er'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512pf'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G4-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tbm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Opteron_G5-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tbm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SierraForest'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cmpccxadd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='SierraForest-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ifma'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cmpccxadd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='athlon'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='athlon-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='core2duo'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='core2duo-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='coreduo'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='coreduo-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='n270'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='n270-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='phenom'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='phenom-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </cpu>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <memoryBacking supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <enum name='sourceType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>file</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>anonymous</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>memfd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </memoryBacking>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <devices>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <disk supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='diskDevice'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>disk</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>cdrom</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>floppy</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>lun</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='bus'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>fdc</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>scsi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>sata</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-non-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </disk>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <graphics supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vnc</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>egl-headless</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>dbus</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </graphics>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <video supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='modelType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vga</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>cirrus</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>none</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>bochs</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>ramfb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </video>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <hostdev supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='mode'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>subsystem</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='startupPolicy'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>default</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>mandatory</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>requisite</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>optional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='subsysType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pci</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>scsi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='capsType'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='pciBackend'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </hostdev>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <rng supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtio-non-transitional</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>random</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>egd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>builtin</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </rng>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <filesystem supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='driverType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>path</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>handle</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>virtiofs</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </filesystem>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <tpm supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tpm-tis</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tpm-crb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>emulator</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>external</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendVersion'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>2.0</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </tpm>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <redirdev supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='bus'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </redirdev>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <channel supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pty</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>unix</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </channel>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <crypto supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>qemu</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>builtin</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </crypto>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <interface supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='backendType'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>default</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>passt</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </interface>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <panic supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>isa</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>hyperv</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </panic>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <console supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>null</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vc</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pty</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>dev</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>file</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pipe</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>stdio</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>udp</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tcp</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>unix</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>qemu-vdagent</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>dbus</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </console>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </devices>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <features>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <gic supported='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <vmcoreinfo supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <genid supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <backingStoreInput supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <backup supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <async-teardown supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <ps2 supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <sev supported='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <sgx supported='no'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <hyperv supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='features'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>relaxed</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vapic</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>spinlocks</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vpindex</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>runtime</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>synic</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>stimer</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>reset</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>vendor_id</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>frequencies</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>reenlightenment</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tlbflush</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>ipi</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>avic</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>emsr_bitmap</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>xmm_input</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <defaults>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <spinlocks>4095</spinlocks>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <stimer_direct>on</stimer_direct>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </defaults>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </hyperv>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <launchSecurity supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='sectype'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>tdx</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </launchSecurity>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </features>
Dec 05 11:53:22 compute-0 nova_compute[186250]: </domainCapabilities>
Dec 05 11:53:22 compute-0 nova_compute[186250]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 11:53:22 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.920 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 05 11:53:22 compute-0 nova_compute[186250]: <domainCapabilities>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <domain>kvm</domain>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <arch>x86_64</arch>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <vcpu max='240'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <iothreads supported='yes'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <os supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <enum name='firmware'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <loader supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>rom</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>pflash</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='readonly'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>yes</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>no</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='secure'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>no</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </loader>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   </os>
Dec 05 11:53:22 compute-0 nova_compute[186250]:   <cpu>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='host-passthrough' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='hostPassthroughMigratable'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>on</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>off</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='maximum' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <enum name='maximumMigratable'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>on</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <value>off</value>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='host-model' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <vendor>AMD</vendor>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='x2apic'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='hypervisor'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='stibp'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='overflow-recov'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='succor'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='ibrs'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='lbrv'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='tsc-scale'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='flushbyasid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='pause-filter'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='pfthreshold'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <feature policy='disable' name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:22 compute-0 nova_compute[186250]:     <mode name='custom' supported='yes'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Broadwell-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 11:53:22 compute-0 nova_compute[186250]:       <blockers model='Cooperlake'>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:22 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Cooperlake-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Cooperlake-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Denverton'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Denverton-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Denverton-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Denverton-v3'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Dhyana-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-Genoa'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='auto-ibrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='auto-ibrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-Milan-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amd-psfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='stibp-always-on'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-Rome-v3'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-v3'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='EPYC-v4'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='GraniteRapids-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx10'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx10-128'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx10-256'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx10-512'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='prefetchiti'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Haswell'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Haswell-IBRS'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Haswell-noTSX'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Haswell-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Haswell-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Haswell-v3'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Haswell-v4'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v3'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v4'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v5'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v6'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Icelake-Server-v7'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='IvyBridge'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-IBRS'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='IvyBridge-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='KnightsMill'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512er'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512pf'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='KnightsMill-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512er'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512pf'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Opteron_G4'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Opteron_G4-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Opteron_G5'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='tbm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Opteron_G5-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fma4'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='tbm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xop'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='SapphireRapids-v3'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-int8'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='amx-tile'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-bf16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-fp16'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bitalg'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrc'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fzrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='la57'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='taa-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xfd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='SierraForest'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='cmpccxadd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='SierraForest-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-ifma'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='cmpccxadd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fbsdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='fsrs'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ibrs-all'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='mcdt-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pbrsb-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='psdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='serialize'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vaes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v3'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Client-v4'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='hle'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='rtm'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v3'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v4'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Skylake-Server-v5'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512bw'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512cd'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512dq'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512f'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='avx512vl'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='invpcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pcid'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='pku'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Snowridge'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='mpx'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v2'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v3'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='core-capability'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='split-lock-detect'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='Snowridge-v4'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='cldemote'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='erms'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='gfni'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdir64b'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='movdiri'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='xsaves'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='athlon'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='athlon-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='core2duo'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='core2duo-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='coreduo'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='coreduo-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='n270'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='n270-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='ss'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='phenom'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <blockers model='phenom-v1'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='3dnow'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <feature name='3dnowext'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </blockers>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </mode>
Dec 05 11:53:23 compute-0 nova_compute[186250]:   </cpu>
Dec 05 11:53:23 compute-0 nova_compute[186250]:   <memoryBacking supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <enum name='sourceType'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <value>file</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <value>anonymous</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <value>memfd</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:   </memoryBacking>
Dec 05 11:53:23 compute-0 nova_compute[186250]:   <devices>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <disk supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='diskDevice'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>disk</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>cdrom</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>floppy</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>lun</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='bus'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>ide</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>fdc</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>scsi</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>sata</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>virtio-transitional</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>virtio-non-transitional</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </disk>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <graphics supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>vnc</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>egl-headless</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>dbus</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </graphics>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <video supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='modelType'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>vga</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>cirrus</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>none</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>bochs</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>ramfb</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </video>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <hostdev supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='mode'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>subsystem</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='startupPolicy'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>default</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>mandatory</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>requisite</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>optional</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='subsysType'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>pci</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>scsi</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='capsType'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='pciBackend'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </hostdev>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <rng supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>virtio</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>virtio-transitional</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>virtio-non-transitional</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>random</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>egd</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>builtin</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </rng>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <filesystem supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='driverType'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>path</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>handle</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>virtiofs</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </filesystem>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <tpm supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>tpm-tis</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>tpm-crb</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>emulator</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>external</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='backendVersion'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>2.0</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </tpm>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <redirdev supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='bus'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>usb</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </redirdev>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <channel supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>pty</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>unix</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </channel>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <crypto supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='model'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>qemu</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='backendModel'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>builtin</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </crypto>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <interface supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='backendType'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>default</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>passt</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </interface>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <panic supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='model'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>isa</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>hyperv</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </panic>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <console supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='type'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>null</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>vc</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>pty</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>dev</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>file</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>pipe</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>stdio</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>udp</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>tcp</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>unix</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>qemu-vdagent</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>dbus</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </console>
Dec 05 11:53:23 compute-0 nova_compute[186250]:   </devices>
Dec 05 11:53:23 compute-0 nova_compute[186250]:   <features>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <gic supported='no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <vmcoreinfo supported='yes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <genid supported='yes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <backingStoreInput supported='yes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <backup supported='yes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <async-teardown supported='yes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <ps2 supported='yes'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <sev supported='no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <sgx supported='no'/>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <hyperv supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='features'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>relaxed</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>vapic</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>spinlocks</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>vpindex</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>runtime</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>synic</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>stimer</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>reset</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>vendor_id</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>frequencies</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>reenlightenment</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>tlbflush</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>ipi</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>avic</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>emsr_bitmap</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>xmm_input</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <defaults>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <spinlocks>4095</spinlocks>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <stimer_direct>on</stimer_direct>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </defaults>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </hyperv>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     <launchSecurity supported='yes'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       <enum name='sectype'>
Dec 05 11:53:23 compute-0 nova_compute[186250]:         <value>tdx</value>
Dec 05 11:53:23 compute-0 nova_compute[186250]:       </enum>
Dec 05 11:53:23 compute-0 nova_compute[186250]:     </launchSecurity>
Dec 05 11:53:23 compute-0 nova_compute[186250]:   </features>
Dec 05 11:53:23 compute-0 nova_compute[186250]: </domainCapabilities>
Dec 05 11:53:23 compute-0 nova_compute[186250]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.982 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.983 186254 INFO nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Secure Boot support detected
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.984 186254 INFO nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.984 186254 INFO nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:22.993 186254 DEBUG nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.026 186254 INFO nova.virt.node [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Determined node identity 5111707b-bdc3-4252-b5b7-b3e96ff05344 from /var/lib/nova/compute_id
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.046 186254 WARNING nova.compute.manager [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Compute nodes ['5111707b-bdc3-4252-b5b7-b3e96ff05344'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.077 186254 INFO nova.compute.manager [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.108 186254 WARNING nova.compute.manager [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.108 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.109 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.109 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.109 186254 DEBUG nova.compute.resource_tracker [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 11:53:23 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 05 11:53:23 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 05 11:53:23 compute-0 python3.9[187123]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:53:23 compute-0 systemd[1]: Stopping nova_compute container...
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.416 186254 WARNING nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.417 186254 DEBUG nova.compute.resource_tracker [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6187MB free_disk=73.5449333190918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.417 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.417 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.431 186254 WARNING nova.compute.resource_tracker [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] No compute node record for compute-0.ctlplane.example.com:5111707b-bdc3-4252-b5b7-b3e96ff05344: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 5111707b-bdc3-4252-b5b7-b3e96ff05344 could not be found.
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.444 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.444 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.445 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:53:23 compute-0 nova_compute[186250]: 2025-12-05 11:53:23.445 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:53:23 compute-0 virtqemud[186841]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 05 11:53:23 compute-0 virtqemud[186841]: hostname: compute-0
Dec 05 11:53:23 compute-0 virtqemud[186841]: End of file while reading data: Input/output error
Dec 05 11:53:23 compute-0 systemd[1]: libpod-5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9.scope: Deactivated successfully.
Dec 05 11:53:23 compute-0 systemd[1]: libpod-5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9.scope: Consumed 3.229s CPU time.
Dec 05 11:53:23 compute-0 podman[187150]: 2025-12-05 11:53:23.915648255 +0000 UTC m=+0.562184381 container died 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 05 11:53:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9-userdata-shm.mount: Deactivated successfully.
Dec 05 11:53:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41-merged.mount: Deactivated successfully.
Dec 05 11:53:25 compute-0 podman[187150]: 2025-12-05 11:53:25.538171747 +0000 UTC m=+2.184707863 container cleanup 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute)
Dec 05 11:53:25 compute-0 podman[187150]: nova_compute
Dec 05 11:53:25 compute-0 podman[187180]: nova_compute
Dec 05 11:53:25 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 05 11:53:25 compute-0 systemd[1]: Stopped nova_compute container.
Dec 05 11:53:25 compute-0 systemd[1]: Starting nova_compute container...
Dec 05 11:53:27 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:53:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:27 compute-0 podman[187193]: 2025-12-05 11:53:27.83718309 +0000 UTC m=+2.198793707 container init 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 11:53:27 compute-0 podman[187193]: 2025-12-05 11:53:27.84791605 +0000 UTC m=+2.209526607 container start 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 11:53:27 compute-0 nova_compute[187208]: + sudo -E kolla_set_configs
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Validating config file
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Copying service configuration files
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Deleting /etc/ceph
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Creating directory /etc/ceph
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /etc/ceph
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Writing out command to execute
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 11:53:27 compute-0 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 11:53:27 compute-0 nova_compute[187208]: ++ cat /run_command
Dec 05 11:53:27 compute-0 nova_compute[187208]: + CMD=nova-compute
Dec 05 11:53:27 compute-0 nova_compute[187208]: + ARGS=
Dec 05 11:53:27 compute-0 nova_compute[187208]: + sudo kolla_copy_cacerts
Dec 05 11:53:27 compute-0 nova_compute[187208]: + [[ ! -n '' ]]
Dec 05 11:53:27 compute-0 nova_compute[187208]: + . kolla_extend_start
Dec 05 11:53:27 compute-0 nova_compute[187208]: Running command: 'nova-compute'
Dec 05 11:53:27 compute-0 nova_compute[187208]: + echo 'Running command: '\''nova-compute'\'''
Dec 05 11:53:27 compute-0 nova_compute[187208]: + umask 0022
Dec 05 11:53:27 compute-0 nova_compute[187208]: + exec nova-compute
Dec 05 11:53:28 compute-0 podman[187193]: nova_compute
Dec 05 11:53:28 compute-0 systemd[1]: Started nova_compute container.
Dec 05 11:53:28 compute-0 sudo[187121]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:28 compute-0 sudo[187369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzdbaqcowjnixgnfpxzxanfkjquntysc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935608.3540425-1546-222109866341368/AnsiballZ_podman_container.py'
Dec 05 11:53:28 compute-0 sudo[187369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:28 compute-0 python3.9[187371]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 05 11:53:29 compute-0 systemd[1]: Started libpod-conmon-8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9.scope.
Dec 05 11:53:29 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:53:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd705a07c7bc29a9eafa697375b74dc2ecbeb5bf38a91f03c149fb0a99e25516/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd705a07c7bc29a9eafa697375b74dc2ecbeb5bf38a91f03c149fb0a99e25516/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd705a07c7bc29a9eafa697375b74dc2ecbeb5bf38a91f03c149fb0a99e25516/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 05 11:53:29 compute-0 podman[187397]: 2025-12-05 11:53:29.610375543 +0000 UTC m=+0.543119359 container init 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 11:53:29 compute-0 podman[187397]: 2025-12-05 11:53:29.622654287 +0000 UTC m=+0.555398053 container start 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Applying nova statedir ownership
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 05 11:53:29 compute-0 nova_compute_init[187418]: INFO:nova_statedir:Nova statedir ownership complete
Dec 05 11:53:29 compute-0 systemd[1]: libpod-8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9.scope: Deactivated successfully.
Dec 05 11:53:29 compute-0 nova_compute[187208]: 2025-12-05 11:53:29.895 187212 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 11:53:29 compute-0 nova_compute[187208]: 2025-12-05 11:53:29.895 187212 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 11:53:29 compute-0 nova_compute[187208]: 2025-12-05 11:53:29.895 187212 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 11:53:29 compute-0 nova_compute[187208]: 2025-12-05 11:53:29.896 187212 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 05 11:53:29 compute-0 python3.9[187371]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 05 11:53:30 compute-0 podman[187432]: 2025-12-05 11:53:30.002960015 +0000 UTC m=+0.023893159 container died 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.028 187212 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.050 187212 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.051 187212 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.542 187212 INFO nova.virt.driver [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.665 187212 INFO nova.compute.provider_config [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.683 187212 DEBUG oslo_concurrency.lockutils [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.683 187212 DEBUG oslo_concurrency.lockutils [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.683 187212 DEBUG oslo_concurrency.lockutils [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.683 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 WARNING oslo_config.cfg [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 05 11:53:30 compute-0 nova_compute[187208]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 05 11:53:30 compute-0 nova_compute[187208]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 05 11:53:30 compute-0 nova_compute[187208]: and ``live_migration_inbound_addr`` respectively.
Dec 05 11:53:30 compute-0 nova_compute[187208]: ).  Its value may be silently ignored in the future.
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.773 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.817 187212 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.831 187212 INFO nova.virt.node [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Determined node identity 5111707b-bdc3-4252-b5b7-b3e96ff05344 from /var/lib/nova/compute_id
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.832 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.832 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.832 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.833 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.847 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd351eec4f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.850 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd351eec4f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.851 187212 INFO nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Connection event '1' reason 'None'
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.865 187212 DEBUG nova.virt.libvirt.volume.mount [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.865 187212 INFO nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host capabilities <capabilities>
Dec 05 11:53:30 compute-0 nova_compute[187208]: 
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <host>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <uuid>60bd4df1-481e-4d23-9585-8528ade5c2b1</uuid>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <cpu>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <arch>x86_64</arch>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model>EPYC-Rome-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <vendor>AMD</vendor>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <microcode version='16777317'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <signature family='23' model='49' stepping='0'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='x2apic'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='tsc-deadline'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='osxsave'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='hypervisor'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='tsc_adjust'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='spec-ctrl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='stibp'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='arch-capabilities'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='ssbd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='cmp_legacy'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='topoext'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='virt-ssbd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='lbrv'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='tsc-scale'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='vmcb-clean'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='pause-filter'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='pfthreshold'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='svme-addr-chk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='rdctl-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='skip-l1dfl-vmentry'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='mds-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature name='pschange-mc-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <pages unit='KiB' size='4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <pages unit='KiB' size='2048'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <pages unit='KiB' size='1048576'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </cpu>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <power_management>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <suspend_mem/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <suspend_disk/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <suspend_hybrid/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </power_management>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <iommu support='no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <migration_features>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <live/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <uri_transports>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <uri_transport>tcp</uri_transport>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <uri_transport>rdma</uri_transport>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </uri_transports>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </migration_features>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <topology>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <cells num='1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <cell id='0'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:           <memory unit='KiB'>7864316</memory>
Dec 05 11:53:30 compute-0 nova_compute[187208]:           <pages unit='KiB' size='4'>1966079</pages>
Dec 05 11:53:30 compute-0 nova_compute[187208]:           <pages unit='KiB' size='2048'>0</pages>
Dec 05 11:53:30 compute-0 nova_compute[187208]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 05 11:53:30 compute-0 nova_compute[187208]:           <distances>
Dec 05 11:53:30 compute-0 nova_compute[187208]:             <sibling id='0' value='10'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:           </distances>
Dec 05 11:53:30 compute-0 nova_compute[187208]:           <cpus num='8'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:           </cpus>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         </cell>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </cells>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </topology>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <cache>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </cache>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <secmodel>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model>selinux</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <doi>0</doi>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </secmodel>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <secmodel>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model>dac</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <doi>0</doi>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </secmodel>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   </host>
Dec 05 11:53:30 compute-0 nova_compute[187208]: 
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <guest>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <os_type>hvm</os_type>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <arch name='i686'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <wordsize>32</wordsize>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <domain type='qemu'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <domain type='kvm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </arch>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <features>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <pae/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <nonpae/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <acpi default='on' toggle='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <apic default='on' toggle='no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <cpuselection/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <deviceboot/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <disksnapshot default='on' toggle='no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <externalSnapshot/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </features>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   </guest>
Dec 05 11:53:30 compute-0 nova_compute[187208]: 
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <guest>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <os_type>hvm</os_type>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <arch name='x86_64'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <wordsize>64</wordsize>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <domain type='qemu'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <domain type='kvm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </arch>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <features>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <acpi default='on' toggle='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <apic default='on' toggle='no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <cpuselection/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <deviceboot/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <disksnapshot default='on' toggle='no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <externalSnapshot/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </features>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   </guest>
Dec 05 11:53:30 compute-0 nova_compute[187208]: 
Dec 05 11:53:30 compute-0 nova_compute[187208]: </capabilities>
Dec 05 11:53:30 compute-0 nova_compute[187208]: 
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.872 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.875 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 05 11:53:30 compute-0 nova_compute[187208]: <domainCapabilities>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <domain>kvm</domain>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <arch>i686</arch>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <vcpu max='240'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <iothreads supported='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <os supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <enum name='firmware'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <loader supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>rom</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>pflash</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='readonly'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>yes</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>no</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='secure'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>no</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </loader>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   </os>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <cpu>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <mode name='host-passthrough' supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='hostPassthroughMigratable'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>on</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>off</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <mode name='maximum' supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='maximumMigratable'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>on</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>off</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <mode name='host-model' supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <vendor>AMD</vendor>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='x2apic'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='hypervisor'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='stibp'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='ssbd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='overflow-recov'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='succor'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='ibrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='lbrv'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc-scale'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='flushbyasid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='pause-filter'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='pfthreshold'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='disable' name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <mode name='custom' supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-noTSX'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cooperlake'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cooperlake-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cooperlake-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Denverton'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Denverton-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Denverton-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Denverton-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Dhyana-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Genoa'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='auto-ibrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='auto-ibrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx10'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx10-128'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx10-256'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx10-512'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-noTSX'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v5'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v6'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v7'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='IvyBridge'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='KnightsMill'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512er'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512pf'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='KnightsMill-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512er'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512pf'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Opteron_G4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Opteron_G4-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Opteron_G5'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tbm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Opteron_G5-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tbm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SierraForest'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cmpccxadd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SierraForest-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cmpccxadd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v5'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Snowridge'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='athlon'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='athlon-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='core2duo'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='core2duo-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='coreduo'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='coreduo-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='n270'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='n270-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='phenom'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='phenom-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <memoryBacking supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <enum name='sourceType'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <value>file</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <value>anonymous</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <value>memfd</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   </memoryBacking>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <disk supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='diskDevice'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>disk</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>cdrom</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>floppy</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>lun</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='bus'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>ide</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>fdc</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>scsi</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>sata</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>virtio-transitional</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>virtio-non-transitional</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <graphics supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>vnc</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>egl-headless</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>dbus</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </graphics>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <video supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='modelType'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>vga</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>cirrus</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>none</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>bochs</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>ramfb</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </video>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <hostdev supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='mode'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>subsystem</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='startupPolicy'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>default</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>mandatory</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>requisite</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>optional</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='subsysType'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>pci</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>scsi</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='capsType'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='pciBackend'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </hostdev>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <rng supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>virtio-transitional</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>virtio-non-transitional</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>random</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>egd</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>builtin</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <filesystem supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='driverType'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>path</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>handle</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>virtiofs</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </filesystem>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <tpm supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>tpm-tis</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>tpm-crb</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>emulator</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>external</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='backendVersion'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>2.0</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </tpm>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <redirdev supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='bus'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </redirdev>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <channel supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>pty</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>unix</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </channel>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <crypto supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='model'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>qemu</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>builtin</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </crypto>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <interface supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='backendType'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>default</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>passt</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </interface>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <panic supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>isa</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>hyperv</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </panic>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <console supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>null</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>vc</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>pty</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>dev</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>file</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>pipe</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>stdio</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>udp</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>tcp</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>unix</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>qemu-vdagent</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>dbus</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </console>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <features>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <gic supported='no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <vmcoreinfo supported='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <genid supported='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <backingStoreInput supported='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <backup supported='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <async-teardown supported='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <ps2 supported='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <sev supported='no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <sgx supported='no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <hyperv supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='features'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>relaxed</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>vapic</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>spinlocks</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>vpindex</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>runtime</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>synic</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>stimer</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>reset</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>vendor_id</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>frequencies</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>reenlightenment</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>tlbflush</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>ipi</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>avic</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>emsr_bitmap</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>xmm_input</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <defaults>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <spinlocks>4095</spinlocks>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <stimer_direct>on</stimer_direct>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </defaults>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </hyperv>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <launchSecurity supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='sectype'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>tdx</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </launchSecurity>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   </features>
Dec 05 11:53:30 compute-0 nova_compute[187208]: </domainCapabilities>
Dec 05 11:53:30 compute-0 nova_compute[187208]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 11:53:30 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.881 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 05 11:53:30 compute-0 nova_compute[187208]: <domainCapabilities>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <domain>kvm</domain>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <arch>i686</arch>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <vcpu max='4096'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <iothreads supported='yes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <os supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <enum name='firmware'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <loader supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>rom</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>pflash</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='readonly'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>yes</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>no</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='secure'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>no</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </loader>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   </os>
Dec 05 11:53:30 compute-0 nova_compute[187208]:   <cpu>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <mode name='host-passthrough' supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='hostPassthroughMigratable'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>on</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>off</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <mode name='maximum' supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <enum name='maximumMigratable'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>on</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <value>off</value>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <mode name='host-model' supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <vendor>AMD</vendor>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='x2apic'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='hypervisor'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='stibp'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='ssbd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='overflow-recov'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='succor'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='ibrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='lbrv'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc-scale'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='flushbyasid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='pause-filter'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='pfthreshold'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <feature policy='disable' name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:30 compute-0 nova_compute[187208]:     <mode name='custom' supported='yes'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-noTSX'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cooperlake'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cooperlake-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Cooperlake-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Denverton'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Denverton-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Denverton-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Denverton-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Dhyana-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Genoa'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='auto-ibrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='auto-ibrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='EPYC-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx10'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx10-128'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx10-256'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx10-512'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-noTSX'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Haswell-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v5'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v6'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v7'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='IvyBridge'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-IBRS'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='KnightsMill'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512er'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512pf'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='KnightsMill-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512er'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512pf'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Opteron_G4'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Opteron_G4-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Opteron_G5'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tbm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Opteron_G5-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tbm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v2'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v3'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SierraForest'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cmpccxadd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='SierraForest-v1'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-ifma'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='cmpccxadd'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 11:53:30 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client'>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:30 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v5'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='athlon'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='athlon-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='core2duo'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='core2duo-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='coreduo'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='coreduo-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='n270'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='n270-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='phenom'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='phenom-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <memoryBacking supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <enum name='sourceType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>file</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>anonymous</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>memfd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </memoryBacking>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <disk supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='diskDevice'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>disk</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>cdrom</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>floppy</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>lun</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='bus'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>fdc</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>scsi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>sata</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-non-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <graphics supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vnc</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>egl-headless</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>dbus</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </graphics>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <video supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='modelType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vga</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>cirrus</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>none</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>bochs</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>ramfb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </video>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <hostdev supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='mode'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>subsystem</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='startupPolicy'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>default</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>mandatory</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>requisite</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>optional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='subsysType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pci</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>scsi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='capsType'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='pciBackend'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </hostdev>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <rng supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-non-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>random</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>egd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>builtin</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <filesystem supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='driverType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>path</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>handle</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtiofs</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </filesystem>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <tpm supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tpm-tis</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tpm-crb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>emulator</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>external</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendVersion'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>2.0</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </tpm>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <redirdev supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='bus'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </redirdev>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <channel supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pty</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>unix</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </channel>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <crypto supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>qemu</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>builtin</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </crypto>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <interface supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>default</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>passt</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </interface>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <panic supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>isa</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>hyperv</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </panic>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <console supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>null</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vc</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pty</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>dev</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>file</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pipe</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>stdio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>udp</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tcp</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>unix</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>qemu-vdagent</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>dbus</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </console>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <features>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <gic supported='no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <vmcoreinfo supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <genid supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <backingStoreInput supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <backup supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <async-teardown supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <ps2 supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <sev supported='no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <sgx supported='no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <hyperv supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='features'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>relaxed</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vapic</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>spinlocks</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vpindex</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>runtime</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>synic</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>stimer</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>reset</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vendor_id</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>frequencies</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>reenlightenment</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tlbflush</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>ipi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>avic</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>emsr_bitmap</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>xmm_input</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <defaults>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <spinlocks>4095</spinlocks>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <stimer_direct>on</stimer_direct>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </defaults>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </hyperv>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <launchSecurity supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='sectype'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tdx</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </launchSecurity>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </features>
Dec 05 11:53:31 compute-0 nova_compute[187208]: </domainCapabilities>
Dec 05 11:53:31 compute-0 nova_compute[187208]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.929 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:30.933 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 05 11:53:31 compute-0 nova_compute[187208]: <domainCapabilities>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <domain>kvm</domain>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <arch>x86_64</arch>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <vcpu max='240'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <iothreads supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <os supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <enum name='firmware'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <loader supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>rom</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pflash</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='readonly'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>yes</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>no</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='secure'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>no</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </loader>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </os>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <cpu>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <mode name='host-passthrough' supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='hostPassthroughMigratable'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>on</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>off</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <mode name='maximum' supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='maximumMigratable'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>on</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>off</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <mode name='host-model' supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <vendor>AMD</vendor>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='x2apic'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='hypervisor'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='stibp'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='ssbd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='overflow-recov'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='succor'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='ibrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='lbrv'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc-scale'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='flushbyasid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='pause-filter'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='pfthreshold'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='disable' name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <mode name='custom' supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-noTSX'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cooperlake'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cooperlake-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cooperlake-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Denverton'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Denverton-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Denverton-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Denverton-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Dhyana-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Genoa'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='auto-ibrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='auto-ibrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx10'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx10-128'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx10-256'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx10-512'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-noTSX'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v5'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v6'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v7'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='IvyBridge'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='KnightsMill'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512er'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512pf'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='KnightsMill-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512er'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512pf'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Opteron_G4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Opteron_G4-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Opteron_G5'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tbm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Opteron_G5-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tbm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SierraForest'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cmpccxadd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SierraForest-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cmpccxadd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v5'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='athlon'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='athlon-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='core2duo'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='core2duo-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='coreduo'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='coreduo-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='n270'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='n270-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='phenom'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='phenom-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <memoryBacking supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <enum name='sourceType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>file</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>anonymous</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>memfd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </memoryBacking>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <disk supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='diskDevice'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>disk</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>cdrom</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>floppy</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>lun</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='bus'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>ide</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>fdc</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>scsi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>sata</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-non-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <graphics supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vnc</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>egl-headless</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>dbus</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </graphics>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <video supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='modelType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vga</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>cirrus</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>none</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>bochs</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>ramfb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </video>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <hostdev supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='mode'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>subsystem</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='startupPolicy'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>default</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>mandatory</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>requisite</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>optional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='subsysType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pci</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>scsi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='capsType'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='pciBackend'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </hostdev>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <rng supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-non-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>random</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>egd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>builtin</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <filesystem supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='driverType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>path</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>handle</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtiofs</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </filesystem>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <tpm supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tpm-tis</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tpm-crb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>emulator</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>external</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendVersion'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>2.0</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </tpm>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <redirdev supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='bus'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </redirdev>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <channel supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pty</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>unix</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </channel>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <crypto supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>qemu</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>builtin</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </crypto>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <interface supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>default</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>passt</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </interface>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <panic supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>isa</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>hyperv</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </panic>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <console supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>null</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vc</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pty</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>dev</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>file</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pipe</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>stdio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>udp</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tcp</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>unix</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>qemu-vdagent</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>dbus</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </console>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <features>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <gic supported='no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <vmcoreinfo supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <genid supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <backingStoreInput supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <backup supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <async-teardown supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <ps2 supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <sev supported='no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <sgx supported='no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <hyperv supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='features'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>relaxed</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vapic</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>spinlocks</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vpindex</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>runtime</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>synic</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>stimer</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>reset</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vendor_id</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>frequencies</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>reenlightenment</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tlbflush</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>ipi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>avic</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>emsr_bitmap</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>xmm_input</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <defaults>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <spinlocks>4095</spinlocks>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <stimer_direct>on</stimer_direct>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </defaults>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </hyperv>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <launchSecurity supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='sectype'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tdx</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </launchSecurity>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </features>
Dec 05 11:53:31 compute-0 nova_compute[187208]: </domainCapabilities>
Dec 05 11:53:31 compute-0 nova_compute[187208]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.011 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 05 11:53:31 compute-0 nova_compute[187208]: <domainCapabilities>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <domain>kvm</domain>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <arch>x86_64</arch>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <vcpu max='4096'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <iothreads supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <os supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <enum name='firmware'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>efi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <loader supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>rom</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pflash</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='readonly'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>yes</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>no</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='secure'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>yes</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>no</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </loader>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </os>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <cpu>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <mode name='host-passthrough' supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='hostPassthroughMigratable'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>on</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>off</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <mode name='maximum' supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='maximumMigratable'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>on</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>off</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <mode name='host-model' supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <vendor>AMD</vendor>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='x2apic'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='hypervisor'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='stibp'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='ssbd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='overflow-recov'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='succor'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='ibrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='lbrv'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='tsc-scale'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='flushbyasid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='pause-filter'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='pfthreshold'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <feature policy='disable' name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <mode name='custom' supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-noTSX'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Broadwell-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cooperlake'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cooperlake-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Cooperlake-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Denverton'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Denverton-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Denverton-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Denverton-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Dhyana-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Genoa'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='auto-ibrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='auto-ibrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Milan-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amd-psfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='no-nested-data-bp'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='null-sel-clr-base'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='stibp-always-on'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-Rome-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='EPYC-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='GraniteRapids-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx10'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx10-128'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx10-256'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx10-512'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='prefetchiti'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-noTSX'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Haswell-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v5'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v6'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Icelake-Server-v7'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='IvyBridge'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='IvyBridge-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='KnightsMill'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512er'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512pf'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='KnightsMill-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-4fmaps'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-4vnniw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512er'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512pf'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Opteron_G4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Opteron_G4-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Opteron_G5'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tbm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Opteron_G5-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fma4'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tbm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xop'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SapphireRapids-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='amx-tile'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-bf16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-fp16'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512-vpopcntdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bitalg'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vbmi2'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrc'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fzrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='la57'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='taa-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='tsx-ldtrk'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xfd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SierraForest'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cmpccxadd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='SierraForest-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-ifma'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-ne-convert'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx-vnni-int8'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='bus-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cmpccxadd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fbsdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='fsrs'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ibrs-all'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mcdt-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pbrsb-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='psdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='sbdr-ssdp-no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='serialize'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vaes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='vpclmulqdq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Client-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='hle'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='rtm'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Skylake-Server-v5'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512bw'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512cd'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512dq'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512f'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='avx512vl'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='invpcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pcid'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='pku'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='mpx'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v2'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v3'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='core-capability'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='split-lock-detect'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='Snowridge-v4'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='cldemote'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='erms'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='gfni'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdir64b'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='movdiri'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='xsaves'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='athlon'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='athlon-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='core2duo'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='core2duo-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='coreduo'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='coreduo-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='n270'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='n270-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='ss'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='phenom'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <blockers model='phenom-v1'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnow'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <feature name='3dnowext'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </blockers>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </mode>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <memoryBacking supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <enum name='sourceType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>file</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>anonymous</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <value>memfd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </memoryBacking>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <disk supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='diskDevice'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>disk</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>cdrom</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>floppy</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>lun</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='bus'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>fdc</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>scsi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>sata</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-non-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <graphics supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vnc</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>egl-headless</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>dbus</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </graphics>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <video supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='modelType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vga</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>cirrus</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>none</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>bochs</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>ramfb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </video>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <hostdev supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='mode'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>subsystem</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='startupPolicy'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>default</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>mandatory</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>requisite</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>optional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='subsysType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pci</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>scsi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='capsType'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='pciBackend'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </hostdev>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <rng supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtio-non-transitional</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>random</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>egd</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>builtin</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <filesystem supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='driverType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>path</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>handle</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>virtiofs</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </filesystem>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <tpm supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tpm-tis</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tpm-crb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>emulator</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>external</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendVersion'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>2.0</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </tpm>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <redirdev supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='bus'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>usb</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </redirdev>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <channel supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pty</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>unix</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </channel>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <crypto supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>qemu</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendModel'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>builtin</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </crypto>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <interface supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='backendType'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>default</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>passt</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </interface>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <panic supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='model'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>isa</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>hyperv</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </panic>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <console supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='type'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>null</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vc</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pty</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>dev</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>file</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>pipe</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>stdio</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>udp</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tcp</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>unix</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>qemu-vdagent</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>dbus</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </console>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   <features>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <gic supported='no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <vmcoreinfo supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <genid supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <backingStoreInput supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <backup supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <async-teardown supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <ps2 supported='yes'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <sev supported='no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <sgx supported='no'/>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <hyperv supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='features'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>relaxed</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vapic</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>spinlocks</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vpindex</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>runtime</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>synic</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>stimer</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>reset</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>vendor_id</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>frequencies</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>reenlightenment</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tlbflush</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>ipi</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>avic</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>emsr_bitmap</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>xmm_input</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <defaults>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <spinlocks>4095</spinlocks>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <stimer_direct>on</stimer_direct>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </defaults>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </hyperv>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     <launchSecurity supported='yes'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       <enum name='sectype'>
Dec 05 11:53:31 compute-0 nova_compute[187208]:         <value>tdx</value>
Dec 05 11:53:31 compute-0 nova_compute[187208]:       </enum>
Dec 05 11:53:31 compute-0 nova_compute[187208]:     </launchSecurity>
Dec 05 11:53:31 compute-0 nova_compute[187208]:   </features>
Dec 05 11:53:31 compute-0 nova_compute[187208]: </domainCapabilities>
Dec 05 11:53:31 compute-0 nova_compute[187208]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.074 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.074 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.074 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.074 187212 INFO nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Secure Boot support detected
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.077 187212 INFO nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.077 187212 INFO nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.086 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.113 187212 INFO nova.virt.node [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Determined node identity 5111707b-bdc3-4252-b5b7-b3e96ff05344 from /var/lib/nova/compute_id
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.134 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Verified node 5111707b-bdc3-4252-b5b7-b3e96ff05344 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.165 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 05 11:53:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9-userdata-shm.mount: Deactivated successfully.
Dec 05 11:53:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-bd705a07c7bc29a9eafa697375b74dc2ecbeb5bf38a91f03c149fb0a99e25516-merged.mount: Deactivated successfully.
Dec 05 11:53:31 compute-0 sudo[187369]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.627 187212 ERROR nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Could not retrieve compute node resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '5111707b-bdc3-4252-b5b7-b3e96ff05344' not found: No resource provider with uuid 5111707b-bdc3-4252-b5b7-b3e96ff05344 found  ", "request_id": "req-774a28d9-a88e-4e5a-9372-5429c75d68b0"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '5111707b-bdc3-4252-b5b7-b3e96ff05344' not found: No resource provider with uuid 5111707b-bdc3-4252-b5b7-b3e96ff05344 found  ", "request_id": "req-774a28d9-a88e-4e5a-9372-5429c75d68b0"}]}
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.646 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.646 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.647 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.647 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.810 187212 WARNING nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.811 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6202MB free_disk=73.54329299926758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.812 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:53:31 compute-0 nova_compute[187208]: 2025-12-05 11:53:31.812 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:53:31 compute-0 podman[187419]: 2025-12-05 11:53:31.880467105 +0000 UTC m=+2.148683864 container cleanup 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:53:31 compute-0 systemd[1]: libpod-conmon-8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9.scope: Deactivated successfully.
Dec 05 11:53:32 compute-0 sshd-session[159084]: Connection closed by 192.168.122.30 port 33578
Dec 05 11:53:32 compute-0 sshd-session[159081]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:53:32 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Dec 05 11:53:32 compute-0 systemd[1]: session-23.scope: Consumed 2min 1.738s CPU time.
Dec 05 11:53:32 compute-0 systemd-logind[792]: Session 23 logged out. Waiting for processes to exit.
Dec 05 11:53:32 compute-0 systemd-logind[792]: Removed session 23.
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.325 187212 ERROR nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '5111707b-bdc3-4252-b5b7-b3e96ff05344' not found: No resource provider with uuid 5111707b-bdc3-4252-b5b7-b3e96ff05344 found  ", "request_id": "req-9e1b7cdc-c1d2-4580-865f-85886820152d"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '5111707b-bdc3-4252-b5b7-b3e96ff05344' not found: No resource provider with uuid 5111707b-bdc3-4252-b5b7-b3e96ff05344 found  ", "request_id": "req-9e1b7cdc-c1d2-4580-865f-85886820152d"}]}
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.326 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.326 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.787 187212 INFO nova.scheduler.client.report [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [req-122471f4-9d23-45f9-8412-7e0ca534d4f2] Created resource provider record via placement API for resource provider with UUID 5111707b-bdc3-4252-b5b7-b3e96ff05344 and name compute-0.ctlplane.example.com.
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.819 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 05 11:53:32 compute-0 nova_compute[187208]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.819 187212 INFO nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] kernel doesn't support AMD SEV
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.820 187212 DEBUG nova.compute.provider_tree [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.821 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.877 187212 DEBUG nova.scheduler.client.report [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updated inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.877 187212 DEBUG nova.compute.provider_tree [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updating resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.878 187212 DEBUG nova.compute.provider_tree [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.970 187212 DEBUG nova.compute.provider_tree [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updating resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.993 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.993 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:53:32 compute-0 nova_compute[187208]: 2025-12-05 11:53:32.994 187212 DEBUG nova.service [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 05 11:53:33 compute-0 nova_compute[187208]: 2025-12-05 11:53:33.070 187212 DEBUG nova.service [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 05 11:53:33 compute-0 nova_compute[187208]: 2025-12-05 11:53:33.070 187212 DEBUG nova.servicegroup.drivers.db [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 05 11:53:37 compute-0 podman[187510]: 2025-12-05 11:53:37.224957642 +0000 UTC m=+0.073947282 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Dec 05 11:53:38 compute-0 sshd-session[187530]: Accepted publickey for zuul from 192.168.122.30 port 40106 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 11:53:38 compute-0 systemd-logind[792]: New session 25 of user zuul.
Dec 05 11:53:38 compute-0 systemd[1]: Started Session 25 of User zuul.
Dec 05 11:53:38 compute-0 sshd-session[187530]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 11:53:39 compute-0 python3.9[187683]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 11:53:40 compute-0 sudo[187837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzhunpwkowtccnwrrwaswwarmpjzoavp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935619.9044533-36-83379692341269/AnsiballZ_systemd_service.py'
Dec 05 11:53:40 compute-0 sudo[187837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:40 compute-0 python3.9[187839]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:53:40 compute-0 systemd[1]: Reloading.
Dec 05 11:53:40 compute-0 systemd-rc-local-generator[187857]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:53:40 compute-0 systemd-sysv-generator[187862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:53:41 compute-0 nova_compute[187208]: 2025-12-05 11:53:41.072 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:53:41 compute-0 nova_compute[187208]: 2025-12-05 11:53:41.097 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:53:41 compute-0 sudo[187837]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:41 compute-0 python3.9[188025]: ansible-ansible.builtin.service_facts Invoked
Dec 05 11:53:41 compute-0 network[188042]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 11:53:42 compute-0 network[188043]: 'network-scripts' will be removed from distribution in near future.
Dec 05 11:53:42 compute-0 network[188044]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 11:53:44 compute-0 podman[188155]: 2025-12-05 11:53:44.791792705 +0000 UTC m=+0.090494108 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 05 11:53:45 compute-0 sudo[188341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-towvdhyzovxegjurecpdtyoqdpyfpajl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935625.2336733-55-172966598863754/AnsiballZ_systemd_service.py'
Dec 05 11:53:45 compute-0 sudo[188341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:45 compute-0 python3.9[188343]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:53:45 compute-0 sudo[188341]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:46 compute-0 sudo[188494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvfwzdwvndnpmzrxkgcgcqyonetnbrwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935626.1014752-65-218848417159270/AnsiballZ_file.py'
Dec 05 11:53:46 compute-0 sudo[188494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:46 compute-0 python3.9[188496]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:53:46 compute-0 sudo[188494]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:46 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 11:53:47 compute-0 sudo[188647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaljzukulsmjrvhnspmihrpbvyoeyyfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935626.8497505-73-166108666004041/AnsiballZ_file.py'
Dec 05 11:53:47 compute-0 sudo[188647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:47 compute-0 python3.9[188649]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:53:47 compute-0 sudo[188647]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:48 compute-0 sudo[188799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrhuxikypomnyhjqpjyhndxhbacayauu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935627.7131116-82-156052895598533/AnsiballZ_command.py'
Dec 05 11:53:48 compute-0 sudo[188799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:48 compute-0 python3.9[188801]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:53:48 compute-0 sudo[188799]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:49 compute-0 python3.9[188953]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 11:53:49 compute-0 sudo[189103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogdxyfkwltpclovrbjbytbizvgtgmaxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935629.328369-100-40029886095297/AnsiballZ_systemd_service.py'
Dec 05 11:53:49 compute-0 sudo[189103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:49 compute-0 python3.9[189105]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:53:49 compute-0 systemd[1]: Reloading.
Dec 05 11:53:50 compute-0 systemd-sysv-generator[189156]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:53:50 compute-0 systemd-rc-local-generator[189153]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:53:50 compute-0 podman[189107]: 2025-12-05 11:53:50.021434774 +0000 UTC m=+0.073508099 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 05 11:53:50 compute-0 sudo[189103]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:50 compute-0 sudo[189310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npvmzrcotivpmmcuuvkvrxnexvigomdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935630.4313798-108-73206286011513/AnsiballZ_command.py'
Dec 05 11:53:50 compute-0 sudo[189310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:50 compute-0 python3.9[189312]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:53:50 compute-0 sudo[189310]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:51 compute-0 sudo[189463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruatfuvpiraeyxvktkcgzkrzzvmzvavo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935631.1135368-117-95462455743227/AnsiballZ_file.py'
Dec 05 11:53:51 compute-0 sudo[189463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:51 compute-0 python3.9[189465]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:53:51 compute-0 sudo[189463]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:52 compute-0 python3.9[189615]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:53:53 compute-0 python3.9[189767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:53:53 compute-0 python3.9[189888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935632.6334188-133-180122835217658/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:53:54 compute-0 sudo[190038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuwhzjvjdnlujtfaxmeleemlvytocbfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935633.927534-148-234759197204467/AnsiballZ_group.py'
Dec 05 11:53:54 compute-0 sudo[190038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:54 compute-0 python3.9[190040]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 05 11:53:54 compute-0 sudo[190038]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:55 compute-0 sudo[190190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbfgguqmphivpkmrujgxjwgktzssuxzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935634.9170876-159-9921847990641/AnsiballZ_getent.py'
Dec 05 11:53:55 compute-0 sudo[190190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:55 compute-0 python3.9[190192]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 05 11:53:55 compute-0 sudo[190190]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:56 compute-0 sudo[190343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tavcmzvwbjyubmnlrglhiuztbhgcazdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935635.8519745-167-277820070875166/AnsiballZ_group.py'
Dec 05 11:53:56 compute-0 sudo[190343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:56 compute-0 python3.9[190345]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 11:53:56 compute-0 groupadd[190346]: group added to /etc/group: name=ceilometer, GID=42405
Dec 05 11:53:56 compute-0 groupadd[190346]: group added to /etc/gshadow: name=ceilometer
Dec 05 11:53:56 compute-0 groupadd[190346]: new group: name=ceilometer, GID=42405
Dec 05 11:53:56 compute-0 sudo[190343]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:57 compute-0 sudo[190501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwhitqoqhvtdigrnawcvokggqssngqqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935636.827523-175-63366452557869/AnsiballZ_user.py'
Dec 05 11:53:57 compute-0 sudo[190501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:53:57 compute-0 python3.9[190503]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 11:53:57 compute-0 useradd[190505]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 11:53:57 compute-0 useradd[190505]: add 'ceilometer' to group 'libvirt'
Dec 05 11:53:57 compute-0 useradd[190505]: add 'ceilometer' to shadow group 'libvirt'
Dec 05 11:53:58 compute-0 sudo[190501]: pam_unix(sudo:session): session closed for user root
Dec 05 11:53:59 compute-0 python3.9[190661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:53:59 compute-0 python3.9[190782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764935638.7289858-201-62735448118087/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:00 compute-0 python3.9[190932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:00 compute-0 python3.9[191053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764935639.8859148-201-12977640296662/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:01 compute-0 python3.9[191203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:02 compute-0 python3.9[191324]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764935641.0302138-201-201061544219823/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:02 compute-0 python3.9[191474]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:54:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:54:02.999 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:54:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:54:03.000 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:54:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:54:03.000 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:54:03 compute-0 python3.9[191626]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:54:04 compute-0 python3.9[191778]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:04 compute-0 python3.9[191899]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935643.5685701-260-17046767679443/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:05 compute-0 python3.9[192049]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:05 compute-0 python3.9[192125]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:06 compute-0 python3.9[192275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:07 compute-0 python3.9[192396]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935645.9814017-260-231313481189936/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:07 compute-0 podman[192520]: 2025-12-05 11:54:07.599010919 +0000 UTC m=+0.075284771 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 11:54:07 compute-0 python3.9[192556]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:08 compute-0 python3.9[192686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935647.1825607-260-26876874097228/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:09 compute-0 python3.9[192837]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:09 compute-0 python3.9[192958]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935648.4935868-260-95108047884505/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:10 compute-0 python3.9[193109]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:10 compute-0 python3.9[193230]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935649.667252-260-116426040866187/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:11 compute-0 python3.9[193380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:11 compute-0 python3.9[193501]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935650.7880056-260-185492813421047/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:12 compute-0 sshd-session[192780]: Connection reset by authenticating user root 91.202.233.33 port 21582 [preauth]
Dec 05 11:54:12 compute-0 python3.9[193651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:12 compute-0 python3.9[193774]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935651.897296-260-202891361344810/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:13 compute-0 python3.9[193924]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:13 compute-0 python3.9[194045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935652.9834416-260-69713075083964/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:14 compute-0 sshd-session[193652]: Connection reset by authenticating user root 91.202.233.33 port 47326 [preauth]
Dec 05 11:54:14 compute-0 python3.9[194195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:15 compute-0 podman[194291]: 2025-12-05 11:54:15.007884799 +0000 UTC m=+0.103807302 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 05 11:54:15 compute-0 python3.9[194330]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935654.089444-260-97552924682234/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:15 compute-0 python3.9[194495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:16 compute-0 python3.9[194616]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935655.2563097-260-133458606956568/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:16 compute-0 python3.9[194766]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:17 compute-0 sshd-session[194233]: Connection reset by authenticating user root 91.202.233.33 port 47340 [preauth]
Dec 05 11:54:17 compute-0 python3.9[194842]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:17 compute-0 python3.9[194994]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:18 compute-0 python3.9[195070]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:19 compute-0 python3.9[195220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:19 compute-0 python3.9[195296]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:19 compute-0 sshd-session[194843]: Connection reset by authenticating user root 91.202.233.33 port 47348 [preauth]
Dec 05 11:54:19 compute-0 sudo[195447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeprwzpajodmamymfhqimubxovkwbhgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935659.6983225-449-128839788840558/AnsiballZ_file.py'
Dec 05 11:54:19 compute-0 sudo[195447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:20 compute-0 python3.9[195449]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:20 compute-0 sudo[195447]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:20 compute-0 sudo[195613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlrnlnyovlcjadwcfbwvdirykeueiuzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935660.408899-457-217870358057240/AnsiballZ_file.py'
Dec 05 11:54:20 compute-0 sudo[195613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:20 compute-0 podman[195574]: 2025-12-05 11:54:20.744967278 +0000 UTC m=+0.067446364 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 11:54:20 compute-0 python3.9[195622]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:20 compute-0 sudo[195613]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:21 compute-0 sudo[195772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoxuztcixgbujsoeffmbwruntqubpwnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935661.1153688-465-107524197755288/AnsiballZ_file.py'
Dec 05 11:54:21 compute-0 sudo[195772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:21 compute-0 python3.9[195774]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:54:21 compute-0 sudo[195772]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:22 compute-0 sshd-session[195374]: Invalid user dev from 91.202.233.33 port 47356
Dec 05 11:54:22 compute-0 sudo[195924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuhcwwivytfkxrptjakdxbmbzwkahwaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935661.7783892-473-5404160441928/AnsiballZ_systemd_service.py'
Dec 05 11:54:22 compute-0 sudo[195924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:22 compute-0 python3.9[195926]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:54:22 compute-0 systemd[1]: Reloading.
Dec 05 11:54:22 compute-0 sshd-session[195374]: Connection reset by invalid user dev 91.202.233.33 port 47356 [preauth]
Dec 05 11:54:22 compute-0 systemd-sysv-generator[195958]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:54:22 compute-0 systemd-rc-local-generator[195955]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:54:22 compute-0 systemd[1]: Listening on Podman API Socket.
Dec 05 11:54:22 compute-0 sudo[195924]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:23 compute-0 sudo[196115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybywysdwwxsgvknwxrhtoaconmzkotqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935662.9978337-482-31954752467497/AnsiballZ_stat.py'
Dec 05 11:54:23 compute-0 sudo[196115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:23 compute-0 python3.9[196117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:23 compute-0 sudo[196115]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:23 compute-0 sudo[196238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsplcvioqxeqhgmsaxiaejkakrsjqxpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935662.9978337-482-31954752467497/AnsiballZ_copy.py'
Dec 05 11:54:23 compute-0 sudo[196238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:23 compute-0 python3.9[196240]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935662.9978337-482-31954752467497/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:54:24 compute-0 sudo[196238]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:24 compute-0 sudo[196314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghxladugqqfxoufrtcpohfxfnplzodmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935662.9978337-482-31954752467497/AnsiballZ_stat.py'
Dec 05 11:54:24 compute-0 sudo[196314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:24 compute-0 python3.9[196316]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:24 compute-0 sudo[196314]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:24 compute-0 sudo[196437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfnviggsrnbwmfpsujmxkiqndswgepsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935662.9978337-482-31954752467497/AnsiballZ_copy.py'
Dec 05 11:54:24 compute-0 sudo[196437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:24 compute-0 python3.9[196439]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935662.9978337-482-31954752467497/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:54:25 compute-0 sudo[196437]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:25 compute-0 sudo[196589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mslfnpmgkcxgakbrtnfmvlrdlssfqlch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935665.3992245-510-93279799600625/AnsiballZ_container_config_data.py'
Dec 05 11:54:25 compute-0 sudo[196589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:26 compute-0 python3.9[196591]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Dec 05 11:54:26 compute-0 sudo[196589]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:26 compute-0 sudo[196741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kohayqyxilmroujralahxlgugmvuvzlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935666.3126874-519-197414680795008/AnsiballZ_container_config_hash.py'
Dec 05 11:54:26 compute-0 sudo[196741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:26 compute-0 python3.9[196743]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 11:54:26 compute-0 sudo[196741]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:27 compute-0 sudo[196893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aupszwyybkhpcvfteudikyedulifrxmc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935667.273288-529-182920373118089/AnsiballZ_edpm_container_manage.py'
Dec 05 11:54:27 compute-0 sudo[196893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:28 compute-0 python3[196895]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 11:54:28 compute-0 podman[196930]: 2025-12-05 11:54:28.281975501 +0000 UTC m=+0.056510489 container create 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 11:54:28 compute-0 podman[196930]: 2025-12-05 11:54:28.250922617 +0000 UTC m=+0.025457645 image pull 343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 05 11:54:28 compute-0 python3[196895]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Dec 05 11:54:28 compute-0 sudo[196893]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:28 compute-0 sudo[197115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkijuxpjvtbvryagwmcmrywynmdgvwma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935668.5675607-537-38565371265172/AnsiballZ_stat.py'
Dec 05 11:54:28 compute-0 sudo[197115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:29 compute-0 python3.9[197117]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:54:29 compute-0 sudo[197115]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:29 compute-0 sudo[197269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxfqmafcjqxizjmskwewviyrjragmqyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935669.2755337-546-236340005333637/AnsiballZ_file.py'
Dec 05 11:54:29 compute-0 sudo[197269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:29 compute-0 python3.9[197271]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:29 compute-0 sudo[197269]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.063 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.078 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.079 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.079 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.079 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.081 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.109 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.109 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.110 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.110 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.260 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.261 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6174MB free_disk=73.5428237915039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.262 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.262 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:54:30 compute-0 sudo[197420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgvqqfpvkxdiypsmofjdmwssgpaxkrnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935669.9160051-546-16239367757230/AnsiballZ_copy.py'
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.327 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.328 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 11:54:30 compute-0 sudo[197420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.354 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.367 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.368 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 11:54:30 compute-0 nova_compute[187208]: 2025-12-05 11:54:30.368 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:54:30 compute-0 python3.9[197422]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935669.9160051-546-16239367757230/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:30 compute-0 sudo[197420]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:31 compute-0 sudo[197496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jttymdprprvanmweibgqgnpihvjwrbwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935669.9160051-546-16239367757230/AnsiballZ_systemd.py'
Dec 05 11:54:31 compute-0 sudo[197496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:31 compute-0 python3.9[197498]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:54:31 compute-0 systemd[1]: Reloading.
Dec 05 11:54:31 compute-0 systemd-rc-local-generator[197525]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:54:31 compute-0 systemd-sysv-generator[197528]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:54:31 compute-0 sudo[197496]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:32 compute-0 sudo[197608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lphxtczwqcmnfmrihsnctblgbxsuopnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935669.9160051-546-16239367757230/AnsiballZ_systemd.py'
Dec 05 11:54:32 compute-0 sudo[197608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:32 compute-0 python3.9[197610]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:54:32 compute-0 systemd[1]: Reloading.
Dec 05 11:54:32 compute-0 systemd-sysv-generator[197643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:54:32 compute-0 systemd-rc-local-generator[197639]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:54:32 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Dec 05 11:54:32 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:54:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:32 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.
Dec 05 11:54:32 compute-0 podman[197649]: 2025-12-05 11:54:32.815613005 +0000 UTC m=+0.120055611 container init 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: + sudo -E kolla_set_configs
Dec 05 11:54:32 compute-0 podman[197649]: 2025-12-05 11:54:32.837392792 +0000 UTC m=+0.141835378 container start 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 11:54:32 compute-0 sudo[197671]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: sudo: unable to send audit message: Operation not permitted
Dec 05 11:54:32 compute-0 sudo[197671]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 11:54:32 compute-0 sudo[197671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 05 11:54:32 compute-0 podman[197649]: ceilometer_agent_compute
Dec 05 11:54:32 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Dec 05 11:54:32 compute-0 sudo[197608]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Validating config file
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Copying service configuration files
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: INFO:__main__:Writing out command to execute
Dec 05 11:54:32 compute-0 sudo[197671]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: ++ cat /run_command
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: + ARGS=
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: + sudo kolla_copy_cacerts
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: sudo: unable to send audit message: Operation not permitted
Dec 05 11:54:32 compute-0 sudo[197696]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 05 11:54:32 compute-0 sudo[197696]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 11:54:32 compute-0 sudo[197696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 05 11:54:32 compute-0 sudo[197696]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: + [[ ! -n '' ]]
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: + . kolla_extend_start
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: + umask 0022
Dec 05 11:54:32 compute-0 ceilometer_agent_compute[197665]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 05 11:54:32 compute-0 podman[197672]: 2025-12-05 11:54:32.927715125 +0000 UTC m=+0.075328792 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec 05 11:54:32 compute-0 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-6e2187f3ee73ebb3.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 11:54:32 compute-0 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-6e2187f3ee73ebb3.service: Failed with result 'exit-code'.
Dec 05 11:54:33 compute-0 sudo[197848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjkkvoskjcgzdgrbqxgnrzikwqvawugh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935673.0472698-570-139005208299972/AnsiballZ_systemd.py'
Dec 05 11:54:33 compute-0 sudo[197848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:33 compute-0 python3.9[197850]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:54:33 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Dec 05 11:54:33 compute-0 systemd[1]: libpod-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope: Deactivated successfully.
Dec 05 11:54:33 compute-0 podman[197854]: 2025-12-05 11:54:33.75597653 +0000 UTC m=+0.052678429 container died 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 11:54:33 compute-0 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-6e2187f3ee73ebb3.timer: Deactivated successfully.
Dec 05 11:54:33 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.
Dec 05 11:54:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-userdata-shm.mount: Deactivated successfully.
Dec 05 11:54:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5-merged.mount: Deactivated successfully.
Dec 05 11:54:33 compute-0 podman[197854]: 2025-12-05 11:54:33.954099328 +0000 UTC m=+0.250801207 container cleanup 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:54:33 compute-0 podman[197854]: ceilometer_agent_compute
Dec 05 11:54:34 compute-0 podman[197884]: ceilometer_agent_compute
Dec 05 11:54:34 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Dec 05 11:54:34 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Dec 05 11:54:34 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Dec 05 11:54:34 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:54:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:34 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.
Dec 05 11:54:34 compute-0 podman[197897]: 2025-12-05 11:54:34.188258396 +0000 UTC m=+0.128705460 container init 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: + sudo -E kolla_set_configs
Dec 05 11:54:34 compute-0 sudo[197919]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: sudo: unable to send audit message: Operation not permitted
Dec 05 11:54:34 compute-0 sudo[197919]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 11:54:34 compute-0 sudo[197919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 05 11:54:34 compute-0 podman[197897]: 2025-12-05 11:54:34.22172839 +0000 UTC m=+0.162175464 container start 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 05 11:54:34 compute-0 podman[197897]: ceilometer_agent_compute
Dec 05 11:54:34 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Dec 05 11:54:34 compute-0 sudo[197848]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Validating config file
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Copying service configuration files
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: INFO:__main__:Writing out command to execute
Dec 05 11:54:34 compute-0 podman[197920]: 2025-12-05 11:54:34.281468922 +0000 UTC m=+0.047138260 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 11:54:34 compute-0 sudo[197919]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: ++ cat /run_command
Dec 05 11:54:34 compute-0 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-2261b38cdfea5d2d.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 11:54:34 compute-0 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-2261b38cdfea5d2d.service: Failed with result 'exit-code'.
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: + ARGS=
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: + sudo kolla_copy_cacerts
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: sudo: unable to send audit message: Operation not permitted
Dec 05 11:54:34 compute-0 sudo[197942]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 05 11:54:34 compute-0 sudo[197942]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 11:54:34 compute-0 sudo[197942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 05 11:54:34 compute-0 sudo[197942]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: + [[ ! -n '' ]]
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: + . kolla_extend_start
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: + umask 0022
Dec 05 11:54:34 compute-0 ceilometer_agent_compute[197913]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 05 11:54:34 compute-0 sudo[198094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvexpmwxmacmkzwwgobrtqzbqbqmynwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935674.5946898-578-238661738900389/AnsiballZ_stat.py'
Dec 05 11:54:34 compute-0 sudo[198094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:35 compute-0 python3.9[198096]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:35 compute-0 sudo[198094]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.127 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.160 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.162 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.162 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.245 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.327 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.349 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 05 11:54:35 compute-0 sudo[198220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzlxnmxgalgjyxxzrniislzabrkprtpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935674.5946898-578-238661738900389/AnsiballZ_copy.py'
Dec 05 11:54:35 compute-0 sudo[198220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.355 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:54:35 compute-0 python3.9[198224]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935674.5946898-578-238661738900389/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:54:35 compute-0 sudo[198220]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:36 compute-0 sudo[198375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opxqdturdeilitoxluiqbueyywejxmcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935675.8354802-595-182284402384414/AnsiballZ_container_config_data.py'
Dec 05 11:54:36 compute-0 sudo[198375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:36 compute-0 python3.9[198377]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Dec 05 11:54:36 compute-0 sudo[198375]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:36 compute-0 sudo[198527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hafssdcudhboqaxnosgusqlurltfgoeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935676.550227-604-148775973074808/AnsiballZ_container_config_hash.py'
Dec 05 11:54:36 compute-0 sudo[198527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:37 compute-0 python3.9[198529]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 11:54:37 compute-0 sudo[198527]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:37 compute-0 sudo[198679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkmwdvwtvdxspbygxnqcwntcyoutgbzq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935677.3525765-614-205787707719321/AnsiballZ_edpm_container_manage.py'
Dec 05 11:54:37 compute-0 sudo[198679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:37 compute-0 podman[198681]: 2025-12-05 11:54:37.707880811 +0000 UTC m=+0.053728899 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 05 11:54:37 compute-0 python3[198682]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 11:54:38 compute-0 podman[198736]: 2025-12-05 11:54:38.12812814 +0000 UTC m=+0.047028686 container create 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter)
Dec 05 11:54:38 compute-0 podman[198736]: 2025-12-05 11:54:38.10140481 +0000 UTC m=+0.020305326 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 05 11:54:38 compute-0 python3[198682]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 05 11:54:38 compute-0 sudo[198679]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:38 compute-0 sudo[198923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfnisixuhhskhcmnfrgcqynjnjwfgkah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935678.430106-622-100266915971359/AnsiballZ_stat.py'
Dec 05 11:54:38 compute-0 sudo[198923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:38 compute-0 python3.9[198925]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:54:38 compute-0 sudo[198923]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:39 compute-0 sudo[199077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alfbsvlznfmtjhhnzxbvbvnnoayzsmff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935679.0945933-631-248856164588643/AnsiballZ_file.py'
Dec 05 11:54:39 compute-0 sudo[199077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:39 compute-0 python3.9[199079]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:39 compute-0 sudo[199077]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:40 compute-0 sudo[199228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hffxnhrakvykxaainegclbekomwaabhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935679.669225-631-211216472428256/AnsiballZ_copy.py'
Dec 05 11:54:40 compute-0 sudo[199228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:40 compute-0 python3.9[199230]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935679.669225-631-211216472428256/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:40 compute-0 sudo[199228]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:40 compute-0 auditd[699]: Audit daemon rotating log files
Dec 05 11:54:40 compute-0 sudo[199304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udaemiriqdknjcdjcjdapqmljozoydez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935679.669225-631-211216472428256/AnsiballZ_systemd.py'
Dec 05 11:54:40 compute-0 sudo[199304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:40 compute-0 python3.9[199306]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:54:40 compute-0 systemd[1]: Reloading.
Dec 05 11:54:40 compute-0 systemd-sysv-generator[199337]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:54:40 compute-0 systemd-rc-local-generator[199334]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:54:41 compute-0 sudo[199304]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:41 compute-0 sudo[199415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnypocmugvpkgwpndgujtrjifvbsnmar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935679.669225-631-211216472428256/AnsiballZ_systemd.py'
Dec 05 11:54:41 compute-0 sudo[199415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:41 compute-0 python3.9[199417]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:54:41 compute-0 systemd[1]: Reloading.
Dec 05 11:54:41 compute-0 systemd-rc-local-generator[199444]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:54:41 compute-0 systemd-sysv-generator[199447]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:54:42 compute-0 systemd[1]: Starting node_exporter container...
Dec 05 11:54:42 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:54:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:42 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.
Dec 05 11:54:42 compute-0 podman[199457]: 2025-12-05 11:54:42.710505684 +0000 UTC m=+0.541065266 container init 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.729Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.729Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.729Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.730Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.730Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.730Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.730Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.731Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.731Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.731Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=arp
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=bcache
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=bonding
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=cpu
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=edac
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=filefd
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=netclass
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=netdev
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=netstat
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=nfs
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=nvme
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=softnet
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=systemd
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=xfs
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=zfs
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.733Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 05 11:54:42 compute-0 node_exporter[199472]: ts=2025-12-05T11:54:42.734Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 05 11:54:42 compute-0 podman[199457]: 2025-12-05 11:54:42.751514371 +0000 UTC m=+0.582073953 container start 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 11:54:42 compute-0 podman[199457]: node_exporter
Dec 05 11:54:42 compute-0 systemd[1]: Started node_exporter container.
Dec 05 11:54:42 compute-0 sudo[199415]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:42 compute-0 podman[199481]: 2025-12-05 11:54:42.827881624 +0000 UTC m=+0.065746369 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 11:54:43 compute-0 sudo[199653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxddmsdorhuskdsapimstduawfkowiui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935682.99553-655-75523043727198/AnsiballZ_systemd.py'
Dec 05 11:54:43 compute-0 sudo[199653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:43 compute-0 python3.9[199655]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:54:43 compute-0 systemd[1]: Stopping node_exporter container...
Dec 05 11:54:43 compute-0 systemd[1]: libpod-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope: Deactivated successfully.
Dec 05 11:54:43 compute-0 podman[199659]: 2025-12-05 11:54:43.730295653 +0000 UTC m=+0.057059809 container died 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 11:54:43 compute-0 systemd[1]: 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5-655ef2dd01a3c178.timer: Deactivated successfully.
Dec 05 11:54:43 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.
Dec 05 11:54:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5-userdata-shm.mount: Deactivated successfully.
Dec 05 11:54:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f-merged.mount: Deactivated successfully.
Dec 05 11:54:43 compute-0 podman[199659]: 2025-12-05 11:54:43.912904545 +0000 UTC m=+0.239668701 container cleanup 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 11:54:43 compute-0 podman[199659]: node_exporter
Dec 05 11:54:43 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 05 11:54:43 compute-0 podman[199689]: node_exporter
Dec 05 11:54:43 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Dec 05 11:54:43 compute-0 systemd[1]: Stopped node_exporter container.
Dec 05 11:54:43 compute-0 systemd[1]: Starting node_exporter container...
Dec 05 11:54:44 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:54:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:44 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.
Dec 05 11:54:44 compute-0 podman[199703]: 2025-12-05 11:54:44.322542846 +0000 UTC m=+0.321939914 container init 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.338Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.338Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.338Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.339Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.339Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.339Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.339Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.340Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.340Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=arp
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=bcache
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=bonding
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=cpu
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=edac
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=filefd
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=netclass
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=netdev
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=netstat
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=nfs
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=nvme
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=softnet
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=systemd
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=xfs
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=zfs
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.342Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 05 11:54:44 compute-0 node_exporter[199718]: ts=2025-12-05T11:54:44.343Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 05 11:54:44 compute-0 podman[199703]: 2025-12-05 11:54:44.356627365 +0000 UTC m=+0.356024403 container start 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 11:54:44 compute-0 podman[199703]: node_exporter
Dec 05 11:54:44 compute-0 systemd[1]: Started node_exporter container.
Dec 05 11:54:44 compute-0 sudo[199653]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:44 compute-0 podman[199727]: 2025-12-05 11:54:44.427362445 +0000 UTC m=+0.062878095 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 11:54:44 compute-0 sudo[199898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsygvbmnneufzipuofztqaacxvczlnwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935684.5876107-663-9829364768610/AnsiballZ_stat.py'
Dec 05 11:54:44 compute-0 sudo[199898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:45 compute-0 python3.9[199900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:45 compute-0 sudo[199898]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:45 compute-0 podman[199930]: 2025-12-05 11:54:45.231133202 +0000 UTC m=+0.086589337 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 11:54:45 compute-0 sudo[200047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xodumboctugteltsqhusysridysejvgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935684.5876107-663-9829364768610/AnsiballZ_copy.py'
Dec 05 11:54:45 compute-0 sudo[200047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:45 compute-0 python3.9[200049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935684.5876107-663-9829364768610/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:54:45 compute-0 sudo[200047]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:46 compute-0 sudo[200199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pskhvuousbnzutkiakcdmyhvgdcpbzlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935685.8213043-680-198195536381174/AnsiballZ_container_config_data.py'
Dec 05 11:54:46 compute-0 sudo[200199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:46 compute-0 python3.9[200201]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 05 11:54:46 compute-0 sudo[200199]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:46 compute-0 sudo[200351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfolyxnbcfhgvjzzptiolrnqdjtmgilc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935686.5062242-689-248751607931203/AnsiballZ_container_config_hash.py'
Dec 05 11:54:46 compute-0 sudo[200351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:46 compute-0 python3.9[200353]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 11:54:47 compute-0 sudo[200351]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:47 compute-0 sudo[200503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pllmkxtadzkdgekevdmmdnhuzwefvexa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935687.3010535-699-80377536453242/AnsiballZ_edpm_container_manage.py'
Dec 05 11:54:47 compute-0 sudo[200503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:47 compute-0 python3[200505]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 11:54:49 compute-0 podman[200518]: 2025-12-05 11:54:49.59740021 +0000 UTC m=+1.655622865 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 05 11:54:49 compute-0 podman[200611]: 2025-12-05 11:54:49.731239672 +0000 UTC m=+0.040984767 container create 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible)
Dec 05 11:54:49 compute-0 podman[200611]: 2025-12-05 11:54:49.709611141 +0000 UTC m=+0.019356256 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 05 11:54:49 compute-0 python3[200505]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec 05 11:54:49 compute-0 sudo[200503]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:50 compute-0 sudo[200797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwqqmiyvgcpqgewvmruuvrvjslpoodmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935690.0172646-707-176834380947656/AnsiballZ_stat.py'
Dec 05 11:54:50 compute-0 sudo[200797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:50 compute-0 python3.9[200799]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:54:50 compute-0 sudo[200797]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:50 compute-0 sudo[200963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnfkakstzzxjlwwuguoangmfueidorhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935690.722312-716-52366595874805/AnsiballZ_file.py'
Dec 05 11:54:51 compute-0 sudo[200963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:51 compute-0 podman[200925]: 2025-12-05 11:54:51.030058273 +0000 UTC m=+0.080628466 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 11:54:51 compute-0 python3.9[200969]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:51 compute-0 sudo[200963]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:51 compute-0 sudo[201121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adgxgmarcbhwtpfxnpfchmzsotyftjem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935691.2705548-716-266877707357472/AnsiballZ_copy.py'
Dec 05 11:54:51 compute-0 sudo[201121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:51 compute-0 python3.9[201123]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935691.2705548-716-266877707357472/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:54:51 compute-0 sudo[201121]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:52 compute-0 sudo[201197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epytoopdienutffsmetcgehrkxgnokmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935691.2705548-716-266877707357472/AnsiballZ_systemd.py'
Dec 05 11:54:52 compute-0 sudo[201197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:52 compute-0 python3.9[201199]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:54:52 compute-0 systemd[1]: Reloading.
Dec 05 11:54:52 compute-0 systemd-rc-local-generator[201222]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:54:52 compute-0 systemd-sysv-generator[201228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:54:52 compute-0 sudo[201197]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:52 compute-0 sudo[201308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otcppxycobyruyvsysjweaorqigdclpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935691.2705548-716-266877707357472/AnsiballZ_systemd.py'
Dec 05 11:54:52 compute-0 sudo[201308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:53 compute-0 python3.9[201310]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:54:53 compute-0 systemd[1]: Reloading.
Dec 05 11:54:53 compute-0 systemd-rc-local-generator[201339]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:54:53 compute-0 systemd-sysv-generator[201344]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:54:53 compute-0 systemd[1]: Starting podman_exporter container...
Dec 05 11:54:53 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:54:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:53 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.
Dec 05 11:54:53 compute-0 podman[201350]: 2025-12-05 11:54:53.845403762 +0000 UTC m=+0.158567963 container init 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:54:53 compute-0 podman_exporter[201365]: ts=2025-12-05T11:54:53.861Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 05 11:54:53 compute-0 podman_exporter[201365]: ts=2025-12-05T11:54:53.861Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 05 11:54:53 compute-0 podman_exporter[201365]: ts=2025-12-05T11:54:53.861Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 05 11:54:53 compute-0 podman_exporter[201365]: ts=2025-12-05T11:54:53.861Z caller=handler.go:105 level=info collector=container
Dec 05 11:54:53 compute-0 podman[201350]: 2025-12-05 11:54:53.879780819 +0000 UTC m=+0.192945000 container start 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:54:53 compute-0 podman[201350]: podman_exporter
Dec 05 11:54:53 compute-0 systemd[1]: Starting Podman API Service...
Dec 05 11:54:53 compute-0 systemd[1]: Started Podman API Service.
Dec 05 11:54:53 compute-0 systemd[1]: Started podman_exporter container.
Dec 05 11:54:53 compute-0 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 05 11:54:53 compute-0 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="Setting parallel job count to 25"
Dec 05 11:54:53 compute-0 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="Using sqlite as database backend"
Dec 05 11:54:53 compute-0 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 05 11:54:53 compute-0 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 05 11:54:53 compute-0 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec 05 11:54:53 compute-0 sudo[201308]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:53 compute-0 podman[201376]: @ - - [05/Dec/2025:11:54:53 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 05 11:54:53 compute-0 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 11:54:53 compute-0 podman[201375]: 2025-12-05 11:54:53.957866601 +0000 UTC m=+0.057964885 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 11:54:53 compute-0 systemd[1]: 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6-3da900843a1ee59b.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 11:54:53 compute-0 systemd[1]: 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6-3da900843a1ee59b.service: Failed with result 'exit-code'.
Dec 05 11:54:53 compute-0 podman[201376]: @ - - [05/Dec/2025:11:54:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19566 "" "Go-http-client/1.1"
Dec 05 11:54:53 compute-0 podman_exporter[201365]: ts=2025-12-05T11:54:53.978Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 05 11:54:53 compute-0 podman_exporter[201365]: ts=2025-12-05T11:54:53.979Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 05 11:54:53 compute-0 podman_exporter[201365]: ts=2025-12-05T11:54:53.979Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 05 11:54:54 compute-0 sudo[201560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cttvnqxigktbswjwckkjyhbycycboxse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935694.1140034-740-100976750925734/AnsiballZ_systemd.py'
Dec 05 11:54:54 compute-0 sudo[201560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:54 compute-0 python3.9[201562]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:54:54 compute-0 systemd[1]: Stopping podman_exporter container...
Dec 05 11:54:54 compute-0 podman[201376]: @ - - [05/Dec/2025:11:54:53 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Dec 05 11:54:54 compute-0 systemd[1]: libpod-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope: Deactivated successfully.
Dec 05 11:54:54 compute-0 podman[201566]: 2025-12-05 11:54:54.903684116 +0000 UTC m=+0.062539926 container died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:54:54 compute-0 systemd[1]: 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6-3da900843a1ee59b.timer: Deactivated successfully.
Dec 05 11:54:54 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.
Dec 05 11:54:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6-userdata-shm.mount: Deactivated successfully.
Dec 05 11:54:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af-merged.mount: Deactivated successfully.
Dec 05 11:54:55 compute-0 podman[201566]: 2025-12-05 11:54:55.295206256 +0000 UTC m=+0.454062026 container cleanup 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 11:54:55 compute-0 podman[201566]: podman_exporter
Dec 05 11:54:55 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 05 11:54:55 compute-0 podman[201595]: podman_exporter
Dec 05 11:54:55 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 05 11:54:55 compute-0 systemd[1]: Stopped podman_exporter container.
Dec 05 11:54:55 compute-0 systemd[1]: Starting podman_exporter container...
Dec 05 11:54:55 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:54:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 11:54:55 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.
Dec 05 11:54:55 compute-0 podman[201608]: 2025-12-05 11:54:55.522848202 +0000 UTC m=+0.119051259 container init 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:54:55 compute-0 podman_exporter[201623]: ts=2025-12-05T11:54:55.538Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 05 11:54:55 compute-0 podman_exporter[201623]: ts=2025-12-05T11:54:55.538Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 05 11:54:55 compute-0 podman_exporter[201623]: ts=2025-12-05T11:54:55.538Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 05 11:54:55 compute-0 podman_exporter[201623]: ts=2025-12-05T11:54:55.538Z caller=handler.go:105 level=info collector=container
Dec 05 11:54:55 compute-0 podman[201376]: @ - - [05/Dec/2025:11:54:55 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 05 11:54:55 compute-0 podman[201376]: time="2025-12-05T11:54:55Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 11:54:55 compute-0 podman[201608]: 2025-12-05 11:54:55.563959653 +0000 UTC m=+0.160162690 container start 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 11:54:55 compute-0 podman[201608]: podman_exporter
Dec 05 11:54:55 compute-0 podman[201376]: @ - - [05/Dec/2025:11:54:55 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19568 "" "Go-http-client/1.1"
Dec 05 11:54:55 compute-0 systemd[1]: Started podman_exporter container.
Dec 05 11:54:55 compute-0 podman_exporter[201623]: ts=2025-12-05T11:54:55.573Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 05 11:54:55 compute-0 podman_exporter[201623]: ts=2025-12-05T11:54:55.573Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 05 11:54:55 compute-0 podman_exporter[201623]: ts=2025-12-05T11:54:55.574Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 05 11:54:55 compute-0 sudo[201560]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:55 compute-0 podman[201633]: 2025-12-05 11:54:55.620030012 +0000 UTC m=+0.047557766 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:54:56 compute-0 sudo[201805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bktjxzlsdljydppnckdvzaztlcpcsyyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935695.781892-748-183864846800840/AnsiballZ_stat.py'
Dec 05 11:54:56 compute-0 sudo[201805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:56 compute-0 python3.9[201807]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:54:56 compute-0 sudo[201805]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:56 compute-0 sudo[201928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyeuqqdtbrcxzygnyjncijmdmjtilrgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935695.781892-748-183864846800840/AnsiballZ_copy.py'
Dec 05 11:54:56 compute-0 sudo[201928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:56 compute-0 python3.9[201930]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935695.781892-748-183864846800840/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 11:54:56 compute-0 sudo[201928]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:57 compute-0 sudo[202080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bimfkusrskcrjbymzzmmtugvanyxcsve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935697.0279133-765-235737261553705/AnsiballZ_container_config_data.py'
Dec 05 11:54:57 compute-0 sudo[202080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:57 compute-0 python3.9[202082]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 05 11:54:57 compute-0 sudo[202080]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:57 compute-0 sudo[202232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elaxgsxlzuwxxskwagrmktfzaorfktli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935697.7052448-774-82634926018290/AnsiballZ_container_config_hash.py'
Dec 05 11:54:57 compute-0 sudo[202232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:58 compute-0 python3.9[202234]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 11:54:58 compute-0 sudo[202232]: pam_unix(sudo:session): session closed for user root
Dec 05 11:54:58 compute-0 sudo[202384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umjutollzgvlhltkcmdoqjejyfacykrr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935698.4926946-784-117283256555713/AnsiballZ_edpm_container_manage.py'
Dec 05 11:54:58 compute-0 sudo[202384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:54:59 compute-0 python3[202386]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 11:55:02 compute-0 podman[202399]: 2025-12-05 11:55:02.263944373 +0000 UTC m=+3.184843050 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 05 11:55:02 compute-0 podman[202494]: 2025-12-05 11:55:02.422867225 +0000 UTC m=+0.064128633 container create 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Dec 05 11:55:02 compute-0 podman[202494]: 2025-12-05 11:55:02.386933763 +0000 UTC m=+0.028195221 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 05 11:55:02 compute-0 python3[202386]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z 
quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 05 11:55:02 compute-0 sudo[202384]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:55:03.000 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:55:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:55:03.001 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:55:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:55:03.001 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:55:03 compute-0 sudo[202682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnuuwzdseodyabofdcmtcpqrbhgsdxki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935702.7256124-792-46911863172159/AnsiballZ_stat.py'
Dec 05 11:55:03 compute-0 sudo[202682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:03 compute-0 python3.9[202684]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:55:03 compute-0 sudo[202682]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:03 compute-0 sudo[202836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yozcgomejhzmssohkfkuypgkxbcmkekg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935703.4681785-801-204914686789897/AnsiballZ_file.py'
Dec 05 11:55:03 compute-0 sudo[202836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:03 compute-0 python3.9[202838]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:04 compute-0 sudo[202836]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:04 compute-0 sudo[203006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kypnblhdkqjmuqdbohlwflzzqxzfgwue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935704.0704522-801-263071331169123/AnsiballZ_copy.py'
Dec 05 11:55:04 compute-0 podman[202961]: 2025-12-05 11:55:04.509120473 +0000 UTC m=+0.068126417 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 11:55:04 compute-0 sudo[203006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:04 compute-0 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-2261b38cdfea5d2d.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 11:55:04 compute-0 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-2261b38cdfea5d2d.service: Failed with result 'exit-code'.
Dec 05 11:55:04 compute-0 python3.9[203008]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935704.0704522-801-263071331169123/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:04 compute-0 sudo[203006]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:04 compute-0 sudo[203082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hniswzyxoystyztsybysppikpkutdirs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935704.0704522-801-263071331169123/AnsiballZ_systemd.py'
Dec 05 11:55:04 compute-0 sudo[203082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:05 compute-0 python3.9[203084]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 11:55:05 compute-0 systemd[1]: Reloading.
Dec 05 11:55:05 compute-0 systemd-rc-local-generator[203110]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:55:05 compute-0 systemd-sysv-generator[203113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:55:05 compute-0 sudo[203082]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:05 compute-0 sudo[203192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsquwvzxcroevxnwyahotgaoeptxmbex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935704.0704522-801-263071331169123/AnsiballZ_systemd.py'
Dec 05 11:55:05 compute-0 sudo[203192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:06 compute-0 python3.9[203194]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 11:55:06 compute-0 systemd[1]: Reloading.
Dec 05 11:55:06 compute-0 systemd-rc-local-generator[203219]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 11:55:06 compute-0 systemd-sysv-generator[203224]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 11:55:06 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 05 11:55:06 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:55:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 05 11:55:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 11:55:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 11:55:06 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.
Dec 05 11:55:06 compute-0 podman[203235]: 2025-12-05 11:55:06.686415704 +0000 UTC m=+0.139106775 container init 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *bridge.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *coverage.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *datapath.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *iface.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *memory.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *ovnnorthd.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *ovn.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *ovsdbserver.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *pmd_perf.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *pmd_rxq.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *vswitch.Collector
Dec 05 11:55:06 compute-0 openstack_network_exporter[203250]: NOTICE  11:55:06 main.go:76: listening on https://:9105/metrics
Dec 05 11:55:06 compute-0 podman[203235]: 2025-12-05 11:55:06.711075011 +0000 UTC m=+0.163766092 container start 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 05 11:55:06 compute-0 podman[203235]: openstack_network_exporter
Dec 05 11:55:06 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 05 11:55:06 compute-0 sudo[203192]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:06 compute-0 podman[203260]: 2025-12-05 11:55:06.801203609 +0000 UTC m=+0.076417155 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Dec 05 11:55:07 compute-0 sudo[203433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-govxrikoguteewiqabqhnxiittpwxwcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935707.1221285-825-52334970233543/AnsiballZ_systemd.py'
Dec 05 11:55:07 compute-0 sudo[203433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:07 compute-0 python3.9[203435]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 11:55:07 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Dec 05 11:55:08 compute-0 systemd[1]: libpod-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope: Deactivated successfully.
Dec 05 11:55:08 compute-0 podman[203440]: 2025-12-05 11:55:08.209467991 +0000 UTC m=+0.454671504 container died 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container)
Dec 05 11:55:08 compute-0 systemd[1]: 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7-28ff159ea7c4e12e.timer: Deactivated successfully.
Dec 05 11:55:08 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.
Dec 05 11:55:08 compute-0 podman[203439]: 2025-12-05 11:55:08.330586499 +0000 UTC m=+0.578864071 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 11:55:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7-userdata-shm.mount: Deactivated successfully.
Dec 05 11:55:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b-merged.mount: Deactivated successfully.
Dec 05 11:55:09 compute-0 podman[203440]: 2025-12-05 11:55:09.685980542 +0000 UTC m=+1.931184065 container cleanup 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 11:55:09 compute-0 podman[203440]: openstack_network_exporter
Dec 05 11:55:09 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 05 11:55:09 compute-0 podman[203486]: openstack_network_exporter
Dec 05 11:55:09 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 05 11:55:09 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Dec 05 11:55:09 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 05 11:55:09 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:55:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 05 11:55:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 11:55:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 11:55:09 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.
Dec 05 11:55:09 compute-0 podman[203499]: 2025-12-05 11:55:09.902334804 +0000 UTC m=+0.117048812 container init 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *bridge.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *coverage.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *datapath.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *iface.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *memory.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *ovnnorthd.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *ovn.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *ovsdbserver.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *pmd_perf.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *pmd_rxq.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *vswitch.Collector
Dec 05 11:55:09 compute-0 openstack_network_exporter[203514]: NOTICE  11:55:09 main.go:76: listening on https://:9105/metrics
Dec 05 11:55:09 compute-0 podman[203499]: 2025-12-05 11:55:09.933586391 +0000 UTC m=+0.148300419 container start 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41)
Dec 05 11:55:09 compute-0 podman[203499]: openstack_network_exporter
Dec 05 11:55:09 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 05 11:55:09 compute-0 sudo[203433]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:10 compute-0 podman[203524]: 2025-12-05 11:55:10.040849031 +0000 UTC m=+0.093178616 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Dec 05 11:55:10 compute-0 sudo[203695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qosxoagvhlxlobfqlmidgzxlyjtjqkwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935710.1341534-833-52951915233673/AnsiballZ_find.py'
Dec 05 11:55:10 compute-0 sudo[203695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:10 compute-0 python3.9[203697]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 11:55:10 compute-0 sudo[203695]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:11 compute-0 sudo[203847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmmxrfosagheuekycjpihfnmtyjzpijh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935710.9583933-843-133430322676382/AnsiballZ_podman_container_info.py'
Dec 05 11:55:11 compute-0 sudo[203847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:11 compute-0 python3.9[203849]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 05 11:55:11 compute-0 sudo[203847]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:12 compute-0 sudo[204012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfepkplkqjegmhvpfmskkjefluhdbniq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935711.9229996-851-233972078358640/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:12 compute-0 sudo[204012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:12 compute-0 python3.9[204014]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:12 compute-0 systemd[1]: Started libpod-conmon-6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.scope.
Dec 05 11:55:12 compute-0 podman[204015]: 2025-12-05 11:55:12.70045315 +0000 UTC m=+0.089806839 container exec 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 11:55:12 compute-0 podman[204015]: 2025-12-05 11:55:12.730783881 +0000 UTC m=+0.120137540 container exec_died 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 11:55:12 compute-0 systemd[1]: libpod-conmon-6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.scope: Deactivated successfully.
Dec 05 11:55:12 compute-0 sudo[204012]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:13 compute-0 sudo[204194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zldhoyvxnzclqzoujpdymwmfgaiwuvrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935712.9537218-859-123493813216453/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:13 compute-0 sudo[204194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:13 compute-0 python3.9[204196]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:13 compute-0 systemd[1]: Started libpod-conmon-6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.scope.
Dec 05 11:55:13 compute-0 podman[204197]: 2025-12-05 11:55:13.888105938 +0000 UTC m=+0.263189287 container exec 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 05 11:55:14 compute-0 podman[204217]: 2025-12-05 11:55:14.002306187 +0000 UTC m=+0.096751419 container exec_died 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:55:14 compute-0 podman[204197]: 2025-12-05 11:55:14.152802068 +0000 UTC m=+0.527885357 container exec_died 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 11:55:14 compute-0 systemd[1]: libpod-conmon-6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.scope: Deactivated successfully.
Dec 05 11:55:14 compute-0 sudo[204194]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:15 compute-0 sudo[204392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvbneojivyepxfmstvfryabgdrogvmgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935714.677735-867-279121544578531/AnsiballZ_file.py'
Dec 05 11:55:15 compute-0 sudo[204392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:15 compute-0 podman[204354]: 2025-12-05 11:55:15.051031516 +0000 UTC m=+0.098694954 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 11:55:15 compute-0 python3.9[204399]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:15 compute-0 sudo[204392]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:15 compute-0 sudo[204569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqzscxycdhrtxnfuiyqpnbnsidqnsytj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935715.4657307-876-237606810445712/AnsiballZ_podman_container_info.py'
Dec 05 11:55:15 compute-0 sudo[204569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:15 compute-0 podman[204530]: 2025-12-05 11:55:15.841973715 +0000 UTC m=+0.134031869 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:55:15 compute-0 python3.9[204577]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 05 11:55:16 compute-0 sudo[204569]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:16 compute-0 sudo[204747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mblrrzakmfpijilwplncqzfiidtldftv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935716.204306-884-115464588720530/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:16 compute-0 sudo[204747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:16 compute-0 python3.9[204749]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:16 compute-0 systemd[1]: Started libpod-conmon-de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.scope.
Dec 05 11:55:17 compute-0 podman[204750]: 2025-12-05 11:55:17.01109178 +0000 UTC m=+0.236450339 container exec de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:55:17 compute-0 podman[204770]: 2025-12-05 11:55:17.165265417 +0000 UTC m=+0.127221994 container exec_died de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 11:55:17 compute-0 podman[204750]: 2025-12-05 11:55:17.31235538 +0000 UTC m=+0.537713869 container exec_died de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 11:55:17 compute-0 systemd[1]: libpod-conmon-de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.scope: Deactivated successfully.
Dec 05 11:55:17 compute-0 sudo[204747]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:17 compute-0 sudo[204932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqqkpfseukoqprfiairsdsgqolorhvml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935717.6368582-892-265799865059615/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:17 compute-0 sudo[204932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:18 compute-0 python3.9[204934]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:18 compute-0 systemd[1]: Started libpod-conmon-de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.scope.
Dec 05 11:55:18 compute-0 podman[204935]: 2025-12-05 11:55:18.338914634 +0000 UTC m=+0.219176904 container exec de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 11:55:18 compute-0 podman[204955]: 2025-12-05 11:55:18.40321015 +0000 UTC m=+0.051747127 container exec_died de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 05 11:55:18 compute-0 podman[204935]: 2025-12-05 11:55:18.471756238 +0000 UTC m=+0.352018468 container exec_died de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 11:55:18 compute-0 systemd[1]: libpod-conmon-de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.scope: Deactivated successfully.
Dec 05 11:55:18 compute-0 sudo[204932]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:19 compute-0 sudo[205117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-przkdbdepphqbhrsuymgxjebsmzbnwib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935718.938837-900-81949731855661/AnsiballZ_file.py'
Dec 05 11:55:19 compute-0 sudo[205117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:19 compute-0 python3.9[205119]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:19 compute-0 sudo[205117]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:19 compute-0 sudo[205269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drcuwkftwwwohrgfbqoomqiwietzdens ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935719.671867-909-41760144941255/AnsiballZ_podman_container_info.py'
Dec 05 11:55:19 compute-0 sudo[205269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:20 compute-0 python3.9[205271]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 05 11:55:20 compute-0 sudo[205269]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:20 compute-0 sudo[205435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifwjzgytsvetlxfnuldvcgzawvhyxnqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935720.4319062-917-163056645375997/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:20 compute-0 sudo[205435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:20 compute-0 python3.9[205437]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:21 compute-0 systemd[1]: Started libpod-conmon-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope.
Dec 05 11:55:21 compute-0 podman[205438]: 2025-12-05 11:55:21.034458614 +0000 UTC m=+0.081655175 container exec 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 11:55:21 compute-0 podman[205438]: 2025-12-05 11:55:21.068492001 +0000 UTC m=+0.115688542 container exec_died 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 11:55:21 compute-0 systemd[1]: libpod-conmon-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope: Deactivated successfully.
Dec 05 11:55:21 compute-0 sudo[205435]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:21 compute-0 podman[205467]: 2025-12-05 11:55:21.198576586 +0000 UTC m=+0.065452450 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 11:55:21 compute-0 sudo[205635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uusbovxbqpuloqtolmzswfldpntfdzbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935721.2915916-925-133221650115013/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:21 compute-0 sudo[205635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:21 compute-0 python3.9[205637]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:21 compute-0 systemd[1]: Started libpod-conmon-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope.
Dec 05 11:55:22 compute-0 podman[205638]: 2025-12-05 11:55:21.999646366 +0000 UTC m=+0.094526085 container exec 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:55:22 compute-0 podman[205658]: 2025-12-05 11:55:22.079313033 +0000 UTC m=+0.063955347 container exec_died 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 11:55:22 compute-0 podman[205638]: 2025-12-05 11:55:22.086251822 +0000 UTC m=+0.181131541 container exec_died 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:55:22 compute-0 systemd[1]: libpod-conmon-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope: Deactivated successfully.
Dec 05 11:55:22 compute-0 sudo[205635]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:22 compute-0 sudo[205820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzhktxtroesovgqnozfdcjivohkajtje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935722.2971554-933-77646648562147/AnsiballZ_file.py'
Dec 05 11:55:22 compute-0 sudo[205820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:22 compute-0 python3.9[205822]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:22 compute-0 sudo[205820]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:23 compute-0 sudo[205972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwbtlhwzohaghqzqakhphlewxeftkxma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935722.9998395-942-94626560925407/AnsiballZ_podman_container_info.py'
Dec 05 11:55:23 compute-0 sudo[205972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:23 compute-0 python3.9[205974]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 05 11:55:23 compute-0 sudo[205972]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:23 compute-0 sudo[206137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilnyhklfgygpvfwvbgaofkctjrkelpgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935723.722581-950-135932044145486/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:23 compute-0 sudo[206137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:24 compute-0 python3.9[206139]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:24 compute-0 systemd[1]: Started libpod-conmon-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope.
Dec 05 11:55:24 compute-0 podman[206140]: 2025-12-05 11:55:24.323560796 +0000 UTC m=+0.113585132 container exec 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
config_id=edpm)
Dec 05 11:55:24 compute-0 podman[206140]: 2025-12-05 11:55:24.355180164 +0000 UTC m=+0.145204480 container exec_died 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 05 11:55:24 compute-0 systemd[1]: libpod-conmon-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope: Deactivated successfully.
Dec 05 11:55:24 compute-0 sudo[206137]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:24 compute-0 sudo[206322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjnvsuorvztnyaaalnfzitrmjoglmhec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935724.5861874-958-198344575281709/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:24 compute-0 sudo[206322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:25 compute-0 python3.9[206324]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:25 compute-0 systemd[1]: Started libpod-conmon-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope.
Dec 05 11:55:25 compute-0 podman[206325]: 2025-12-05 11:55:25.208579556 +0000 UTC m=+0.096263455 container exec 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm)
Dec 05 11:55:25 compute-0 podman[206325]: 2025-12-05 11:55:25.243419146 +0000 UTC m=+0.131102955 container exec_died 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 11:55:25 compute-0 systemd[1]: libpod-conmon-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope: Deactivated successfully.
Dec 05 11:55:25 compute-0 sudo[206322]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:25 compute-0 sudo[206518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcfqlbomhhdynjubpawnroibdhpgricd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935725.4810216-966-49725721548238/AnsiballZ_file.py'
Dec 05 11:55:25 compute-0 sudo[206518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:25 compute-0 podman[206478]: 2025-12-05 11:55:25.827490175 +0000 UTC m=+0.063377861 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:55:26 compute-0 python3.9[206530]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:26 compute-0 sudo[206518]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:26 compute-0 sudo[206680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfpqqtqqzlzfpwgatanfzqawikbytpdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935726.2692757-975-236421461975412/AnsiballZ_podman_container_info.py'
Dec 05 11:55:26 compute-0 sudo[206680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:26 compute-0 python3.9[206682]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec 05 11:55:26 compute-0 sudo[206680]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:27 compute-0 sudo[206845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tskwmhvpgopbqbcdhoiiljcfemhohfpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935727.1465666-983-185491069146960/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:27 compute-0 sudo[206845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:27 compute-0 python3.9[206847]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:27 compute-0 systemd[1]: Started libpod-conmon-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope.
Dec 05 11:55:27 compute-0 podman[206848]: 2025-12-05 11:55:27.763672683 +0000 UTC m=+0.092244689 container exec 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 11:55:27 compute-0 podman[206848]: 2025-12-05 11:55:27.800952994 +0000 UTC m=+0.129525020 container exec_died 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 11:55:27 compute-0 systemd[1]: libpod-conmon-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope: Deactivated successfully.
Dec 05 11:55:27 compute-0 sudo[206845]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:28 compute-0 sudo[207029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdghcrezsiuglvjshewoxvaetuhbjpdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935728.0073442-991-83520996522803/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:28 compute-0 sudo[207029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:28 compute-0 python3.9[207031]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:29 compute-0 systemd[1]: Started libpod-conmon-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope.
Dec 05 11:55:29 compute-0 podman[207032]: 2025-12-05 11:55:29.056243734 +0000 UTC m=+0.555236362 container exec 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 11:55:29 compute-0 podman[207052]: 2025-12-05 11:55:29.151239602 +0000 UTC m=+0.076054495 container exec_died 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 11:55:29 compute-0 podman[207032]: 2025-12-05 11:55:29.157148681 +0000 UTC m=+0.656141279 container exec_died 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 11:55:29 compute-0 systemd[1]: libpod-conmon-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope: Deactivated successfully.
Dec 05 11:55:29 compute-0 sudo[207029]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:29 compute-0 sudo[207214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsgyblzcfgnmhcskajubxipblsetigxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935729.3627474-999-239752064311906/AnsiballZ_file.py'
Dec 05 11:55:29 compute-0 sudo[207214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:29 compute-0 python3.9[207216]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:29 compute-0 sudo[207214]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:30 compute-0 sudo[207366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erwitthhuqplptloblbjpafxaggyypvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935730.0289004-1008-129971558002114/AnsiballZ_podman_container_info.py'
Dec 05 11:55:30 compute-0 sudo[207366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:30 compute-0 nova_compute[187208]: 2025-12-05 11:55:30.360 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:30 compute-0 nova_compute[187208]: 2025-12-05 11:55:30.384 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:30 compute-0 nova_compute[187208]: 2025-12-05 11:55:30.384 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:30 compute-0 python3.9[207368]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 05 11:55:30 compute-0 sudo[207366]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 11:55:31 compute-0 sudo[207531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwigenzuspsqkzbaqsnsbhajmwtypesc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935730.8107862-1016-222278645672341/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:31 compute-0 sudo[207531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.077 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.078 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.078 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.078 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.102 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.102 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.103 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.103 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.262 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.263 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5948MB free_disk=73.37312698364258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.263 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.263 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:55:31 compute-0 python3.9[207533]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.489 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.489 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.507 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.526 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.527 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 11:55:31 compute-0 nova_compute[187208]: 2025-12-05 11:55:31.527 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:55:31 compute-0 systemd[1]: Started libpod-conmon-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope.
Dec 05 11:55:31 compute-0 podman[207534]: 2025-12-05 11:55:31.616415608 +0000 UTC m=+0.322661475 container exec 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 11:55:31 compute-0 podman[207554]: 2025-12-05 11:55:31.787325195 +0000 UTC m=+0.151016327 container exec_died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 11:55:31 compute-0 podman[207534]: 2025-12-05 11:55:31.908089942 +0000 UTC m=+0.614335799 container exec_died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 11:55:31 compute-0 systemd[1]: libpod-conmon-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope: Deactivated successfully.
Dec 05 11:55:32 compute-0 sudo[207531]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:32 compute-0 nova_compute[187208]: 2025-12-05 11:55:32.509 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:32 compute-0 nova_compute[187208]: 2025-12-05 11:55:32.509 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:55:32 compute-0 nova_compute[187208]: 2025-12-05 11:55:32.509 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 11:55:32 compute-0 sudo[207716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgmepcpmiloehpyauobouudezhukelfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935732.6345913-1024-38708032564478/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:32 compute-0 sudo[207716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:33 compute-0 python3.9[207718]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:33 compute-0 systemd[1]: Started libpod-conmon-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope.
Dec 05 11:55:33 compute-0 podman[207719]: 2025-12-05 11:55:33.419169877 +0000 UTC m=+0.282574494 container exec 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:55:33 compute-0 podman[207738]: 2025-12-05 11:55:33.499269456 +0000 UTC m=+0.066296364 container exec_died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:55:33 compute-0 podman[207719]: 2025-12-05 11:55:33.505155925 +0000 UTC m=+0.368560502 container exec_died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 11:55:33 compute-0 systemd[1]: libpod-conmon-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope: Deactivated successfully.
Dec 05 11:55:33 compute-0 sudo[207716]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:34 compute-0 sudo[207900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpuowtkvwsgzyqgytydvbccxacirsgof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935733.6914322-1032-255093015674451/AnsiballZ_file.py'
Dec 05 11:55:34 compute-0 sudo[207900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:34 compute-0 python3.9[207902]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:34 compute-0 sudo[207900]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:34 compute-0 sudo[208063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttthvwypqemteaqxygwmhyjbvkuoeprv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935734.4372573-1041-221242082730090/AnsiballZ_podman_container_info.py'
Dec 05 11:55:34 compute-0 sudo[208063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:34 compute-0 podman[208026]: 2025-12-05 11:55:34.796386187 +0000 UTC m=+0.072866923 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:55:34 compute-0 python3.9[208068]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 05 11:55:35 compute-0 sudo[208063]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:35 compute-0 sudo[208236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmneegsohtbwcaswfkrfkozbnzjzbzsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935735.1933274-1049-255128746139667/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:35 compute-0 sudo[208236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:35 compute-0 python3.9[208238]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:35 compute-0 systemd[1]: Started libpod-conmon-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope.
Dec 05 11:55:35 compute-0 podman[208239]: 2025-12-05 11:55:35.853790076 +0000 UTC m=+0.110750811 container exec 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Dec 05 11:55:35 compute-0 podman[208239]: 2025-12-05 11:55:35.89224879 +0000 UTC m=+0.149209505 container exec_died 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Dec 05 11:55:35 compute-0 systemd[1]: libpod-conmon-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope: Deactivated successfully.
Dec 05 11:55:35 compute-0 sudo[208236]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:36 compute-0 sudo[208420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbhtqzqlqmgryucfhomotdbxvcpuzbvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935736.1025145-1057-256737987641710/AnsiballZ_podman_container_exec.py'
Dec 05 11:55:36 compute-0 sudo[208420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:36 compute-0 python3.9[208422]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 11:55:36 compute-0 systemd[1]: Started libpod-conmon-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope.
Dec 05 11:55:36 compute-0 podman[208423]: 2025-12-05 11:55:36.747171345 +0000 UTC m=+0.093539766 container exec 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm)
Dec 05 11:55:36 compute-0 podman[208423]: 2025-12-05 11:55:36.751888351 +0000 UTC m=+0.098256752 container exec_died 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 11:55:36 compute-0 sudo[208420]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:36 compute-0 systemd[1]: libpod-conmon-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope: Deactivated successfully.
Dec 05 11:55:37 compute-0 sudo[208604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiuxbzunghfvpezkiimyjccmsyrsnwvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935736.936122-1065-124129124359141/AnsiballZ_file.py'
Dec 05 11:55:37 compute-0 sudo[208604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:37 compute-0 python3.9[208606]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:37 compute-0 sudo[208604]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:37 compute-0 sudo[208756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqgjgzxjwqgcgzoqpxuppfxplatnbyua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935737.5908756-1074-142975822434163/AnsiballZ_file.py'
Dec 05 11:55:37 compute-0 sudo[208756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:38 compute-0 python3.9[208758]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:38 compute-0 sudo[208756]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:38 compute-0 sudo[208917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbmvfvmljltgulbfcewqlfskabfoezvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935738.2423933-1082-19824319945370/AnsiballZ_stat.py'
Dec 05 11:55:38 compute-0 sudo[208917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:38 compute-0 podman[208882]: 2025-12-05 11:55:38.616855834 +0000 UTC m=+0.079175264 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 11:55:38 compute-0 python3.9[208921]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:55:38 compute-0 sudo[208917]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:39 compute-0 sudo[209047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kryqcvyeutczodkpfbfsmhqejqrzzujb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935738.2423933-1082-19824319945370/AnsiballZ_copy.py'
Dec 05 11:55:39 compute-0 sudo[209047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:39 compute-0 python3.9[209049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935738.2423933-1082-19824319945370/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:39 compute-0 sudo[209047]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:39 compute-0 sudo[209199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-negrxmzqpxtoxghvpodnxawvkgerjuyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935739.6370895-1098-78713963184217/AnsiballZ_file.py'
Dec 05 11:55:39 compute-0 sudo[209199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:40 compute-0 python3.9[209201]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:40 compute-0 sudo[209199]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:40 compute-0 podman[209202]: 2025-12-05 11:55:40.22897386 +0000 UTC m=+0.070393573 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 05 11:55:40 compute-0 sudo[209372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlojurrfartzlcmodpkxnmxuelrbiike ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935740.269982-1106-236168172795011/AnsiballZ_stat.py'
Dec 05 11:55:40 compute-0 sudo[209372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:40 compute-0 python3.9[209374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:55:40 compute-0 sudo[209372]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:41 compute-0 sudo[209450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syoelqtjcsfaelfxcariybbesveksagf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935740.269982-1106-236168172795011/AnsiballZ_file.py'
Dec 05 11:55:41 compute-0 sudo[209450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:41 compute-0 python3.9[209452]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:41 compute-0 sudo[209450]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:41 compute-0 sudo[209602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjeglzfvgkcsjtloicdnfancccefmhhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935741.3919513-1118-104763369901463/AnsiballZ_stat.py'
Dec 05 11:55:41 compute-0 sudo[209602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:41 compute-0 python3.9[209604]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:55:41 compute-0 sudo[209602]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:42 compute-0 sudo[209680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uousbbhtswbzhwgraarqviayfxndfahk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935741.3919513-1118-104763369901463/AnsiballZ_file.py'
Dec 05 11:55:42 compute-0 sudo[209680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:42 compute-0 python3.9[209682]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.wz5evwrs recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:42 compute-0 sudo[209680]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:42 compute-0 sudo[209832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjwyzemiznosxpuumsfrhrlnotrqzztq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935742.5072646-1130-248373884736353/AnsiballZ_stat.py'
Dec 05 11:55:42 compute-0 sudo[209832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:43 compute-0 python3.9[209834]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:55:43 compute-0 sudo[209832]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:43 compute-0 sudo[209910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwwsfyjkluyozewlwkzavngodvhinceh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935742.5072646-1130-248373884736353/AnsiballZ_file.py'
Dec 05 11:55:43 compute-0 sudo[209910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:43 compute-0 python3.9[209912]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:43 compute-0 sudo[209910]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:44 compute-0 sudo[210062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sskxztdphillbgjgaakamryxmhusnueh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935743.7580314-1143-225176146704320/AnsiballZ_command.py'
Dec 05 11:55:44 compute-0 sudo[210062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:44 compute-0 python3.9[210064]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:55:44 compute-0 sudo[210062]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:44 compute-0 sudo[210215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obblkihnnjtzucslbwlxncxamxfjfhxc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764935744.4305327-1151-21094094697110/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 11:55:44 compute-0 sudo[210215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:45 compute-0 python3[210217]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 11:55:45 compute-0 sudo[210215]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:45 compute-0 podman[210218]: 2025-12-05 11:55:45.208155475 +0000 UTC m=+0.060630932 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 11:55:45 compute-0 sudo[210390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbrkgfrrgtstropapdyskbzuovlpjolu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935745.3220992-1159-25633837200155/AnsiballZ_stat.py'
Dec 05 11:55:45 compute-0 sudo[210390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:45 compute-0 python3.9[210392]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:55:45 compute-0 sudo[210390]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:46 compute-0 sudo[210481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aozcmeepavflsmujozqetkuifeiysfhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935745.3220992-1159-25633837200155/AnsiballZ_file.py'
Dec 05 11:55:46 compute-0 sudo[210481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:46 compute-0 podman[210442]: 2025-12-05 11:55:46.147409491 +0000 UTC m=+0.105655485 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 11:55:46 compute-0 python3.9[210485]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:46 compute-0 sudo[210481]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:46 compute-0 sudo[210647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odtbgbgtclmgvhdrlmkaxvfnxabppmoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935746.4887066-1171-250989049665428/AnsiballZ_stat.py'
Dec 05 11:55:46 compute-0 sudo[210647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:47 compute-0 python3.9[210649]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:55:47 compute-0 sudo[210647]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:47 compute-0 sudo[210725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdgcpqpojgimvqgbqhmiwpiuafqawwix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935746.4887066-1171-250989049665428/AnsiballZ_file.py'
Dec 05 11:55:47 compute-0 sudo[210725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:47 compute-0 python3.9[210727]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:47 compute-0 sudo[210725]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:47 compute-0 sudo[210877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvndsqvzopztjoyaremyfapannbcscrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935747.6763558-1183-130845027700749/AnsiballZ_stat.py'
Dec 05 11:55:47 compute-0 sudo[210877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:48 compute-0 python3.9[210879]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:55:48 compute-0 sudo[210877]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:48 compute-0 sudo[210955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txnpkdvfruwqjokgbbjjbwskqqyxeflo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935747.6763558-1183-130845027700749/AnsiballZ_file.py'
Dec 05 11:55:48 compute-0 sudo[210955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:48 compute-0 python3.9[210957]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:48 compute-0 sudo[210955]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:49 compute-0 sudo[211107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ingmrlouwixwjzdylvmrvqguwyjomhgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935748.8104177-1195-75346321138442/AnsiballZ_stat.py'
Dec 05 11:55:49 compute-0 sudo[211107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:49 compute-0 python3.9[211109]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:55:49 compute-0 sudo[211107]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:49 compute-0 sudo[211185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guoqvarbmqfpbehbdfggijytqmtsxslc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935748.8104177-1195-75346321138442/AnsiballZ_file.py'
Dec 05 11:55:49 compute-0 sudo[211185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:50 compute-0 python3.9[211187]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:50 compute-0 sudo[211185]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:50 compute-0 sudo[211337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpvngxmgxwtayfakorsshhkktqauxwwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935750.2456598-1207-140936292841325/AnsiballZ_stat.py'
Dec 05 11:55:50 compute-0 sudo[211337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:50 compute-0 python3.9[211339]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 11:55:50 compute-0 sudo[211337]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:51 compute-0 sudo[211462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdbluitlpjdvugemnkgygfryvqcxaegf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935750.2456598-1207-140936292841325/AnsiballZ_copy.py'
Dec 05 11:55:51 compute-0 sudo[211462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:51 compute-0 python3.9[211464]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935750.2456598-1207-140936292841325/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:51 compute-0 sudo[211462]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:51 compute-0 sudo[211624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miszagqmjpdbiicdqhombmpocdjtdlzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935751.5972679-1222-118056769798145/AnsiballZ_file.py'
Dec 05 11:55:51 compute-0 sudo[211624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:51 compute-0 podman[211588]: 2025-12-05 11:55:51.904886413 +0000 UTC m=+0.068760087 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 11:55:52 compute-0 python3.9[211632]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:52 compute-0 sudo[211624]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:52 compute-0 sudo[211785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzzhcqbfisisjzthrnydktggsowsxkfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935752.268227-1230-129525583302778/AnsiballZ_command.py'
Dec 05 11:55:52 compute-0 sudo[211785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:52 compute-0 python3.9[211787]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:55:52 compute-0 sudo[211785]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:53 compute-0 sudo[211940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnoakmkgzskkflewubqmawptigaxzyxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935752.9721336-1238-55954395265989/AnsiballZ_blockinfile.py'
Dec 05 11:55:53 compute-0 sudo[211940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:53 compute-0 python3.9[211942]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:53 compute-0 sudo[211940]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:54 compute-0 sudo[212092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grrmydywctkbdypkiiiutdxyslugnjjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935753.8842366-1247-71115123677910/AnsiballZ_command.py'
Dec 05 11:55:54 compute-0 sudo[212092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:54 compute-0 python3.9[212094]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:55:54 compute-0 sudo[212092]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:54 compute-0 sudo[212245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyqiasldtdpuvfhaxwpygpggjbqurevn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935754.501442-1255-173163597563194/AnsiballZ_stat.py'
Dec 05 11:55:54 compute-0 sudo[212245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:54 compute-0 python3.9[212247]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 11:55:55 compute-0 sudo[212245]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:55 compute-0 sudo[212399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbogruogtwpzrfcurjgrucnjbwkddpnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935755.1839073-1263-65093837435234/AnsiballZ_command.py'
Dec 05 11:55:55 compute-0 sudo[212399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:55 compute-0 python3.9[212401]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 11:55:55 compute-0 sudo[212399]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:56 compute-0 sudo[212569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dajzpmtpaoxxollasowzkvekbuindsqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764935755.8193126-1271-97198780358758/AnsiballZ_file.py'
Dec 05 11:55:56 compute-0 sudo[212569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 11:55:56 compute-0 podman[212528]: 2025-12-05 11:55:56.129065749 +0000 UTC m=+0.074218706 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 11:55:56 compute-0 python3.9[212580]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 11:55:56 compute-0 sudo[212569]: pam_unix(sudo:session): session closed for user root
Dec 05 11:55:56 compute-0 sshd-session[187533]: Connection closed by 192.168.122.30 port 40106
Dec 05 11:55:56 compute-0 sshd-session[187530]: pam_unix(sshd:session): session closed for user zuul
Dec 05 11:55:56 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Dec 05 11:55:56 compute-0 systemd[1]: session-25.scope: Consumed 1min 41.272s CPU time.
Dec 05 11:55:56 compute-0 systemd-logind[792]: Session 25 logged out. Waiting for processes to exit.
Dec 05 11:55:56 compute-0 systemd-logind[792]: Removed session 25.
Dec 05 11:56:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:56:03.002 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:56:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:56:03.002 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:56:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:56:03.002 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:56:05 compute-0 podman[212605]: 2025-12-05 11:56:05.201764718 +0000 UTC m=+0.057853283 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 05 11:56:09 compute-0 podman[212625]: 2025-12-05 11:56:09.208359998 +0000 UTC m=+0.057988166 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:56:11 compute-0 podman[212645]: 2025-12-05 11:56:11.196753482 +0000 UTC m=+0.055373881 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container)
Dec 05 11:56:16 compute-0 podman[212666]: 2025-12-05 11:56:16.21017849 +0000 UTC m=+0.068929152 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 11:56:16 compute-0 podman[212691]: 2025-12-05 11:56:16.350828434 +0000 UTC m=+0.110372270 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 11:56:17 compute-0 sshd-session[212719]: banner exchange: Connection from 40.124.114.161 port 38142: invalid format
Dec 05 11:56:22 compute-0 podman[212720]: 2025-12-05 11:56:22.206146894 +0000 UTC m=+0.061947911 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 11:56:27 compute-0 podman[212740]: 2025-12-05 11:56:27.244682047 +0000 UTC m=+0.087930262 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:56:27 compute-0 sshd-session[212717]: Connection closed by 40.124.114.161 port 38140 [preauth]
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.081 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.108 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.108 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.108 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.108 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.311 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.313 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6011MB free_disk=73.37332534790039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.313 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.314 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.405 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.406 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.429 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.446 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.449 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 11:56:31 compute-0 nova_compute[187208]: 2025-12-05 11:56:31.449 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:56:32 compute-0 nova_compute[187208]: 2025-12-05 11:56:32.428 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:56:32 compute-0 nova_compute[187208]: 2025-12-05 11:56:32.429 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:56:33 compute-0 nova_compute[187208]: 2025-12-05 11:56:33.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:56:33 compute-0 nova_compute[187208]: 2025-12-05 11:56:33.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:56:34 compute-0 nova_compute[187208]: 2025-12-05 11:56:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:56:34 compute-0 nova_compute[187208]: 2025-12-05 11:56:34.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:56:34 compute-0 nova_compute[187208]: 2025-12-05 11:56:34.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 11:56:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:56:35.203 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 11:56:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:56:35.204 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 11:56:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:56:35.205 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:56:36 compute-0 podman[212764]: 2025-12-05 11:56:36.232893036 +0000 UTC m=+0.089602470 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:56:40 compute-0 podman[212784]: 2025-12-05 11:56:40.205089342 +0000 UTC m=+0.063791154 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 05 11:56:42 compute-0 podman[212802]: 2025-12-05 11:56:42.19081911 +0000 UTC m=+0.049666596 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, 
config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 11:56:47 compute-0 podman[212824]: 2025-12-05 11:56:47.201307971 +0000 UTC m=+0.055830784 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 11:56:47 compute-0 podman[212825]: 2025-12-05 11:56:47.264508187 +0000 UTC m=+0.114205670 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 11:56:53 compute-0 podman[212870]: 2025-12-05 11:56:53.209339159 +0000 UTC m=+0.061615180 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 11:56:58 compute-0 podman[212890]: 2025-12-05 11:56:58.21934809 +0000 UTC m=+0.067210328 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:57:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:57:03.003 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:57:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:57:03.003 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:57:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:57:03.003 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:57:07 compute-0 podman[212915]: 2025-12-05 11:57:07.228006261 +0000 UTC m=+0.074484283 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, 
config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 05 11:57:11 compute-0 podman[212935]: 2025-12-05 11:57:11.232475343 +0000 UTC m=+0.086815280 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 11:57:13 compute-0 podman[212955]: 2025-12-05 11:57:13.205846426 +0000 UTC m=+0.064495501 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=)
Dec 05 11:57:18 compute-0 podman[212978]: 2025-12-05 11:57:18.206814493 +0000 UTC m=+0.058588925 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 11:57:18 compute-0 podman[212979]: 2025-12-05 11:57:18.236624154 +0000 UTC m=+0.089641112 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 11:57:24 compute-0 podman[213026]: 2025-12-05 11:57:24.219060959 +0000 UTC m=+0.069736219 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Dec 05 11:57:29 compute-0 podman[213046]: 2025-12-05 11:57:29.217710588 +0000 UTC m=+0.065266942 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 11:57:31 compute-0 nova_compute[187208]: 2025-12-05 11:57:31.057 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:31 compute-0 nova_compute[187208]: 2025-12-05 11:57:31.862 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:31 compute-0 nova_compute[187208]: 2025-12-05 11:57:31.862 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 11:57:31 compute-0 nova_compute[187208]: 2025-12-05 11:57:31.862 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.028 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.028 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.322 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.322 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.322 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.322 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.477 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.478 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6051MB free_disk=73.37330627441406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.478 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.479 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.559 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.560 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.587 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.605 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.606 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 11:57:32 compute-0 nova_compute[187208]: 2025-12-05 11:57:32.607 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:57:33 compute-0 nova_compute[187208]: 2025-12-05 11:57:33.609 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:34 compute-0 nova_compute[187208]: 2025-12-05 11:57:34.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:34 compute-0 nova_compute[187208]: 2025-12-05 11:57:34.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:34 compute-0 nova_compute[187208]: 2025-12-05 11:57:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:34 compute-0 nova_compute[187208]: 2025-12-05 11:57:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:57:34 compute-0 nova_compute[187208]: 2025-12-05 11:57:34.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 11:57:38 compute-0 podman[213072]: 2025-12-05 11:57:38.210217674 +0000 UTC m=+0.062728132 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 11:57:42 compute-0 podman[213093]: 2025-12-05 11:57:42.217081695 +0000 UTC m=+0.072836897 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 11:57:44 compute-0 podman[213112]: 2025-12-05 11:57:44.194956593 +0000 UTC m=+0.049314373 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 11:57:49 compute-0 podman[213133]: 2025-12-05 11:57:49.353737033 +0000 UTC m=+0.191976739 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 11:57:49 compute-0 podman[213134]: 2025-12-05 11:57:49.429850621 +0000 UTC m=+0.158327770 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 11:57:55 compute-0 podman[213184]: 2025-12-05 11:57:55.217869589 +0000 UTC m=+0.069006759 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:58:00 compute-0 podman[213204]: 2025-12-05 11:58:00.223099581 +0000 UTC m=+0.074885795 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 11:58:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:58:03.004 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:58:03.004 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:58:03.004 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:09 compute-0 podman[213230]: 2025-12-05 11:58:09.217890112 +0000 UTC m=+0.067926413 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 05 11:58:13 compute-0 podman[213250]: 2025-12-05 11:58:13.215945799 +0000 UTC m=+0.065438074 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:58:15 compute-0 podman[213269]: 2025-12-05 11:58:15.222840604 +0000 UTC m=+0.068543280 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 05 11:58:18 compute-0 sshd-session[213292]: Received disconnect from 43.225.159.111 port 41196:11:  [preauth]
Dec 05 11:58:18 compute-0 sshd-session[213292]: Disconnected from authenticating user root 43.225.159.111 port 41196 [preauth]
Dec 05 11:58:20 compute-0 podman[213294]: 2025-12-05 11:58:20.205906889 +0000 UTC m=+0.061255289 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 11:58:20 compute-0 podman[213295]: 2025-12-05 11:58:20.248275777 +0000 UTC m=+0.092486400 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 11:58:26 compute-0 podman[213341]: 2025-12-05 11:58:26.216890608 +0000 UTC m=+0.068190340 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 05 11:58:30 compute-0 nova_compute[187208]: 2025-12-05 11:58:30.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:30 compute-0 nova_compute[187208]: 2025-12-05 11:58:30.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 11:58:30 compute-0 nova_compute[187208]: 2025-12-05 11:58:30.079 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 11:58:30 compute-0 nova_compute[187208]: 2025-12-05 11:58:30.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:30 compute-0 nova_compute[187208]: 2025-12-05 11:58:30.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 11:58:30 compute-0 nova_compute[187208]: 2025-12-05 11:58:30.142 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:31 compute-0 nova_compute[187208]: 2025-12-05 11:58:31.151 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:31 compute-0 nova_compute[187208]: 2025-12-05 11:58:31.151 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 11:58:31 compute-0 nova_compute[187208]: 2025-12-05 11:58:31.151 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 11:58:31 compute-0 nova_compute[187208]: 2025-12-05 11:58:31.197 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 11:58:31 compute-0 nova_compute[187208]: 2025-12-05 11:58:31.197 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:31 compute-0 podman[213361]: 2025-12-05 11:58:31.220721756 +0000 UTC m=+0.068224922 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 11:58:33 compute-0 nova_compute[187208]: 2025-12-05 11:58:33.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.088 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.261 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.262 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6059MB free_disk=73.37367248535156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.262 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.262 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.451 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.451 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.517 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.610 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.611 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.645 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.673 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.695 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.718 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.720 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 11:58:34 compute-0 nova_compute[187208]: 2025-12-05 11:58:34.721 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:58:34.802 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 11:58:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:58:34.804 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 11:58:35 compute-0 nova_compute[187208]: 2025-12-05 11:58:35.720 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:35 compute-0 nova_compute[187208]: 2025-12-05 11:58:35.721 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:35 compute-0 nova_compute[187208]: 2025-12-05 11:58:35.721 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:35 compute-0 nova_compute[187208]: 2025-12-05 11:58:35.721 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 11:58:36 compute-0 nova_compute[187208]: 2025-12-05 11:58:36.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:36 compute-0 nova_compute[187208]: 2025-12-05 11:58:36.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.612 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.612 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.630 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.712 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.713 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.731 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.734 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.734 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.741 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.741 187212 INFO nova.compute.claims [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.841 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.932 187212 DEBUG nova.compute.provider_tree [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.955 187212 DEBUG nova.scheduler.client.report [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.976 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.976 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.978 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.983 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:58:37 compute-0 nova_compute[187208]: 2025-12-05 11:58:37.983 187212 INFO nova.compute.claims [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.055 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.077 187212 INFO nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.098 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.157 187212 DEBUG nova.compute.provider_tree [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.227 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.228 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.229 187212 INFO nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Creating image(s)
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.230 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.230 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.231 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.232 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.232 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.259 187212 DEBUG nova.scheduler.client.report [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.286 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.286 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.345 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.346 187212 DEBUG nova.network.neutron [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.370 187212 INFO nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.388 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.481 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.483 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.484 187212 INFO nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Creating image(s)
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.484 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.485 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.485 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:38 compute-0 nova_compute[187208]: 2025-12-05 11:58:38.486 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:39 compute-0 nova_compute[187208]: 2025-12-05 11:58:39.821 187212 DEBUG nova.network.neutron [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 11:58:39 compute-0 nova_compute[187208]: 2025-12-05 11:58:39.821 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:58:40 compute-0 podman[213386]: 2025-12-05 11:58:40.561277907 +0000 UTC m=+0.401903529 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.374 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.424 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.part --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.426 187212 DEBUG nova.virt.images [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] a6987852-063f-405d-a848-6b382694811e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.426 187212 DEBUG nova.privsep.utils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.427 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.part /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.649 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.part /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.converted" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.653 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.715 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.converted --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.716 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.733 187212 INFO oslo.privsep.daemon [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpllntry_p/privsep.sock']
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.735 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 3.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:41 compute-0 nova_compute[187208]: 2025-12-05 11:58:41.735 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.464 187212 INFO oslo.privsep.daemon [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Spawned new privsep daemon via rootwrap
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.320 213424 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.324 213424 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.326 213424 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.326 213424 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213424
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.468 187212 WARNING oslo_privsep.priv_context [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] privsep daemon already running
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.570 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.582 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.622 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.624 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.625 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.636 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.651 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.653 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.695 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.696 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.745 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.747 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.748 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.764 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.786 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.805 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.806 187212 DEBUG nova.virt.disk.api [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Checking if we can resize image /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.806 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:58:42.806 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.858 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.859 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.872 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.873 187212 DEBUG nova.virt.disk.api [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Cannot resize image /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.874 187212 DEBUG nova.objects.instance [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'migration_context' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.887 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.888 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.888 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.902 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.903 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Ensure instance console log exists: /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.903 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.903 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.904 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.906 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.910 187212 WARNING nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.918 187212 DEBUG nova.virt.libvirt.host [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.919 187212 DEBUG nova.virt.libvirt.host [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.923 187212 DEBUG nova.virt.libvirt.host [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.923 187212 DEBUG nova.virt.libvirt.host [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.924 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.924 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.925 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.925 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.925 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.926 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.926 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.926 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.927 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.927 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.927 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.927 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.931 187212 DEBUG nova.privsep.utils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.932 187212 DEBUG nova.objects.instance [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'pci_devices' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.941 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.942 187212 DEBUG nova.virt.disk.api [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Checking if we can resize image /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.942 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:42 compute-0 nova_compute[187208]: 2025-12-05 11:58:42.987 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <uuid>caa6c7c3-7eb3-4636-a7ad-7b605ef393ba</uuid>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <name>instance-00000001</name>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <nova:name>tempest-AutoAllocateNetworkTest-server-2092831344</nova:name>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:58:42</nova:creationTime>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:58:42 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:58:42 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:58:42 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:58:42 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:58:42 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:58:42 compute-0 nova_compute[187208]:         <nova:user uuid="c4c62f22ba09455995ea1bde6a93431e">tempest-AutoAllocateNetworkTest-275048159-project-member</nova:user>
Dec 05 11:58:42 compute-0 nova_compute[187208]:         <nova:project uuid="fb2c9c006bee4723bc8dd108e19a6728">tempest-AutoAllocateNetworkTest-275048159</nova:project>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <system>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <entry name="serial">caa6c7c3-7eb3-4636-a7ad-7b605ef393ba</entry>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <entry name="uuid">caa6c7c3-7eb3-4636-a7ad-7b605ef393ba</entry>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     </system>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <os>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   </os>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <features>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   </features>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.config"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/console.log" append="off"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <video>
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     </video>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:58:42 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:58:42 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:58:42 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:58:42 compute-0 nova_compute[187208]: </domain>
Dec 05 11:58:42 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.010 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.011 187212 DEBUG nova.virt.disk.api [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Cannot resize image /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.011 187212 DEBUG nova.objects.instance [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'migration_context' on Instance uuid c9498e91-01c5-47f7-b3ba-6291bb43635d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.026 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.026 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Ensure instance console log exists: /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.027 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.027 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.027 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.029 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.033 187212 WARNING nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.037 187212 DEBUG nova.virt.libvirt.host [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.037 187212 DEBUG nova.virt.libvirt.host [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.040 187212 DEBUG nova.virt.libvirt.host [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.040 187212 DEBUG nova.virt.libvirt.host [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.040 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.041 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.041 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.041 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.042 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.042 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.042 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.042 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.043 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.043 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.043 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.043 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.047 187212 DEBUG nova.objects.instance [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9498e91-01c5-47f7-b3ba-6291bb43635d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.053 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.054 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.054 187212 INFO nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Using config drive
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.065 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <uuid>c9498e91-01c5-47f7-b3ba-6291bb43635d</uuid>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <name>instance-00000002</name>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <nova:name>tempest-DeleteServersAdminTestJSON-server-640270599</nova:name>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:58:43</nova:creationTime>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:58:43 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:58:43 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:58:43 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:58:43 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:58:43 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:58:43 compute-0 nova_compute[187208]:         <nova:user uuid="e2dbb72c61fa4cdfa0de840c11264065">tempest-DeleteServersAdminTestJSON-1088655224-project-member</nova:user>
Dec 05 11:58:43 compute-0 nova_compute[187208]:         <nova:project uuid="43e0982f67c94ddb8da10556d22f6e39">tempest-DeleteServersAdminTestJSON-1088655224</nova:project>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <system>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <entry name="serial">c9498e91-01c5-47f7-b3ba-6291bb43635d</entry>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <entry name="uuid">c9498e91-01c5-47f7-b3ba-6291bb43635d</entry>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     </system>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <os>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   </os>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <features>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   </features>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.config"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/console.log" append="off"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <video>
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     </video>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:58:43 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:58:43 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:58:43 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:58:43 compute-0 nova_compute[187208]: </domain>
Dec 05 11:58:43 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.116 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.116 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:43 compute-0 nova_compute[187208]: 2025-12-05 11:58:43.117 187212 INFO nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Using config drive
Dec 05 11:58:44 compute-0 podman[213458]: 2025-12-05 11:58:44.248511756 +0000 UTC m=+0.079633705 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.298 187212 INFO nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Creating config drive at /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.config
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.303 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpthw70lrf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.329 187212 INFO nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Creating config drive at /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.config
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.335 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpigrnvx57 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.426 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpthw70lrf" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.461 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpigrnvx57" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:44 compute-0 systemd-machined[153543]: New machine qemu-1-instance-00000001.
Dec 05 11:58:44 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec 05 11:58:44 compute-0 systemd-machined[153543]: New machine qemu-2-instance-00000002.
Dec 05 11:58:44 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.858 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935924.8576803, c9498e91-01c5-47f7-b3ba-6291bb43635d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.859 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] VM Resumed (Lifecycle Event)
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.862 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.862 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.866 187212 INFO nova.virt.libvirt.driver [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance spawned successfully.
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.866 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.923 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.928 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.932 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.932 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.933 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.933 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.934 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.934 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.965 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.966 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935924.8586826, c9498e91-01c5-47f7-b3ba-6291bb43635d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:58:44 compute-0 nova_compute[187208]: 2025-12-05 11:58:44.966 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] VM Started (Lifecycle Event)
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.003 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.007 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.014 187212 INFO nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Took 6.53 seconds to spawn the instance on the hypervisor.
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.015 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.029 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.040 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935925.0404015, caa6c7c3-7eb3-4636-a7ad-7b605ef393ba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.040 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] VM Resumed (Lifecycle Event)
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.042 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.042 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.056 187212 INFO nova.virt.libvirt.driver [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance spawned successfully.
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.056 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.073 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.082 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.089 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.090 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.090 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.091 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.091 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.092 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.100 187212 INFO nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Took 7.27 seconds to build instance.
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.121 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.121 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935925.041058, caa6c7c3-7eb3-4636-a7ad-7b605ef393ba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.122 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] VM Started (Lifecycle Event)
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.157 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.158 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.161 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.165 187212 INFO nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Took 6.94 seconds to spawn the instance on the hypervisor.
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.165 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.176 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.217 187212 INFO nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Took 7.51 seconds to build instance.
Dec 05 11:58:45 compute-0 nova_compute[187208]: 2025-12-05 11:58:45.231 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:46 compute-0 podman[213530]: 2025-12-05 11:58:46.37179 +0000 UTC m=+0.061986900 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.210 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquiring lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.212 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.213 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquiring lock "c9498e91-01c5-47f7-b3ba-6291bb43635d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.213 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.213 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.214 187212 INFO nova.compute.manager [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Terminating instance
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.215 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquiring lock "refresh_cache-c9498e91-01c5-47f7-b3ba-6291bb43635d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.215 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquired lock "refresh_cache-c9498e91-01c5-47f7-b3ba-6291bb43635d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.216 187212 DEBUG nova.network.neutron [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:58:47 compute-0 nova_compute[187208]: 2025-12-05 11:58:47.570 187212 DEBUG nova.network.neutron [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.223 187212 DEBUG nova.network.neutron [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.237 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Releasing lock "refresh_cache-c9498e91-01c5-47f7-b3ba-6291bb43635d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.238 187212 DEBUG nova.compute.manager [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 11:58:48 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 05 11:58:48 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 3.673s CPU time.
Dec 05 11:58:48 compute-0 systemd-machined[153543]: Machine qemu-2-instance-00000002 terminated.
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.487 187212 INFO nova.virt.libvirt.driver [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance destroyed successfully.
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.488 187212 DEBUG nova.objects.instance [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lazy-loading 'resources' on Instance uuid c9498e91-01c5-47f7-b3ba-6291bb43635d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.514 187212 INFO nova.virt.libvirt.driver [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Deleting instance files /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d_del
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.516 187212 INFO nova.virt.libvirt.driver [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Deletion of /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d_del complete
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.600 187212 DEBUG nova.virt.libvirt.host [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.601 187212 INFO nova.virt.libvirt.host [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] UEFI support detected
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.602 187212 INFO nova.compute.manager [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.603 187212 DEBUG oslo.service.loopingcall [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.603 187212 DEBUG nova.compute.manager [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 11:58:48 compute-0 nova_compute[187208]: 2025-12-05 11:58:48.604 187212 DEBUG nova.network.neutron [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.222 187212 DEBUG nova.network.neutron [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.250 187212 DEBUG nova.network.neutron [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.276 187212 INFO nova.compute.manager [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Took 0.67 seconds to deallocate network for instance.
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.349 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.350 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.428 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.462 187212 ERROR nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [req-0090b723-cf18-45d2-bdaf-3a4affcbea42] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 5111707b-bdc3-4252-b5b7-b3e96ff05344.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-0090b723-cf18-45d2-bdaf-3a4affcbea42"}]}
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.477 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.494 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.495 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.514 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.538 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.611 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.656 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updated inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with generation 6 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.657 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 generation from 6 to 7 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.658 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.689 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.727 187212 INFO nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Deleted allocations for instance c9498e91-01c5-47f7-b3ba-6291bb43635d
Dec 05 11:58:49 compute-0 nova_compute[187208]: 2025-12-05 11:58:49.808 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:51 compute-0 podman[213560]: 2025-12-05 11:58:51.238949331 +0000 UTC m=+0.083415680 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 11:58:51 compute-0 podman[213561]: 2025-12-05 11:58:51.2628377 +0000 UTC m=+0.099037981 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.724 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.725 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.752 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.807 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.808 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.822 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.822 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.825 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.829 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.829 187212 INFO nova.compute.claims [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.853 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.854 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.891 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.892 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.894 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.919 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:53 compute-0 nova_compute[187208]: 2025-12-05 11:58:53.924 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.032 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.034 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.071 187212 DEBUG nova.compute.provider_tree [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.088 187212 DEBUG nova.scheduler.client.report [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.115 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.117 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.121 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.130 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.131 187212 INFO nova.compute.claims [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.172 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.173 187212 DEBUG nova.network.neutron [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.295 187212 INFO nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.322 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.415 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.416 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.417 187212 INFO nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Creating image(s)
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.417 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.418 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.419 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.432 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.458 187212 DEBUG nova.compute.provider_tree [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.477 187212 DEBUG nova.scheduler.client.report [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.497 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.498 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.500 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.506 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.507 187212 INFO nova.compute.claims [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.510 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.511 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.511 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.526 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.565 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.566 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.594 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.603 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.604 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.623 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.645 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.646 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.646 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.697 187212 DEBUG nova.network.neutron [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.697 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.698 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.699 187212 DEBUG nova.virt.disk.api [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Checking if we can resize image /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.699 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.723 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.725 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.726 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Creating image(s)
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.727 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.727 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.728 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.746 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.759 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.761 187212 DEBUG nova.virt.disk.api [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Cannot resize image /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.761 187212 DEBUG nova.objects.instance [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'migration_context' on Instance uuid 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.769 187212 DEBUG nova.compute.provider_tree [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.782 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.783 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Ensure instance console log exists: /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.784 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.784 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.785 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.786 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.788 187212 DEBUG nova.scheduler.client.report [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.795 187212 WARNING nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.799 187212 DEBUG nova.virt.libvirt.host [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.799 187212 DEBUG nova.virt.libvirt.host [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.802 187212 DEBUG nova.virt.libvirt.host [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.803 187212 DEBUG nova.virt.libvirt.host [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.804 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.804 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.805 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.805 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.805 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.806 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.806 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.806 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.807 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.807 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.807 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.808 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.810 187212 DEBUG nova.objects.instance [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.814 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.815 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.815 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.817 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.818 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.828 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.844 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <uuid>7af3a9ff-9ca1-4bf2-8688-01d7161ba33a</uuid>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <name>instance-00000004</name>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <nova:name>tempest-DeleteServersAdminTestJSON-server-410072514</nova:name>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:58:54</nova:creationTime>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:58:54 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:58:54 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:58:54 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:58:54 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:58:54 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:58:54 compute-0 nova_compute[187208]:         <nova:user uuid="e2dbb72c61fa4cdfa0de840c11264065">tempest-DeleteServersAdminTestJSON-1088655224-project-member</nova:user>
Dec 05 11:58:54 compute-0 nova_compute[187208]:         <nova:project uuid="43e0982f67c94ddb8da10556d22f6e39">tempest-DeleteServersAdminTestJSON-1088655224</nova:project>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <system>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <entry name="serial">7af3a9ff-9ca1-4bf2-8688-01d7161ba33a</entry>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <entry name="uuid">7af3a9ff-9ca1-4bf2-8688-01d7161ba33a</entry>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     </system>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <os>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   </os>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <features>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   </features>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.config"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/console.log" append="off"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <video>
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     </video>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:58:54 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:58:54 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:58:54 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:58:54 compute-0 nova_compute[187208]: </domain>
Dec 05 11:58:54 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.849 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.869 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.869 187212 INFO nova.compute.claims [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.896 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.896 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.925 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.926 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.926 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.956 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.957 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.967 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.968 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.972 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.973 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.973 187212 INFO nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Using config drive
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.975 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.978 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.979 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Checking if we can resize image /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:58:54 compute-0 nova_compute[187208]: 2025-12-05 11:58:54.979 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.013 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.035 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.041 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.043 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Cannot resize image /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.043 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'migration_context' on Instance uuid e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.073 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.073 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Ensure instance console log exists: /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.074 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.074 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.074 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.095 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.140 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.141 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.142 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Creating image(s)
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.142 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.143 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.143 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.156 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.169 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Automatically allocating a network for project fb2c9c006bee4723bc8dd108e19a6728. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.190 187212 DEBUG nova.compute.provider_tree [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.208 187212 DEBUG nova.scheduler.client.report [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.213 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.213 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.214 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.229 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.248 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.249 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.252 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.261 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.262 187212 INFO nova.compute.claims [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.290 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.291 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.325 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.326 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.326 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.332 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.332 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.350 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.373 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.379 187212 INFO nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Creating config drive at /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.config
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.384 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaxf_d6jl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.399 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.400 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Checking if we can resize image /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.400 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.458 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.459 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Cannot resize image /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.459 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'migration_context' on Instance uuid 04518502-62f1-44c3-8c57-b3404958536f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.468 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.469 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.470 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Creating image(s)
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.470 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.471 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.471 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.486 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.486 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Ensure instance console log exists: /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.486 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.487 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.487 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.488 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.505 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaxf_d6jl" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.548 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.549 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.550 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.561 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 systemd-machined[153543]: New machine qemu-3-instance-00000004.
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.582 187212 DEBUG nova.compute.provider_tree [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:58:55 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.599 187212 DEBUG nova.scheduler.client.report [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.617 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.618 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.639 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.640 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.663 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.664 187212 DEBUG nova.network.neutron [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.676 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.676 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.677 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.692 187212 INFO nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.716 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Automatically allocating a network for project fb2c9c006bee4723bc8dd108e19a6728. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.721 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.728 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.729 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Checking if we can resize image /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.730 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.799 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.800 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Cannot resize image /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.800 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'migration_context' on Instance uuid b2e8212c-084c-4a4f-b930-56560ae4da12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.823 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.824 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Ensure instance console log exists: /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.825 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.825 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.825 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.827 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.829 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.830 187212 INFO nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Creating image(s)
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.831 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.831 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.832 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.851 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.910 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.911 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.912 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.922 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.941 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Automatically allocating a network for project fb2c9c006bee4723bc8dd108e19a6728. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.983 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:55 compute-0 nova_compute[187208]: 2025-12-05 11:58:55.984 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.013 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.015 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.016 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.070 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.071 187212 DEBUG nova.virt.disk.api [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Checking if we can resize image /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.072 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.094 187212 DEBUG nova.network.neutron [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.095 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.124 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.125 187212 DEBUG nova.virt.disk.api [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Cannot resize image /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.126 187212 DEBUG nova.objects.instance [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'migration_context' on Instance uuid d4d2145a-a261-4c1c-82e7-3f595f46aec6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.140 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.140 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Ensure instance console log exists: /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.141 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.141 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.141 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.143 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.147 187212 WARNING nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.151 187212 DEBUG nova.virt.libvirt.host [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.151 187212 DEBUG nova.virt.libvirt.host [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.154 187212 DEBUG nova.virt.libvirt.host [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.154 187212 DEBUG nova.virt.libvirt.host [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.155 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.155 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.156 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.156 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.156 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.157 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.157 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.157 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.157 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.158 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.158 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.158 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.162 187212 DEBUG nova.objects.instance [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'pci_devices' on Instance uuid d4d2145a-a261-4c1c-82e7-3f595f46aec6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.183 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <uuid>d4d2145a-a261-4c1c-82e7-3f595f46aec6</uuid>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <name>instance-00000007</name>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdminNegativeTestJSON-server-1874212349</nova:name>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:58:56</nova:creationTime>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:58:56 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:58:56 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:58:56 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:58:56 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:58:56 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:58:56 compute-0 nova_compute[187208]:         <nova:user uuid="7f2d3ae86c634e16a369a01df5a1a50d">tempest-ServersAdminNegativeTestJSON-1418429776-project-member</nova:user>
Dec 05 11:58:56 compute-0 nova_compute[187208]:         <nova:project uuid="806d12fc57454fa3a9768a9e6ec2b812">tempest-ServersAdminNegativeTestJSON-1418429776</nova:project>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <system>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <entry name="serial">d4d2145a-a261-4c1c-82e7-3f595f46aec6</entry>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <entry name="uuid">d4d2145a-a261-4c1c-82e7-3f595f46aec6</entry>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     </system>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <os>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   </os>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <features>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   </features>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.config"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/console.log" append="off"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <video>
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     </video>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:58:56 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:58:56 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:58:56 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:58:56 compute-0 nova_compute[187208]: </domain>
Dec 05 11:58:56 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.238 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.239 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.240 187212 INFO nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Using config drive
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.328 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935936.3283365, 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.329 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] VM Resumed (Lifecycle Event)
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.333 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.334 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.338 187212 INFO nova.virt.libvirt.driver [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance spawned successfully.
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.339 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:58:56 compute-0 podman[213723]: 2025-12-05 11:58:56.348288148 +0000 UTC m=+0.064547381 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.358 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.366 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.372 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.373 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.373 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.374 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.375 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.376 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.403 187212 INFO nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Creating config drive at /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.config
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.408 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph1_8zocm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.426 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.427 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935936.331603, 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.428 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] VM Started (Lifecycle Event)
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.433 187212 INFO nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Took 2.02 seconds to spawn the instance on the hypervisor.
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.434 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.446 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.449 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.478 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.503 187212 INFO nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Took 2.70 seconds to build instance.
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.520 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:56 compute-0 nova_compute[187208]: 2025-12-05 11:58:56.535 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph1_8zocm" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:56 compute-0 systemd-machined[153543]: New machine qemu-4-instance-00000007.
Dec 05 11:58:56 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.367 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.368 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.368 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935937.3671722, d4d2145a-a261-4c1c-82e7-3f595f46aec6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.369 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] VM Resumed (Lifecycle Event)
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.373 187212 INFO nova.virt.libvirt.driver [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance spawned successfully.
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.373 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.465 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.470 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.497 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.498 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935937.3672824, d4d2145a-a261-4c1c-82e7-3f595f46aec6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.498 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] VM Started (Lifecycle Event)
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.504 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.504 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.505 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.506 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.506 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.507 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.529 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.533 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.556 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.567 187212 INFO nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Took 1.74 seconds to spawn the instance on the hypervisor.
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.567 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.593 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "c0320574-7866-4fe3-a513-f678d16b8726" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.594 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.619 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.630 187212 INFO nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Took 2.56 seconds to build instance.
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.648 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.683 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.684 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.689 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.689 187212 INFO nova.compute.claims [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.926 187212 DEBUG nova.compute.provider_tree [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.947 187212 DEBUG nova.scheduler.client.report [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.970 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:57 compute-0 nova_compute[187208]: 2025-12-05 11:58:57.971 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.021 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.022 187212 DEBUG nova.network.neutron [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.045 187212 INFO nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.063 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.149 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.151 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.151 187212 INFO nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Creating image(s)
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.152 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.152 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.153 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.165 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.220 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.225 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.226 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.244 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.312 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.313 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.428 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk 1073741824" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.430 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.430 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.487 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.489 187212 DEBUG nova.virt.disk.api [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Checking if we can resize image /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.490 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.542 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.543 187212 DEBUG nova.virt.disk.api [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Cannot resize image /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.544 187212 DEBUG nova.objects.instance [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lazy-loading 'migration_context' on Instance uuid c0320574-7866-4fe3-a513-f678d16b8726 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.563 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.563 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Ensure instance console log exists: /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.564 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.564 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.565 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.605 187212 DEBUG nova.network.neutron [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.606 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.607 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.611 187212 WARNING nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.616 187212 DEBUG nova.virt.libvirt.host [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.617 187212 DEBUG nova.virt.libvirt.host [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.621 187212 DEBUG nova.virt.libvirt.host [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.622 187212 DEBUG nova.virt.libvirt.host [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.622 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.623 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.623 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.624 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.624 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.624 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.624 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.625 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.625 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.625 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.626 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.626 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.630 187212 DEBUG nova.objects.instance [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lazy-loading 'pci_devices' on Instance uuid c0320574-7866-4fe3-a513-f678d16b8726 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.644 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <uuid>c0320574-7866-4fe3-a513-f678d16b8726</uuid>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <name>instance-00000008</name>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerExternalEventsTest-server-560546370</nova:name>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:58:58</nova:creationTime>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:58:58 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:58:58 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:58:58 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:58:58 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:58:58 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:58:58 compute-0 nova_compute[187208]:         <nova:user uuid="1665784ba58c46f3b4db3ca4aadf4148">tempest-ServerExternalEventsTest-467289434-project-member</nova:user>
Dec 05 11:58:58 compute-0 nova_compute[187208]:         <nova:project uuid="6543c5ed4cf043ffa1c5da8b358945f0">tempest-ServerExternalEventsTest-467289434</nova:project>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <system>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <entry name="serial">c0320574-7866-4fe3-a513-f678d16b8726</entry>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <entry name="uuid">c0320574-7866-4fe3-a513-f678d16b8726</entry>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     </system>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <os>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   </os>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <features>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   </features>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.config"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/console.log" append="off"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <video>
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     </video>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:58:58 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:58:58 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:58:58 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:58:58 compute-0 nova_compute[187208]: </domain>
Dec 05 11:58:58 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.700 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.702 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:58:58 compute-0 nova_compute[187208]: 2025-12-05 11:58:58.702 187212 INFO nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Using config drive
Dec 05 11:58:59 compute-0 nova_compute[187208]: 2025-12-05 11:58:59.157 187212 INFO nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Creating config drive at /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.config
Dec 05 11:58:59 compute-0 nova_compute[187208]: 2025-12-05 11:58:59.162 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp18mzspq0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:58:59 compute-0 nova_compute[187208]: 2025-12-05 11:58:59.291 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp18mzspq0" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:58:59 compute-0 systemd-machined[153543]: New machine qemu-5-instance-00000008.
Dec 05 11:58:59 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000008.
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.274 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.277 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.278 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935940.2737994, c0320574-7866-4fe3-a513-f678d16b8726 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.278 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] VM Resumed (Lifecycle Event)
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.284 187212 INFO nova.virt.libvirt.driver [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance spawned successfully.
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.284 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.306 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.312 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.315 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.316 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.316 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.317 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.317 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.318 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.351 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.352 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935940.2750397, c0320574-7866-4fe3-a513-f678d16b8726 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.352 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] VM Started (Lifecycle Event)
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.384 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.386 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.399 187212 INFO nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Took 2.25 seconds to spawn the instance on the hypervisor.
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.400 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.408 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.458 187212 INFO nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Took 2.79 seconds to build instance.
Dec 05 11:59:00 compute-0 nova_compute[187208]: 2025-12-05 11:59:00.473 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.879s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:01 compute-0 nova_compute[187208]: 2025-12-05 11:59:01.939 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:01 compute-0 nova_compute[187208]: 2025-12-05 11:59:01.940 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:01 compute-0 nova_compute[187208]: 2025-12-05 11:59:01.941 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:01 compute-0 nova_compute[187208]: 2025-12-05 11:59:01.941 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:01 compute-0 nova_compute[187208]: 2025-12-05 11:59:01.941 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:01 compute-0 nova_compute[187208]: 2025-12-05 11:59:01.943 187212 INFO nova.compute.manager [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Terminating instance
Dec 05 11:59:01 compute-0 nova_compute[187208]: 2025-12-05 11:59:01.943 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "refresh_cache-7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:01 compute-0 nova_compute[187208]: 2025-12-05 11:59:01.944 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquired lock "refresh_cache-7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:01 compute-0 nova_compute[187208]: 2025-12-05 11:59:01.944 187212 DEBUG nova.network.neutron [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.118 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.118 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.133 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.205 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.207 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.214 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.214 187212 INFO nova.compute.claims [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.271 187212 DEBUG nova.network.neutron [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:02 compute-0 podman[213814]: 2025-12-05 11:59:02.318730592 +0000 UTC m=+0.081351629 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.413 187212 DEBUG nova.compute.provider_tree [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.433 187212 DEBUG nova.scheduler.client.report [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.456 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.457 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.501 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.501 187212 DEBUG nova.network.neutron [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.618 187212 INFO nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:59:02 compute-0 nova_compute[187208]: 2025-12-05 11:59:02.734 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:59:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:03.005 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:03.006 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:03.006 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.398 187212 DEBUG nova.compute.manager [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Received event network-changed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.399 187212 DEBUG nova.compute.manager [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Refreshing instance network info cache due to event network-changed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.399 187212 DEBUG oslo_concurrency.lockutils [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] Acquiring lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.399 187212 DEBUG oslo_concurrency.lockutils [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] Acquired lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.399 187212 DEBUG nova.network.neutron [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.486 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935928.4857826, c9498e91-01c5-47f7-b3ba-6291bb43635d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.487 187212 INFO nova.compute.manager [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] VM Stopped (Lifecycle Event)
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.629 187212 DEBUG nova.compute.manager [None req-d877335e-6b07-4608-86b7-e59593134487 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.650 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.651 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.651 187212 INFO nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Creating image(s)
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.652 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.652 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.652 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.667 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.717 187212 DEBUG nova.network.neutron [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.718 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.718 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.719 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.719 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.731 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.746 187212 DEBUG nova.network.neutron [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.762 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Releasing lock "refresh_cache-7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.763 187212 DEBUG nova.compute.manager [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.793 187212 DEBUG nova.network.neutron [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:03 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec 05 11:59:03 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 8.076s CPU time.
Dec 05 11:59:03 compute-0 systemd-machined[153543]: Machine qemu-3-instance-00000004 terminated.
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.804 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.807 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.839 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.840 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.840 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.896 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.897 187212 DEBUG nova.virt.disk.api [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Checking if we can resize image /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.897 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.953 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.953 187212 DEBUG nova.virt.disk.api [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Cannot resize image /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.954 187212 DEBUG nova.objects.instance [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'migration_context' on Instance uuid 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.962 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "c0320574-7866-4fe3-a513-f678d16b8726" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.963 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.963 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "c0320574-7866-4fe3-a513-f678d16b8726-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.963 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.963 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.964 187212 INFO nova.compute.manager [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Terminating instance
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.965 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.972 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.972 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Ensure instance console log exists: /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.973 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.974 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.974 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.975 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.984 187212 WARNING nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.992 187212 DEBUG nova.virt.libvirt.host [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.992 187212 DEBUG nova.virt.libvirt.host [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.996 187212 DEBUG nova.virt.libvirt.host [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.996 187212 DEBUG nova.virt.libvirt.host [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.997 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.997 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.997 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.997 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:59:03 compute-0 nova_compute[187208]: 2025-12-05 11:59:03.999 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.002 187212 DEBUG nova.objects.instance [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.011 187212 INFO nova.virt.libvirt.driver [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance destroyed successfully.
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.011 187212 DEBUG nova.objects.instance [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'resources' on Instance uuid 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.013 187212 DEBUG nova.network.neutron [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.018 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <uuid>9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70</uuid>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <name>instance-00000009</name>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdminNegativeTestJSON-server-357102285</nova:name>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:59:03</nova:creationTime>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:59:04 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:59:04 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:59:04 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:59:04 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:59:04 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:59:04 compute-0 nova_compute[187208]:         <nova:user uuid="7f2d3ae86c634e16a369a01df5a1a50d">tempest-ServersAdminNegativeTestJSON-1418429776-project-member</nova:user>
Dec 05 11:59:04 compute-0 nova_compute[187208]:         <nova:project uuid="806d12fc57454fa3a9768a9e6ec2b812">tempest-ServersAdminNegativeTestJSON-1418429776</nova:project>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <system>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <entry name="serial">9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70</entry>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <entry name="uuid">9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70</entry>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     </system>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <os>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   </os>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <features>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   </features>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.config"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/console.log" append="off"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <video>
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     </video>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:59:04 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:59:04 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:59:04 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:59:04 compute-0 nova_compute[187208]: </domain>
Dec 05 11:59:04 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.025 187212 INFO nova.virt.libvirt.driver [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Deleting instance files /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a_del
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.026 187212 INFO nova.virt.libvirt.driver [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Deletion of /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a_del complete
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.029 187212 DEBUG oslo_concurrency.lockutils [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] Releasing lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.029 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquired lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.029 187212 DEBUG nova.network.neutron [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.127 187212 INFO nova.compute.manager [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.127 187212 DEBUG oslo.service.loopingcall [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.127 187212 DEBUG nova.compute.manager [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.128 187212 DEBUG nova.network.neutron [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.131 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.131 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.131 187212 INFO nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Using config drive
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.200 187212 DEBUG nova.network.neutron [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.318 187212 INFO nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Creating config drive at /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.config
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.327 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmporrb0r4b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.461 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmporrb0r4b" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.485 187212 DEBUG nova.network.neutron [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.504 187212 DEBUG nova.network.neutron [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:04 compute-0 systemd-machined[153543]: New machine qemu-6-instance-00000009.
Dec 05 11:59:04 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000009.
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.544 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Releasing lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.544 187212 DEBUG nova.compute.manager [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.545 187212 DEBUG nova.network.neutron [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.562 187212 INFO nova.compute.manager [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Took 0.43 seconds to deallocate network for instance.
Dec 05 11:59:04 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.609 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.610 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:04 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Consumed 5.252s CPU time.
Dec 05 11:59:04 compute-0 systemd-machined[153543]: Machine qemu-5-instance-00000008 terminated.
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.784 187212 DEBUG nova.compute.provider_tree [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.800 187212 INFO nova.virt.libvirt.driver [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance destroyed successfully.
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.801 187212 DEBUG nova.objects.instance [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lazy-loading 'resources' on Instance uuid c0320574-7866-4fe3-a513-f678d16b8726 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.803 187212 DEBUG nova.scheduler.client.report [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.824 187212 INFO nova.virt.libvirt.driver [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Deleting instance files /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726_del
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.825 187212 INFO nova.virt.libvirt.driver [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Deletion of /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726_del complete
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.830 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.851 187212 INFO nova.scheduler.client.report [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Deleted allocations for instance 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.879 187212 INFO nova.compute.manager [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Took 0.33 seconds to destroy the instance on the hypervisor.
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.879 187212 DEBUG oslo.service.loopingcall [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.880 187212 DEBUG nova.compute.manager [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.880 187212 DEBUG nova.network.neutron [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 11:59:04 compute-0 nova_compute[187208]: 2025-12-05 11:59:04.918 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.978s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.297 187212 DEBUG nova.network.neutron [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.314 187212 DEBUG nova.network.neutron [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.327 187212 INFO nova.compute.manager [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Took 0.45 seconds to deallocate network for instance.
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.378 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.379 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.537 187212 DEBUG nova.compute.provider_tree [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.555 187212 DEBUG nova.scheduler.client.report [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.581 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.591 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935945.5915148, 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.593 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] VM Resumed (Lifecycle Event)
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.595 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.595 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.600 187212 INFO nova.virt.libvirt.driver [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance spawned successfully.
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.600 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.604 187212 INFO nova.scheduler.client.report [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Deleted allocations for instance c0320574-7866-4fe3-a513-f678d16b8726
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.610 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.613 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.622 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.623 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.623 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.624 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.624 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.624 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.653 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.654 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935945.596224, 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.654 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] VM Started (Lifecycle Event)
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.686 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.689 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.693 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.697 187212 INFO nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Took 2.05 seconds to spawn the instance on the hypervisor.
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.697 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.708 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.755 187212 INFO nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Took 3.57 seconds to build instance.
Dec 05 11:59:05 compute-0 nova_compute[187208]: 2025-12-05 11:59:05.772 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:08 compute-0 nova_compute[187208]: 2025-12-05 11:59:08.413 187212 DEBUG nova.objects.instance [None req-4419b7ac-3943-4fe3-a54d-9365fc4a978c 578057bdefa3459f8e1d51e5b47d9030 dc0ef83f10594fbe978d85c5f4210cfe - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:08 compute-0 nova_compute[187208]: 2025-12-05 11:59:08.432 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935948.432069, 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:08 compute-0 nova_compute[187208]: 2025-12-05 11:59:08.432 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] VM Paused (Lifecycle Event)
Dec 05 11:59:08 compute-0 nova_compute[187208]: 2025-12-05 11:59:08.449 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:08 compute-0 nova_compute[187208]: 2025-12-05 11:59:08.453 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:08 compute-0 nova_compute[187208]: 2025-12-05 11:59:08.483 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] During sync_power_state the instance has a pending task (suspending). Skip.
Dec 05 11:59:09 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec 05 11:59:09 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Consumed 4.102s CPU time.
Dec 05 11:59:09 compute-0 systemd-machined[153543]: Machine qemu-6-instance-00000009 terminated.
Dec 05 11:59:09 compute-0 nova_compute[187208]: 2025-12-05 11:59:09.402 187212 DEBUG nova.compute.manager [None req-4419b7ac-3943-4fe3-a54d-9365fc4a978c 578057bdefa3459f8e1d51e5b47d9030 dc0ef83f10594fbe978d85c5f4210cfe - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:11 compute-0 podman[213919]: 2025-12-05 11:59:11.242308068 +0000 UTC m=+0.079611440 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:11.999 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:12.000 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:12.000 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:12.000 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:12.000 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:12.001 187212 INFO nova.compute.manager [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Terminating instance
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:12.002 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "refresh_cache-9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:12.002 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquired lock "refresh_cache-9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:12.002 187212 DEBUG nova.network.neutron [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:59:12 compute-0 nova_compute[187208]: 2025-12-05 11:59:12.537 187212 DEBUG nova.network.neutron [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.638 187212 DEBUG nova.network.neutron [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.655 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Releasing lock "refresh_cache-9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.656 187212 DEBUG nova.compute.manager [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.663 187212 INFO nova.virt.libvirt.driver [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance destroyed successfully.
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.663 187212 DEBUG nova.objects.instance [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'resources' on Instance uuid 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.682 187212 INFO nova.virt.libvirt.driver [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Deleting instance files /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70_del
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.683 187212 INFO nova.virt.libvirt.driver [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Deletion of /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70_del complete
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.738 187212 INFO nova.compute.manager [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Took 0.08 seconds to destroy the instance on the hypervisor.
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.739 187212 DEBUG oslo.service.loopingcall [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.739 187212 DEBUG nova.compute.manager [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 11:59:13 compute-0 nova_compute[187208]: 2025-12-05 11:59:13.739 187212 DEBUG nova.network.neutron [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.251 187212 DEBUG nova.network.neutron [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.277 187212 DEBUG nova.network.neutron [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.299 187212 INFO nova.compute.manager [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Took 0.56 seconds to deallocate network for instance.
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.367 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.367 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.744 187212 DEBUG nova.compute.provider_tree [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.771 187212 DEBUG nova.scheduler.client.report [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.802 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.840 187212 INFO nova.scheduler.client.report [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Deleted allocations for instance 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70
Dec 05 11:59:14 compute-0 nova_compute[187208]: 2025-12-05 11:59:14.916 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:15 compute-0 podman[213939]: 2025-12-05 11:59:15.230154111 +0000 UTC m=+0.070030168 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 11:59:15 compute-0 nova_compute[187208]: 2025-12-05 11:59:15.821 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:15 compute-0 nova_compute[187208]: 2025-12-05 11:59:15.822 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:15 compute-0 nova_compute[187208]: 2025-12-05 11:59:15.822 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:15 compute-0 nova_compute[187208]: 2025-12-05 11:59:15.822 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:15 compute-0 nova_compute[187208]: 2025-12-05 11:59:15.823 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:15 compute-0 nova_compute[187208]: 2025-12-05 11:59:15.823 187212 INFO nova.compute.manager [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Terminating instance
Dec 05 11:59:15 compute-0 nova_compute[187208]: 2025-12-05 11:59:15.824 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "refresh_cache-d4d2145a-a261-4c1c-82e7-3f595f46aec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:15 compute-0 nova_compute[187208]: 2025-12-05 11:59:15.824 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquired lock "refresh_cache-d4d2145a-a261-4c1c-82e7-3f595f46aec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:15 compute-0 nova_compute[187208]: 2025-12-05 11:59:15.825 187212 DEBUG nova.network.neutron [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:59:16 compute-0 nova_compute[187208]: 2025-12-05 11:59:16.637 187212 DEBUG nova.network.neutron [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:17 compute-0 podman[213959]: 2025-12-05 11:59:17.217804639 +0000 UTC m=+0.070848810 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, distribution-scope=public, release=1755695350, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.040 187212 DEBUG nova.network.neutron [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.055 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Releasing lock "refresh_cache-d4d2145a-a261-4c1c-82e7-3f595f46aec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.056 187212 DEBUG nova.compute.manager [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 11:59:18 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 05 11:59:18 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 12.852s CPU time.
Dec 05 11:59:18 compute-0 systemd-machined[153543]: Machine qemu-4-instance-00000007 terminated.
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.299 187212 INFO nova.virt.libvirt.driver [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance destroyed successfully.
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.301 187212 DEBUG nova.objects.instance [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'resources' on Instance uuid d4d2145a-a261-4c1c-82e7-3f595f46aec6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.318 187212 INFO nova.virt.libvirt.driver [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Deleting instance files /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6_del
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.319 187212 INFO nova.virt.libvirt.driver [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Deletion of /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6_del complete
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.392 187212 INFO nova.compute.manager [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.392 187212 DEBUG oslo.service.loopingcall [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.393 187212 DEBUG nova.compute.manager [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.393 187212 DEBUG nova.network.neutron [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 11:59:18 compute-0 nova_compute[187208]: 2025-12-05 11:59:18.994 187212 DEBUG nova.network.neutron [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.009 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935944.0075085, 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.009 187212 INFO nova.compute.manager [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] VM Stopped (Lifecycle Event)
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.011 187212 DEBUG nova.network.neutron [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.047 187212 DEBUG nova.compute.manager [None req-93a90937-dc92-46a3-9970-aafcf36afaed - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.049 187212 INFO nova.compute.manager [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Took 0.66 seconds to deallocate network for instance.
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.143 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.143 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.326 187212 DEBUG nova.compute.provider_tree [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.348 187212 DEBUG nova.scheduler.client.report [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.381 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.405 187212 INFO nova.scheduler.client.report [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Deleted allocations for instance d4d2145a-a261-4c1c-82e7-3f595f46aec6
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.481 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.799 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935944.7983358, c0320574-7866-4fe3-a513-f678d16b8726 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.799 187212 INFO nova.compute.manager [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] VM Stopped (Lifecycle Event)
Dec 05 11:59:19 compute-0 nova_compute[187208]: 2025-12-05 11:59:19.833 187212 DEBUG nova.compute.manager [None req-a89ff6b4-1b8d-4e72-ac3a-c247ff80a93b - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:20 compute-0 nova_compute[187208]: 2025-12-05 11:59:20.342 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Automatically allocated network: {'id': 'ca5a0748-2268-4f31-a673-9ef2606c4273', 'name': 'auto_allocated_network', 'tenant_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['3601d1a0-6de8-4dc7-839c-f1b2d1901b80', 'c438a52b-4019-49b5-8423-28f69ccbec64'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-12-05T11:58:56Z', 'updated_at': '2025-12-05T11:59:18Z', 'revision_number': 4, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Dec 05 11:59:20 compute-0 nova_compute[187208]: 2025-12-05 11:59:20.495 187212 WARNING oslo_policy.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 05 11:59:20 compute-0 nova_compute[187208]: 2025-12-05 11:59:20.496 187212 WARNING oslo_policy.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 05 11:59:20 compute-0 nova_compute[187208]: 2025-12-05 11:59:20.498 187212 DEBUG nova.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 11:59:22 compute-0 podman[213990]: 2025-12-05 11:59:22.220880743 +0000 UTC m=+0.062350326 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 11:59:22 compute-0 podman[213991]: 2025-12-05 11:59:22.273512161 +0000 UTC m=+0.104174027 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 05 11:59:24 compute-0 nova_compute[187208]: 2025-12-05 11:59:24.405 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935949.4035456, 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:24 compute-0 nova_compute[187208]: 2025-12-05 11:59:24.405 187212 INFO nova.compute.manager [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] VM Stopped (Lifecycle Event)
Dec 05 11:59:24 compute-0 nova_compute[187208]: 2025-12-05 11:59:24.438 187212 DEBUG nova.compute.manager [None req-c562e51b-c98e-4abe-ab33-402fae500899 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:25 compute-0 nova_compute[187208]: 2025-12-05 11:59:25.607 187212 DEBUG oslo_concurrency.processutils [None req-26862be7-1bd7-40f8-956d-7c5def84be87 7db28c1b5e8344a1931b5e793e16923f 0d0e512e95bb4c6787173c66c924ae20 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:25 compute-0 nova_compute[187208]: 2025-12-05 11:59:25.649 187212 DEBUG oslo_concurrency.processutils [None req-26862be7-1bd7-40f8-956d-7c5def84be87 7db28c1b5e8344a1931b5e793e16923f 0d0e512e95bb4c6787173c66c924ae20 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:26 compute-0 nova_compute[187208]: 2025-12-05 11:59:26.014 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Successfully created port: a5ad03eb-1959-4b2d-a437-979506e6b988 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 11:59:27 compute-0 podman[214040]: 2025-12-05 11:59:27.259371502 +0000 UTC m=+0.092530156 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 11:59:31 compute-0 nova_compute[187208]: 2025-12-05 11:59:31.113 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Successfully updated port: a5ad03eb-1959-4b2d-a437-979506e6b988 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 11:59:31 compute-0 nova_compute[187208]: 2025-12-05 11:59:31.138 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:31 compute-0 nova_compute[187208]: 2025-12-05 11:59:31.139 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquired lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:31 compute-0 nova_compute[187208]: 2025-12-05 11:59:31.139 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:59:31 compute-0 nova_compute[187208]: 2025-12-05 11:59:31.533 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:32 compute-0 nova_compute[187208]: 2025-12-05 11:59:32.855 187212 DEBUG nova.compute.manager [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received event network-changed-a5ad03eb-1959-4b2d-a437-979506e6b988 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:32 compute-0 nova_compute[187208]: 2025-12-05 11:59:32.856 187212 DEBUG nova.compute.manager [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Refreshing instance network info cache due to event network-changed-a5ad03eb-1959-4b2d-a437-979506e6b988. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 11:59:32 compute-0 nova_compute[187208]: 2025-12-05 11:59:32.856 187212 DEBUG oslo_concurrency.lockutils [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.107 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.107 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.108 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 11:59:33 compute-0 podman[214066]: 2025-12-05 11:59:33.211106621 +0000 UTC m=+0.064222418 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.297 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935958.296861, d4d2145a-a261-4c1c-82e7-3f595f46aec6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.298 187212 INFO nova.compute.manager [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] VM Stopped (Lifecycle Event)
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.320 187212 DEBUG nova.compute.manager [None req-db1724b9-21dd-4ed2-83e1-a04e9f2ff516 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.456 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.456 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.456 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 11:59:33 compute-0 nova_compute[187208]: 2025-12-05 11:59:33.456 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.654 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.911 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Updating instance_info_cache with network_info: [{"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:34.945 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 11:59:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:34.948 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.959 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Releasing lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.960 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance network_info: |[{"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.960 187212 DEBUG oslo_concurrency.lockutils [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.961 187212 DEBUG nova.network.neutron [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Refreshing network info cache for port a5ad03eb-1959-4b2d-a437-979506e6b988 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.964 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start _get_guest_xml network_info=[{"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.971 187212 WARNING nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.987 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.988 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.997 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.998 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:59:34 compute-0 nova_compute[187208]: 2025-12-05 11:59:34.999 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.000 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.000 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.001 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.001 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.001 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.002 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.002 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.003 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.003 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.004 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.004 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.009 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-1',id=3,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:54Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=e83b5d7d-04a7-44d9-a6fe-580f1cfa5838,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.010 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.011 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.013 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'pci_devices' on Instance uuid e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.037 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <uuid>e83b5d7d-04a7-44d9-a6fe-580f1cfa5838</uuid>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <name>instance-00000003</name>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <nova:name>tempest-tempest.common.compute-instance-445293436-1</nova:name>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:59:34</nova:creationTime>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:59:35 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:59:35 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:59:35 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:59:35 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:59:35 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:59:35 compute-0 nova_compute[187208]:         <nova:user uuid="c4c62f22ba09455995ea1bde6a93431e">tempest-AutoAllocateNetworkTest-275048159-project-member</nova:user>
Dec 05 11:59:35 compute-0 nova_compute[187208]:         <nova:project uuid="fb2c9c006bee4723bc8dd108e19a6728">tempest-AutoAllocateNetworkTest-275048159</nova:project>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 11:59:35 compute-0 nova_compute[187208]:         <nova:port uuid="a5ad03eb-1959-4b2d-a437-979506e6b988">
Dec 05 11:59:35 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.1.0.55" ipVersion="4"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="fdfe:381f:8400::38b" ipVersion="6"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <system>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <entry name="serial">e83b5d7d-04a7-44d9-a6fe-580f1cfa5838</entry>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <entry name="uuid">e83b5d7d-04a7-44d9-a6fe-580f1cfa5838</entry>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     </system>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <os>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   </os>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <features>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   </features>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:2b:76:46"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <target dev="tapa5ad03eb-19"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     </interface>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/console.log" append="off"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <video>
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     </video>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:59:35 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:59:35 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:59:35 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:59:35 compute-0 nova_compute[187208]: </domain>
Dec 05 11:59:35 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.039 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Preparing to wait for external event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.039 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.040 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.040 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.041 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-1',id=3,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:54Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=e83b5d7d-04a7-44d9-a6fe-580f1cfa5838,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.042 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.043 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.043 187212 DEBUG os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.081 187212 DEBUG ovsdbapp.backend.ovs_idl [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.081 187212 DEBUG ovsdbapp.backend.ovs_idl [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.082 187212 DEBUG ovsdbapp.backend.ovs_idl [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.082 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.082 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [POLLOUT] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.082 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.083 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.084 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.086 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.093 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.093 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.094 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.095 187212 INFO oslo.privsep.daemon [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpq2dvfmb4/privsep.sock']
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.827 187212 INFO oslo.privsep.daemon [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Spawned new privsep daemon via rootwrap
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.700 214095 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.705 214095 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.707 214095 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 05 11:59:35 compute-0 nova_compute[187208]: 2025-12-05 11:59:35.707 214095 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214095
Dec 05 11:59:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:35.951 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.129 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa5ad03eb-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.129 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa5ad03eb-19, col_values=(('external_ids', {'iface-id': 'a5ad03eb-1959-4b2d-a437-979506e6b988', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:76:46', 'vm-uuid': 'e83b5d7d-04a7-44d9-a6fe-580f1cfa5838'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.131 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:36 compute-0 NetworkManager[55691]: <info>  [1764935976.1323] manager: (tapa5ad03eb-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.137 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.138 187212 INFO os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19')
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.144 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.160 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.160 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.161 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.162 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.162 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.163 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.163 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.188 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.188 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.188 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.189 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.346 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.346 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.346 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No VIF found with MAC fa:16:3e:2b:76:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.347 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Using config drive
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.377 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.462 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.463 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.520 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.528 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.603 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.604 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.662 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.663 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000003, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config'
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.815 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.816 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5724MB free_disk=73.31016540527344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.817 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.817 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.921 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance caa6c7c3-7eb3-4636-a7ad-7b605ef393ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 04518502-62f1-44c3-8c57-b3404958536f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b2e8212c-084c-4a4f-b930-56560ae4da12 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 11:59:36 compute-0 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 11:59:37 compute-0 nova_compute[187208]: 2025-12-05 11:59:37.013 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:59:37 compute-0 nova_compute[187208]: 2025-12-05 11:59:37.026 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:59:37 compute-0 nova_compute[187208]: 2025-12-05 11:59:37.050 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 11:59:37 compute-0 nova_compute[187208]: 2025-12-05 11:59:37.050 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:37 compute-0 nova_compute[187208]: 2025-12-05 11:59:37.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:37 compute-0 nova_compute[187208]: 2025-12-05 11:59:37.947 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:37 compute-0 nova_compute[187208]: 2025-12-05 11:59:37.948 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:37 compute-0 nova_compute[187208]: 2025-12-05 11:59:37.979 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:37 compute-0 nova_compute[187208]: 2025-12-05 11:59:37.979 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 11:59:38 compute-0 nova_compute[187208]: 2025-12-05 11:59:38.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 11:59:38 compute-0 nova_compute[187208]: 2025-12-05 11:59:38.748 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Automatically allocated network: {'id': 'ca5a0748-2268-4f31-a673-9ef2606c4273', 'name': 'auto_allocated_network', 'tenant_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['3601d1a0-6de8-4dc7-839c-f1b2d1901b80', 'c438a52b-4019-49b5-8423-28f69ccbec64'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-12-05T11:58:56Z', 'updated_at': '2025-12-05T11:59:18Z', 'revision_number': 4, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Dec 05 11:59:38 compute-0 nova_compute[187208]: 2025-12-05 11:59:38.749 187212 DEBUG nova.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 11:59:39 compute-0 nova_compute[187208]: 2025-12-05 11:59:39.154 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Creating config drive at /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config
Dec 05 11:59:39 compute-0 nova_compute[187208]: 2025-12-05 11:59:39.160 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxwg12it9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:39 compute-0 nova_compute[187208]: 2025-12-05 11:59:39.285 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxwg12it9" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:39 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 05 11:59:39 compute-0 kernel: tapa5ad03eb-19: entered promiscuous mode
Dec 05 11:59:39 compute-0 NetworkManager[55691]: <info>  [1764935979.3566] manager: (tapa5ad03eb-19): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Dec 05 11:59:39 compute-0 nova_compute[187208]: 2025-12-05 11:59:39.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:39 compute-0 ovn_controller[95610]: 2025-12-05T11:59:39Z|00033|binding|INFO|Claiming lport a5ad03eb-1959-4b2d-a437-979506e6b988 for this chassis.
Dec 05 11:59:39 compute-0 ovn_controller[95610]: 2025-12-05T11:59:39Z|00034|binding|INFO|a5ad03eb-1959-4b2d-a437-979506e6b988: Claiming fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b
Dec 05 11:59:39 compute-0 systemd-udevd[214132]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:59:39 compute-0 NetworkManager[55691]: <info>  [1764935979.4220] device (tapa5ad03eb-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:59:39 compute-0 NetworkManager[55691]: <info>  [1764935979.4227] device (tapa5ad03eb-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 11:59:39 compute-0 nova_compute[187208]: 2025-12-05 11:59:39.484 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:39 compute-0 ovn_controller[95610]: 2025-12-05T11:59:39Z|00035|binding|INFO|Setting lport a5ad03eb-1959-4b2d-a437-979506e6b988 ovn-installed in OVS
Dec 05 11:59:39 compute-0 nova_compute[187208]: 2025-12-05 11:59:39.490 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:39 compute-0 systemd-machined[153543]: New machine qemu-7-instance-00000003.
Dec 05 11:59:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:39.527 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b'], port_security=['fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.55/26 fdfe:381f:8400::38b/64', 'neutron:device_id': 'e83b5d7d-04a7-44d9-a6fe-580f1cfa5838', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a5ad03eb-1959-4b2d-a437-979506e6b988) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 11:59:39 compute-0 ovn_controller[95610]: 2025-12-05T11:59:39Z|00036|binding|INFO|Setting lport a5ad03eb-1959-4b2d-a437-979506e6b988 up in Southbound
Dec 05 11:59:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:39.528 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a5ad03eb-1959-4b2d-a437-979506e6b988 in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 bound to our chassis
Dec 05 11:59:39 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000003.
Dec 05 11:59:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:39.530 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273
Dec 05 11:59:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:39.532 104471 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpppfurjfv/privsep.sock']
Dec 05 11:59:39 compute-0 nova_compute[187208]: 2025-12-05 11:59:39.992 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935979.991877, e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:39 compute-0 nova_compute[187208]: 2025-12-05 11:59:39.993 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] VM Started (Lifecycle Event)
Dec 05 11:59:40 compute-0 nova_compute[187208]: 2025-12-05 11:59:40.011 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:40 compute-0 nova_compute[187208]: 2025-12-05 11:59:40.015 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935979.992355, e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:40 compute-0 nova_compute[187208]: 2025-12-05 11:59:40.015 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] VM Paused (Lifecycle Event)
Dec 05 11:59:40 compute-0 nova_compute[187208]: 2025-12-05 11:59:40.038 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:40 compute-0 nova_compute[187208]: 2025-12-05 11:59:40.040 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:40 compute-0 nova_compute[187208]: 2025-12-05 11:59:40.061 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.208 104471 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.209 104471 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpppfurjfv/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.086 214158 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.090 214158 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.091 214158 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.092 214158 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214158
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.213 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc3d578-a5db-4d98-8404-fee48b08ddd9]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.799 214158 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.800 214158 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.800 214158 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:41 compute-0 nova_compute[187208]: 2025-12-05 11:59:41.132 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:41 compute-0 nova_compute[187208]: 2025-12-05 11:59:41.225 187212 DEBUG nova.network.neutron [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Updated VIF entry in instance network info cache for port a5ad03eb-1959-4b2d-a437-979506e6b988. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 11:59:41 compute-0 nova_compute[187208]: 2025-12-05 11:59:41.225 187212 DEBUG nova.network.neutron [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Updating instance_info_cache with network_info: [{"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:41 compute-0 nova_compute[187208]: 2025-12-05 11:59:41.290 187212 DEBUG oslo_concurrency.lockutils [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.382 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4655addf-07ee-41bf-925c-a4d94b663704]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.384 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapca5a0748-21 in ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 11:59:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.386 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapca5a0748-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 11:59:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.386 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3bac0a-4bc8-4f78-898e-37bc1d732cc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.389 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3d20a50b-b439-4ae6-be62-fd0838ec19b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.409 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[19e1ab11-b796-4f1d-8273-bf5ad4d08955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.421 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[182f53da-6cfa-47d9-8807-5570c8de5a3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.424 104471 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp1rim3niy/privsep.sock']
Dec 05 11:59:41 compute-0 podman[214167]: 2025-12-05 11:59:41.509118522 +0000 UTC m=+0.077455821 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 11:59:41 compute-0 nova_compute[187208]: 2025-12-05 11:59:41.536 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Automatically allocated network: {'id': 'ca5a0748-2268-4f31-a673-9ef2606c4273', 'name': 'auto_allocated_network', 'tenant_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['3601d1a0-6de8-4dc7-839c-f1b2d1901b80', 'c438a52b-4019-49b5-8423-28f69ccbec64'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-12-05T11:58:56Z', 'updated_at': '2025-12-05T11:59:18Z', 'revision_number': 4, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Dec 05 11:59:41 compute-0 nova_compute[187208]: 2025-12-05 11:59:41.537 187212 DEBUG nova.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 11:59:41 compute-0 nova_compute[187208]: 2025-12-05 11:59:41.578 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Successfully created port: 0d1b5558-6557-43e9-8cac-a00b4e97ea8b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.132 104471 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.133 104471 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1rim3niy/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.994 214193 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.998 214193 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.000 214193 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.000 214193 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214193
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.136 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[668690a6-dc49-42b1-b03a-ff3ee1d69622]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:42 compute-0 nova_compute[187208]: 2025-12-05 11:59:42.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.620 214193 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.620 214193 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.621 214193 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.221 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2345a418-2b50-4221-9625-1a90a97d22c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.247 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1273ab9b-a534-4039-9649-16d8fed359fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 NetworkManager[55691]: <info>  [1764935983.2488] manager: (tapca5a0748-20): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Dec 05 11:59:43 compute-0 nova_compute[187208]: 2025-12-05 11:59:43.251 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Successfully created port: 06886ab7-aa74-4f44-b509-94e27d585818 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 11:59:43 compute-0 systemd-udevd[214205]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.284 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9e041ebd-9d8c-4aab-a491-2b3742ad8997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.287 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f089ad38-4e49-49e2-8d34-ab49c9781893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 NetworkManager[55691]: <info>  [1764935983.3154] device (tapca5a0748-20): carrier: link connected
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.324 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0c306d05-7b09-4b6f-a1ba-ac723a6f2889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.342 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0cbb79b2-6538-41f0-9a11-5c2b910f27c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214223, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.358 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f55174c6-bf97-4ab9-bada-49425e51a430]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed9:49b1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336387, 'tstamp': 336387}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214224, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.375 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd310d7-e336-44ed-8781-acfde9972292]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214225, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.410 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5dabe1dc-c799-40ca-aefd-b08319a23390]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.472 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6daa22-4bf7-463e-bff6-b7b6fe24729a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.474 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.474 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.475 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:43 compute-0 nova_compute[187208]: 2025-12-05 11:59:43.476 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:43 compute-0 NetworkManager[55691]: <info>  [1764935983.4772] manager: (tapca5a0748-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Dec 05 11:59:43 compute-0 kernel: tapca5a0748-20: entered promiscuous mode
Dec 05 11:59:43 compute-0 nova_compute[187208]: 2025-12-05 11:59:43.479 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.481 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:43 compute-0 nova_compute[187208]: 2025-12-05 11:59:43.482 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:43 compute-0 ovn_controller[95610]: 2025-12-05T11:59:43Z|00037|binding|INFO|Releasing lport 4248cb8a-d980-4682-8c47-d6faac0a26bc from this chassis (sb_readonly=0)
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.483 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ca5a0748-2268-4f31-a673-9ef2606c4273.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ca5a0748-2268-4f31-a673-9ef2606c4273.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.484 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3d953b3c-df09-4a3b-a118-80d8bfb6d1ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.485 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: global
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-ca5a0748-2268-4f31-a673-9ef2606c4273
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/ca5a0748-2268-4f31-a673-9ef2606c4273.pid.haproxy
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID ca5a0748-2268-4f31-a673-9ef2606c4273
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 11:59:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.485 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'env', 'PROCESS_TAG=haproxy-ca5a0748-2268-4f31-a673-9ef2606c4273', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ca5a0748-2268-4f31-a673-9ef2606c4273.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 11:59:43 compute-0 nova_compute[187208]: 2025-12-05 11:59:43.494 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:43 compute-0 podman[214258]: 2025-12-05 11:59:43.872884196 +0000 UTC m=+0.057640237 container create 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 11:59:43 compute-0 systemd[1]: Started libpod-conmon-99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31.scope.
Dec 05 11:59:43 compute-0 podman[214258]: 2025-12-05 11:59:43.842819139 +0000 UTC m=+0.027575190 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 11:59:43 compute-0 nova_compute[187208]: 2025-12-05 11:59:43.938 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Successfully updated port: 0d1b5558-6557-43e9-8cac-a00b4e97ea8b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 11:59:43 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:59:43 compute-0 nova_compute[187208]: 2025-12-05 11:59:43.953 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:43 compute-0 nova_compute[187208]: 2025-12-05 11:59:43.953 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquired lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:43 compute-0 nova_compute[187208]: 2025-12-05 11:59:43.954 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:59:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77b012e43fc6df7609492b693cc8452628271339171d87e515263e44dc855891/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 11:59:43 compute-0 podman[214258]: 2025-12-05 11:59:43.968834555 +0000 UTC m=+0.153590596 container init 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 05 11:59:43 compute-0 podman[214258]: 2025-12-05 11:59:43.974191982 +0000 UTC m=+0.158948013 container start 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 11:59:43 compute-0 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [NOTICE]   (214277) : New worker (214279) forked
Dec 05 11:59:43 compute-0 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [NOTICE]   (214277) : Loading success.
Dec 05 11:59:44 compute-0 nova_compute[187208]: 2025-12-05 11:59:44.379 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:44 compute-0 nova_compute[187208]: 2025-12-05 11:59:44.611 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Successfully updated port: 06886ab7-aa74-4f44-b509-94e27d585818 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 11:59:44 compute-0 nova_compute[187208]: 2025-12-05 11:59:44.650 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:44 compute-0 nova_compute[187208]: 2025-12-05 11:59:44.650 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquired lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:44 compute-0 nova_compute[187208]: 2025-12-05 11:59:44.651 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:59:44 compute-0 nova_compute[187208]: 2025-12-05 11:59:44.983 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:44 compute-0 nova_compute[187208]: 2025-12-05 11:59:44.984 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.050 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.117 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.146 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.147 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.163 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.163 187212 INFO nova.compute.claims [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.376 187212 DEBUG nova.compute.provider_tree [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.445 187212 DEBUG nova.scheduler.client.report [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.482 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.483 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.535 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.535 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.555 187212 INFO nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.571 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.656 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.658 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.658 187212 INFO nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Creating image(s)
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.659 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.659 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.660 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.672 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.729 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.731 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.732 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.742 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.804 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.805 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.839 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.840 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.840 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.903 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.905 187212 DEBUG nova.virt.disk.api [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Checking if we can resize image /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.906 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.957 187212 DEBUG nova.policy [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.967 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.968 187212 DEBUG nova.virt.disk.api [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Cannot resize image /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:59:45 compute-0 nova_compute[187208]: 2025-12-05 11:59:45.969 187212 DEBUG nova.objects.instance [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lazy-loading 'migration_context' on Instance uuid 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.077 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.078 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Ensure instance console log exists: /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.079 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.079 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.079 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.135 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.151 187212 DEBUG nova.compute.manager [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-changed-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.152 187212 DEBUG nova.compute.manager [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Refreshing instance network info cache due to event network-changed-0d1b5558-6557-43e9-8cac-a00b4e97ea8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.152 187212 DEBUG oslo_concurrency.lockutils [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:46 compute-0 podman[214303]: 2025-12-05 11:59:46.207626351 +0000 UTC m=+0.058030217 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.345 187212 DEBUG nova.compute.manager [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-changed-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.346 187212 DEBUG nova.compute.manager [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Refreshing instance network info cache due to event network-changed-06886ab7-aa74-4f44-b509-94e27d585818. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 11:59:46 compute-0 nova_compute[187208]: 2025-12-05 11:59:46.346 187212 DEBUG oslo_concurrency.lockutils [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:47 compute-0 nova_compute[187208]: 2025-12-05 11:59:47.170 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.463 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Successfully created port: 9275d01b-3eb9-429b-a0ba-0cb60048987a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.466 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Updating instance_info_cache with network_info: [{"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.484 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Releasing lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.485 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance network_info: |[{"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.485 187212 DEBUG oslo_concurrency.lockutils [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.485 187212 DEBUG nova.network.neutron [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Refreshing network info cache for port 0d1b5558-6557-43e9-8cac-a00b4e97ea8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.488 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start _get_guest_xml network_info=[{"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.493 187212 WARNING nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.498 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.499 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.509 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.509 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.510 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.510 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.511 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.511 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.511 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.511 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.513 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.517 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-3',id=6,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=b2e8212c-084c-4a4f-b930-56560ae4da12,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.518 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.519 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.520 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'pci_devices' on Instance uuid b2e8212c-084c-4a4f-b930-56560ae4da12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.539 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <uuid>b2e8212c-084c-4a4f-b930-56560ae4da12</uuid>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <name>instance-00000006</name>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:name>tempest-tempest.common.compute-instance-445293436-3</nova:name>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:59:48</nova:creationTime>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:user uuid="c4c62f22ba09455995ea1bde6a93431e">tempest-AutoAllocateNetworkTest-275048159-project-member</nova:user>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:project uuid="fb2c9c006bee4723bc8dd108e19a6728">tempest-AutoAllocateNetworkTest-275048159</nova:project>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:port uuid="0d1b5558-6557-43e9-8cac-a00b4e97ea8b">
Dec 05 11:59:48 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.1.0.6" ipVersion="4"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="fdfe:381f:8400::100" ipVersion="6"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <system>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="serial">b2e8212c-084c-4a4f-b930-56560ae4da12</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="uuid">b2e8212c-084c-4a4f-b930-56560ae4da12</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </system>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <os>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </os>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <features>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </features>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.config"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:05:76:3a"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <target dev="tap0d1b5558-65"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </interface>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/console.log" append="off"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <video>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </video>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:59:48 compute-0 nova_compute[187208]: </domain>
Dec 05 11:59:48 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.539 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Preparing to wait for external event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.539 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.540 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.540 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.540 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-3',id=6,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=b2e8212c-084c-4a4f-b930-56560ae4da12,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.541 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.541 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.542 187212 DEBUG os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.542 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.543 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.543 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.545 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.546 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d1b5558-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.546 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0d1b5558-65, col_values=(('external_ids', {'iface-id': '0d1b5558-6557-43e9-8cac-a00b4e97ea8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:76:3a', 'vm-uuid': 'b2e8212c-084c-4a4f-b930-56560ae4da12'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:48 compute-0 NetworkManager[55691]: <info>  [1764935988.5484] manager: (tap0d1b5558-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.547 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.554 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.555 187212 INFO os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65')
Dec 05 11:59:48 compute-0 podman[214323]: 2025-12-05 11:59:48.565329658 +0000 UTC m=+0.065931945 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.616 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.617 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.617 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No VIF found with MAC fa:16:3e:05:76:3a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.617 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Using config drive
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.662 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Updating instance_info_cache with network_info: [{"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.678 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Releasing lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.678 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance network_info: |[{"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.678 187212 DEBUG oslo_concurrency.lockutils [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.678 187212 DEBUG nova.network.neutron [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Refreshing network info cache for port 06886ab7-aa74-4f44-b509-94e27d585818 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.681 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start _get_guest_xml network_info=[{"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.684 187212 WARNING nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.689 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.689 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.692 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.692 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.692 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.692 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.694 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.694 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.694 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.694 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.697 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-2',id=5,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=04518502-62f1-44c3-8c57-b3404958536f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.697 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.698 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.698 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'pci_devices' on Instance uuid 04518502-62f1-44c3-8c57-b3404958536f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.713 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <uuid>04518502-62f1-44c3-8c57-b3404958536f</uuid>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <name>instance-00000005</name>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:name>tempest-tempest.common.compute-instance-445293436-2</nova:name>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:59:48</nova:creationTime>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:user uuid="c4c62f22ba09455995ea1bde6a93431e">tempest-AutoAllocateNetworkTest-275048159-project-member</nova:user>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:project uuid="fb2c9c006bee4723bc8dd108e19a6728">tempest-AutoAllocateNetworkTest-275048159</nova:project>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         <nova:port uuid="06886ab7-aa74-4f44-b509-94e27d585818">
Dec 05 11:59:48 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.1.0.8" ipVersion="4"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="fdfe:381f:8400::241" ipVersion="6"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <system>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="serial">04518502-62f1-44c3-8c57-b3404958536f</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="uuid">04518502-62f1-44c3-8c57-b3404958536f</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </system>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <os>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </os>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <features>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </features>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.config"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:61:58:b9"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <target dev="tap06886ab7-aa"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </interface>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/console.log" append="off"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <video>
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </video>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:59:48 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:59:48 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:59:48 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:59:48 compute-0 nova_compute[187208]: </domain>
Dec 05 11:59:48 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.714 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Preparing to wait for external event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.714 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.714 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.714 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.715 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-2',id=5,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=04518502-62f1-44c3-8c57-b3404958536f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.715 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.716 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.716 187212 DEBUG os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.717 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.717 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.717 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.719 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.720 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06886ab7-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.720 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06886ab7-aa, col_values=(('external_ids', {'iface-id': '06886ab7-aa74-4f44-b509-94e27d585818', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:58:b9', 'vm-uuid': '04518502-62f1-44c3-8c57-b3404958536f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:48 compute-0 NetworkManager[55691]: <info>  [1764935988.7222] manager: (tap06886ab7-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.721 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.724 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.728 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.729 187212 INFO os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa')
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.770 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.771 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.771 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No VIF found with MAC fa:16:3e:61:58:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 11:59:48 compute-0 nova_compute[187208]: 2025-12-05 11:59:48.772 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Using config drive
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.131 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Creating config drive at /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.config
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.137 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmk4fwg1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.186 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Creating config drive at /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.config
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.191 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6uro0gci execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.269 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmk4fwg1" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.314 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6uro0gci" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:49 compute-0 NetworkManager[55691]: <info>  [1764935989.3332] manager: (tap06886ab7-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Dec 05 11:59:49 compute-0 kernel: tap06886ab7-aa: entered promiscuous mode
Dec 05 11:59:49 compute-0 ovn_controller[95610]: 2025-12-05T11:59:49Z|00038|binding|INFO|Claiming lport 06886ab7-aa74-4f44-b509-94e27d585818 for this chassis.
Dec 05 11:59:49 compute-0 ovn_controller[95610]: 2025-12-05T11:59:49Z|00039|binding|INFO|06886ab7-aa74-4f44-b509-94e27d585818: Claiming fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.340 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.349 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241'], port_security=['fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.8/26 fdfe:381f:8400::241/64', 'neutron:device_id': '04518502-62f1-44c3-8c57-b3404958536f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=06886ab7-aa74-4f44-b509-94e27d585818) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.350 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 06886ab7-aa74-4f44-b509-94e27d585818 in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 bound to our chassis
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.352 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273
Dec 05 11:59:49 compute-0 systemd-udevd[214375]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.373 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8b4c09-27d7-45b9-bbba-8ffcb895c025]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 ovn_controller[95610]: 2025-12-05T11:59:49Z|00040|binding|INFO|Setting lport 06886ab7-aa74-4f44-b509-94e27d585818 ovn-installed in OVS
Dec 05 11:59:49 compute-0 ovn_controller[95610]: 2025-12-05T11:59:49Z|00041|binding|INFO|Setting lport 06886ab7-aa74-4f44-b509-94e27d585818 up in Southbound
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.381 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:49 compute-0 kernel: tap0d1b5558-65: entered promiscuous mode
Dec 05 11:59:49 compute-0 NetworkManager[55691]: <info>  [1764935989.3863] manager: (tap0d1b5558-65): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Dec 05 11:59:49 compute-0 ovn_controller[95610]: 2025-12-05T11:59:49Z|00042|binding|INFO|Claiming lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b for this chassis.
Dec 05 11:59:49 compute-0 ovn_controller[95610]: 2025-12-05T11:59:49Z|00043|binding|INFO|0d1b5558-6557-43e9-8cac-a00b4e97ea8b: Claiming fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100
Dec 05 11:59:49 compute-0 systemd-udevd[214385]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.389 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:49 compute-0 NetworkManager[55691]: <info>  [1764935989.3945] device (tap06886ab7-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:59:49 compute-0 NetworkManager[55691]: <info>  [1764935989.3962] device (tap06886ab7-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.393 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100'], port_security=['fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.6/26 fdfe:381f:8400::100/64', 'neutron:device_id': 'b2e8212c-084c-4a4f-b930-56560ae4da12', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0d1b5558-6557-43e9-8cac-a00b4e97ea8b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 11:59:49 compute-0 ovn_controller[95610]: 2025-12-05T11:59:49Z|00044|binding|INFO|Setting lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b ovn-installed in OVS
Dec 05 11:59:49 compute-0 ovn_controller[95610]: 2025-12-05T11:59:49Z|00045|binding|INFO|Setting lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b up in Southbound
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:49 compute-0 NetworkManager[55691]: <info>  [1764935989.4049] device (tap0d1b5558-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:59:49 compute-0 NetworkManager[55691]: <info>  [1764935989.4058] device (tap0d1b5558-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.405 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.406 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[37cde7ab-37ca-414f-9d51-81fef53dcd98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.409 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.410 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[84365e37-eb13-46f3-bcca-5ae955efee08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 systemd-machined[153543]: New machine qemu-8-instance-00000005.
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.436 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b88566b5-2230-4744-8423-cf457fe22554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000005.
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.453 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[81180780-57b1-4a0c-a5d9-c4ee4e41602a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214393, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.467 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[416c6d41-d5dc-467e-b39b-a3df0682fcd2]: (4, ({'family': 2, 'prefixlen': 26, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.0.2'], ['IFA_LOCAL', '10.1.0.2'], ['IFA_BROADCAST', '10.1.0.63'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336400, 'tstamp': 336400}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214396, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336403, 'tstamp': 336403}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214396, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 systemd-machined[153543]: New machine qemu-9-instance-00000006.
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.468 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:49 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000006.
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.469 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.471 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.471 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.471 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.472 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.472 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.473 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0d1b5558-6557-43e9-8cac-a00b4e97ea8b in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 unbound from our chassis
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.475 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.487 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[70f242d4-a54d-4cff-98a0-b3dabb4f6e6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.510 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[65e56bc7-5806-430d-b85c-d128e4aebf27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.513 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d56ea429-b1aa-435b-bc29-71cf1cd0d7b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.536 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c5830703-d0ed-40f8-bb69-ee7a8fc2741e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.554 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6c965220-76a4-4ae2-9c42-d1ceebb2ce41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214415, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.574 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72627b1d-ff34-4149-9bbb-cf53b8e54c03]: (4, ({'family': 2, 'prefixlen': 26, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.0.2'], ['IFA_LOCAL', '10.1.0.2'], ['IFA_BROADCAST', '10.1.0.63'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336400, 'tstamp': 336400}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214416, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336403, 'tstamp': 336403}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214416, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.576 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.745 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935989.7450576, 04518502-62f1-44c3-8c57-b3404958536f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.746 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] VM Started (Lifecycle Event)
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.763 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.769 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935989.7452853, 04518502-62f1-44c3-8c57-b3404958536f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.769 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] VM Paused (Lifecycle Event)
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.786 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.792 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.811 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.840 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935989.840538, b2e8212c-084c-4a4f-b930-56560ae4da12 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.841 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] VM Started (Lifecycle Event)
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.862 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.865 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935989.8428938, b2e8212c-084c-4a4f-b930-56560ae4da12 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.865 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] VM Paused (Lifecycle Event)
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.887 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.889 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:49 compute-0 nova_compute[187208]: 2025-12-05 11:59:49.915 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.217 187212 DEBUG nova.network.neutron [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Updated VIF entry in instance network info cache for port 06886ab7-aa74-4f44-b509-94e27d585818. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.218 187212 DEBUG nova.network.neutron [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Updating instance_info_cache with network_info: [{"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.234 187212 DEBUG oslo_concurrency.lockutils [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.750 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Successfully updated port: 9275d01b-3eb9-429b-a0ba-0cb60048987a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.767 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.767 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquired lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.768 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.930 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.930 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:50 compute-0 nova_compute[187208]: 2025-12-05 11:59:50.956 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.008 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.042 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.043 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.049 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.049 187212 INFO nova.compute.claims [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.298 187212 DEBUG nova.compute.provider_tree [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.313 187212 DEBUG nova.scheduler.client.report [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.336 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.336 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.394 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.394 187212 DEBUG nova.network.neutron [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.411 187212 INFO nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.425 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.507 187212 DEBUG nova.network.neutron [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Updated VIF entry in instance network info cache for port 0d1b5558-6557-43e9-8cac-a00b4e97ea8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.508 187212 DEBUG nova.network.neutron [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Updating instance_info_cache with network_info: [{"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.521 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.523 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.524 187212 INFO nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Creating image(s)
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.525 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.525 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.526 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.556 187212 DEBUG oslo_concurrency.lockutils [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.558 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.623 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.624 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.625 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.654 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.721 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.723 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.768 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.770 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.771 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.845 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.847 187212 DEBUG nova.virt.disk.api [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Checking if we can resize image /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.849 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.871 187212 DEBUG nova.network.neutron [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.872 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.908 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.910 187212 DEBUG nova.virt.disk.api [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Cannot resize image /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.910 187212 DEBUG nova.objects.instance [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'migration_context' on Instance uuid 5150eaf5-c0ca-48ab-9045-af5a1c785c8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.924 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.925 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Ensure instance console log exists: /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.925 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.926 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.926 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.928 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.931 187212 DEBUG nova.compute.manager [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.932 187212 DEBUG nova.compute.manager [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing instance network info cache due to event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.932 187212 DEBUG oslo_concurrency.lockutils [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.937 187212 WARNING nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.942 187212 DEBUG nova.virt.libvirt.host [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.943 187212 DEBUG nova.virt.libvirt.host [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.945 187212 DEBUG nova.virt.libvirt.host [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.946 187212 DEBUG nova.virt.libvirt.host [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.946 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.946 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.947 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.947 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.948 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.948 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.948 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.948 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.949 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.949 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.949 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.949 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.953 187212 DEBUG nova.objects.instance [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5150eaf5-c0ca-48ab-9045-af5a1c785c8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:51 compute-0 nova_compute[187208]: 2025-12-05 11:59:51.966 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <uuid>5150eaf5-c0ca-48ab-9045-af5a1c785c8e</uuid>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <name>instance-0000000b</name>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <nova:name>tempest-LiveMigrationNegativeTest-server-865064456</nova:name>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:59:51</nova:creationTime>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:59:51 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:59:51 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:59:51 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:59:51 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:59:51 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:59:51 compute-0 nova_compute[187208]:         <nova:user uuid="28407300b110465d9748f60fa4ee4945">tempest-LiveMigrationNegativeTest-1771731310-project-member</nova:user>
Dec 05 11:59:51 compute-0 nova_compute[187208]:         <nova:project uuid="6592a6d983f44d9e94749f0e3e94c689">tempest-LiveMigrationNegativeTest-1771731310</nova:project>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <system>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <entry name="serial">5150eaf5-c0ca-48ab-9045-af5a1c785c8e</entry>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <entry name="uuid">5150eaf5-c0ca-48ab-9045-af5a1c785c8e</entry>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     </system>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <os>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   </os>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <features>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   </features>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.config"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/console.log" append="off"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <video>
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     </video>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:59:51 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:59:51 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:59:51 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:59:51 compute-0 nova_compute[187208]: </domain>
Dec 05 11:59:51 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.038 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.038 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.039 187212 INFO nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Using config drive
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.439 187212 INFO nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Creating config drive at /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.config
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.451 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjteqlr9h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.520 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.542 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Releasing lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.543 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance network_info: |[{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.543 187212 DEBUG oslo_concurrency.lockutils [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.544 187212 DEBUG nova.network.neutron [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.549 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start _get_guest_xml network_info=[{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.555 187212 WARNING nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.561 187212 DEBUG nova.virt.libvirt.host [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.562 187212 DEBUG nova.virt.libvirt.host [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.571 187212 DEBUG nova.virt.libvirt.host [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.572 187212 DEBUG nova.virt.libvirt.host [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.573 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.574 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.575 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.576 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.577 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.577 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.578 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.579 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.579 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.580 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.580 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.580 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.585 187212 DEBUG nova.virt.libvirt.vif [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-147223876',id=10,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d2f26b00364f84b1702bb7219b8d31',ramdisk_id='',reservation_id='r-5f154sc3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:59:45Z,user_data=None,user_id='d4754b88440a4ea08a37067ef9234672',uuid=597f2994-fdad-46b1-9ef7-f56d62b4bbd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.585 187212 DEBUG nova.network.os_vif_util [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converting VIF {"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.586 187212 DEBUG nova.network.os_vif_util [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.587 187212 DEBUG nova.objects.instance [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lazy-loading 'pci_devices' on Instance uuid 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.588 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjteqlr9h" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.603 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] End _get_guest_xml xml=<domain type="kvm">
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <uuid>597f2994-fdad-46b1-9ef7-f56d62b4bbd0</uuid>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <name>instance-0000000a</name>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <metadata>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876</nova:name>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 11:59:52</nova:creationTime>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 11:59:52 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 11:59:52 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 11:59:52 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 11:59:52 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 11:59:52 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 11:59:52 compute-0 nova_compute[187208]:         <nova:user uuid="d4754b88440a4ea08a37067ef9234672">tempest-FloatingIPsAssociationNegativeTestJSON-4920441-project-member</nova:user>
Dec 05 11:59:52 compute-0 nova_compute[187208]:         <nova:project uuid="16d2f26b00364f84b1702bb7219b8d31">tempest-FloatingIPsAssociationNegativeTestJSON-4920441</nova:project>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 11:59:52 compute-0 nova_compute[187208]:         <nova:port uuid="9275d01b-3eb9-429b-a0ba-0cb60048987a">
Dec 05 11:59:52 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   </metadata>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <system>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <entry name="serial">597f2994-fdad-46b1-9ef7-f56d62b4bbd0</entry>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <entry name="uuid">597f2994-fdad-46b1-9ef7-f56d62b4bbd0</entry>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     </system>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <os>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   </os>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <features>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <apic/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   </features>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   </clock>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   </cpu>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   <devices>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.config"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     </disk>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:f5:93:9d"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <target dev="tap9275d01b-3e"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     </interface>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/console.log" append="off"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     </serial>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <video>
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     </video>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     </rng>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 11:59:52 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 11:59:52 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 11:59:52 compute-0 nova_compute[187208]:   </devices>
Dec 05 11:59:52 compute-0 nova_compute[187208]: </domain>
Dec 05 11:59:52 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.603 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Preparing to wait for external event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.604 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.604 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.604 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.605 187212 DEBUG nova.virt.libvirt.vif [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-147223876',id=10,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d2f26b00364f84b1702bb7219b8d31',ramdisk_id='',reservation_id='r-5f154sc3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:59:45Z,user_data=None,user_id='d4754b88440a4ea08a37067ef9234672',uuid=597f2994-fdad-46b1-9ef7-f56d62b4bbd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.605 187212 DEBUG nova.network.os_vif_util [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converting VIF {"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.606 187212 DEBUG nova.network.os_vif_util [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.606 187212 DEBUG os_vif [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.607 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.607 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.607 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.610 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.610 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9275d01b-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.610 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9275d01b-3e, col_values=(('external_ids', {'iface-id': '9275d01b-3eb9-429b-a0ba-0cb60048987a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f5:93:9d', 'vm-uuid': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.611 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:52 compute-0 NetworkManager[55691]: <info>  [1764935992.6125] manager: (tap9275d01b-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.614 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.617 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.618 187212 INFO os_vif [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e')
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.668 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.669 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.669 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] No VIF found with MAC fa:16:3e:f5:93:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 11:59:52 compute-0 nova_compute[187208]: 2025-12-05 11:59:52.670 187212 INFO nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Using config drive
Dec 05 11:59:52 compute-0 systemd-machined[153543]: New machine qemu-10-instance-0000000b.
Dec 05 11:59:52 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000b.
Dec 05 11:59:52 compute-0 podman[214459]: 2025-12-05 11:59:52.71025557 +0000 UTC m=+0.061526423 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 11:59:52 compute-0 podman[214460]: 2025-12-05 11:59:52.745939902 +0000 UTC m=+0.095710524 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.112 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935993.1117496, 5150eaf5-c0ca-48ab-9045-af5a1c785c8e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.113 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] VM Resumed (Lifecycle Event)
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.116 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.117 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.120 187212 INFO nova.virt.libvirt.driver [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance spawned successfully.
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.121 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.127 187212 INFO nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Creating config drive at /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.config
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.133 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgdy85fq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.151 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.157 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.161 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.162 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.162 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.163 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.163 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.164 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.196 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.197 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935993.1165237, 5150eaf5-c0ca-48ab-9045-af5a1c785c8e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.197 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] VM Started (Lifecycle Event)
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.228 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.233 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.242 187212 INFO nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Took 1.72 seconds to spawn the instance on the hypervisor.
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.243 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.254 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.256 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgdy85fq" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.316 187212 INFO nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Took 2.30 seconds to build instance.
Dec 05 11:59:53 compute-0 kernel: tap9275d01b-3e: entered promiscuous mode
Dec 05 11:59:53 compute-0 NetworkManager[55691]: <info>  [1764935993.3235] manager: (tap9275d01b-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Dec 05 11:59:53 compute-0 systemd-udevd[214523]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.324 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:53 compute-0 ovn_controller[95610]: 2025-12-05T11:59:53Z|00046|binding|INFO|Claiming lport 9275d01b-3eb9-429b-a0ba-0cb60048987a for this chassis.
Dec 05 11:59:53 compute-0 ovn_controller[95610]: 2025-12-05T11:59:53Z|00047|binding|INFO|9275d01b-3eb9-429b-a0ba-0cb60048987a: Claiming fa:16:3e:f5:93:9d 10.100.0.7
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.330 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.333 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.335 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.340 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:93:9d 10.100.0.7'], port_security=['fa:16:3e:f5:93:9d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d2f26b00364f84b1702bb7219b8d31', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ba7f2e39-8114-45e5-bd44-4ae84ab46fc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd38fa62-d49e-4607-8d3e-179b767c8786, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9275d01b-3eb9-429b-a0ba-0cb60048987a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.341 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9275d01b-3eb9-429b-a0ba-0cb60048987a in datapath e5a9559e-b860-47a2-b44b-45c7f67f2119 bound to our chassis
Dec 05 11:59:53 compute-0 NetworkManager[55691]: <info>  [1764935993.3436] device (tap9275d01b-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 11:59:53 compute-0 NetworkManager[55691]: <info>  [1764935993.3447] device (tap9275d01b-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.346 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e5a9559e-b860-47a2-b44b-45c7f67f2119
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.361 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d55da232-a444-4415-a03d-5233eb83894e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.361 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape5a9559e-b1 in ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 11:59:53 compute-0 systemd-machined[153543]: New machine qemu-11-instance-0000000a.
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.368 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape5a9559e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.368 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a79f120b-9557-4c47-99db-ff705c6abc62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.369 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c3be518c-1f1e-4547-9570-0ea33e7a0ae2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000a.
Dec 05 11:59:53 compute-0 ovn_controller[95610]: 2025-12-05T11:59:53Z|00048|binding|INFO|Setting lport 9275d01b-3eb9-429b-a0ba-0cb60048987a ovn-installed in OVS
Dec 05 11:59:53 compute-0 ovn_controller[95610]: 2025-12-05T11:59:53Z|00049|binding|INFO|Setting lport 9275d01b-3eb9-429b-a0ba-0cb60048987a up in Southbound
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.394 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b705c481-30d1-4d27-8686-49b4aebf26cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.396 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.412 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[06573a8f-f3f5-49c9-b5c8-ee65f0dcbd3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.466 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9a05c7e9-84ba-4cee-92fd-90c61a63709d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 NetworkManager[55691]: <info>  [1764935993.4754] manager: (tape5a9559e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.473 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[82d80d9f-3d22-4a57-87ce-4c1a9ef5de4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.521 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3824b5f1-f11d-428c-aa1d-5ce179113b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.525 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[37c24756-4840-4ee7-ac1a-07c9e27b7224]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 NetworkManager[55691]: <info>  [1764935993.5603] device (tape5a9559e-b0): carrier: link connected
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.567 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff99cd9-0f25-4d2b-b548-323a60d84680]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.593 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c309175f-8ecd-4327-836a-fe5bad8cc34c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5a9559e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:26:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337412, 'reachable_time': 24464, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214576, 'error': None, 'target': 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.615 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d02d49b3-04a2-4ad9-896b-600e43f29f4d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:26b8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 337412, 'tstamp': 337412}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214577, 'error': None, 'target': 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.641 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[619cceeb-dd3e-4ee4-b54e-0303cacbe918]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5a9559e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:26:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337412, 'reachable_time': 24464, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214578, 'error': None, 'target': 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.699 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0322fcea-3785-4df9-bd1c-0fd4c8e90fa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.777 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d2fad411-5aa3-4ddf-b4af-4424d9fd34ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.778 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5a9559e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.778 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.778 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5a9559e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:53 compute-0 NetworkManager[55691]: <info>  [1764935993.7813] manager: (tape5a9559e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec 05 11:59:53 compute-0 kernel: tape5a9559e-b0: entered promiscuous mode
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.785 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape5a9559e-b0, col_values=(('external_ids', {'iface-id': '79bf1a96-6e90-41b7-8356-9756185de59f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 11:59:53 compute-0 ovn_controller[95610]: 2025-12-05T11:59:53Z|00050|binding|INFO|Releasing lport 79bf1a96-6e90-41b7-8356-9756185de59f from this chassis (sb_readonly=0)
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.786 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:53 compute-0 nova_compute[187208]: 2025-12-05 11:59:53.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.808 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e5a9559e-b860-47a2-b44b-45c7f67f2119.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e5a9559e-b860-47a2-b44b-45c7f67f2119.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.816 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[81b6f407-2650-4358-9f51-b1765d612ce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.817 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: global
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-e5a9559e-b860-47a2-b44b-45c7f67f2119
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/e5a9559e-b860-47a2-b44b-45c7f67f2119.pid.haproxy
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID e5a9559e-b860-47a2-b44b-45c7f67f2119
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 11:59:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.817 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'env', 'PROCESS_TAG=haproxy-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e5a9559e-b860-47a2-b44b-45c7f67f2119.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 11:59:54 compute-0 nova_compute[187208]: 2025-12-05 11:59:54.312 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935994.3124888, 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:54 compute-0 nova_compute[187208]: 2025-12-05 11:59:54.313 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] VM Started (Lifecycle Event)
Dec 05 11:59:54 compute-0 podman[214614]: 2025-12-05 11:59:54.379680027 +0000 UTC m=+0.052311870 container create 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 05 11:59:54 compute-0 podman[214614]: 2025-12-05 11:59:54.348767637 +0000 UTC m=+0.021399510 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 11:59:54 compute-0 nova_compute[187208]: 2025-12-05 11:59:54.598 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:54 compute-0 nova_compute[187208]: 2025-12-05 11:59:54.609 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935994.312636, 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:54 compute-0 nova_compute[187208]: 2025-12-05 11:59:54.610 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] VM Paused (Lifecycle Event)
Dec 05 11:59:54 compute-0 nova_compute[187208]: 2025-12-05 11:59:54.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:54 compute-0 nova_compute[187208]: 2025-12-05 11:59:54.661 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:54 compute-0 systemd[1]: Started libpod-conmon-24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c.scope.
Dec 05 11:59:54 compute-0 nova_compute[187208]: 2025-12-05 11:59:54.678 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:54 compute-0 systemd[1]: Started libcrun container.
Dec 05 11:59:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15ac9401aab02c277b66f0b8b3e087367793eb4fcc0a66aa2c56cb8b76ba06f3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 11:59:54 compute-0 podman[214614]: 2025-12-05 11:59:54.736680626 +0000 UTC m=+0.409312469 container init 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 11:59:54 compute-0 podman[214614]: 2025-12-05 11:59:54.746510286 +0000 UTC m=+0.419142119 container start 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 11:59:54 compute-0 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [NOTICE]   (214635) : New worker (214637) forked
Dec 05 11:59:54 compute-0 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [NOTICE]   (214635) : Loading success.
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.270 187212 DEBUG nova.network.neutron [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updated VIF entry in instance network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.271 187212 DEBUG nova.network.neutron [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.286 187212 DEBUG oslo_concurrency.lockutils [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.449 187212 DEBUG nova.compute.manager [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.449 187212 DEBUG oslo_concurrency.lockutils [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.450 187212 DEBUG oslo_concurrency.lockutils [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.450 187212 DEBUG oslo_concurrency.lockutils [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.450 187212 DEBUG nova.compute.manager [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Processing event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.450 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance event wait completed in 15 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.475 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935995.4598885, e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.475 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] VM Resumed (Lifecycle Event)
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.476 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.479 187212 INFO nova.virt.libvirt.driver [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance spawned successfully.
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.480 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.499 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.503 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.504 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.504 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.504 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.505 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.505 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.510 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.543 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.571 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Took 60.85 seconds to spawn the instance on the hypervisor.
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.572 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.626 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Took 61.75 seconds to build instance.
Dec 05 11:59:55 compute-0 nova_compute[187208]: 2025-12-05 11:59:55.642 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 61.834s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:56 compute-0 nova_compute[187208]: 2025-12-05 11:59:56.756 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:56 compute-0 nova_compute[187208]: 2025-12-05 11:59:56.756 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:56 compute-0 nova_compute[187208]: 2025-12-05 11:59:56.775 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 11:59:56 compute-0 nova_compute[187208]: 2025-12-05 11:59:56.898 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:56 compute-0 nova_compute[187208]: 2025-12-05 11:59:56.899 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:56 compute-0 nova_compute[187208]: 2025-12-05 11:59:56.905 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 11:59:56 compute-0 nova_compute[187208]: 2025-12-05 11:59:56.906 187212 INFO nova.compute.claims [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Claim successful on node compute-0.ctlplane.example.com
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.153 187212 DEBUG nova.compute.provider_tree [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.172 187212 DEBUG nova.scheduler.client.report [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.200 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.201 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.253 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.253 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.280 187212 INFO nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.299 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.388 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.389 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.390 187212 INFO nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating image(s)
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.391 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.391 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.392 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.407 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.486 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.489 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.491 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.504 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.582 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.584 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.618 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.619 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.619 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.676 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.678 187212 DEBUG nova.virt.disk.api [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.678 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.709 187212 DEBUG nova.policy [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.732 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.733 187212 DEBUG nova.virt.disk.api [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.734 187212 DEBUG nova.objects.instance [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.754 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.755 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Ensure instance console log exists: /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.755 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.756 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:57 compute-0 nova_compute[187208]: 2025-12-05 11:59:57.756 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.109 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.110 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.110 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.110 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.111 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] No waiting events found dispatching network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.111 187212 WARNING nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received unexpected event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 for instance with vm_state active and task_state None.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.111 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.112 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.112 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.112 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.113 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Processing event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.113 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.114 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.114 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.114 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.114 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] No waiting events found dispatching network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.115 187212 WARNING nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received unexpected event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 for instance with vm_state building and task_state spawning.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.115 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.115 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.116 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.116 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.116 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Processing event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.117 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.117 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.117 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.118 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.118 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] No waiting events found dispatching network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.118 187212 WARNING nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received unexpected event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b for instance with vm_state building and task_state spawning.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.120 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.120 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.125 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935998.1242163, b2e8212c-084c-4a4f-b930-56560ae4da12 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.126 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] VM Resumed (Lifecycle Event)
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.129 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.129 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.134 187212 INFO nova.virt.libvirt.driver [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance spawned successfully.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.134 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.138 187212 INFO nova.virt.libvirt.driver [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance spawned successfully.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.139 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.145 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.155 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.167 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.168 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.168 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.169 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.169 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.170 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.175 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.175 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.176 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.176 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.177 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.177 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.182 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.183 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935998.1260316, 04518502-62f1-44c3-8c57-b3404958536f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.183 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] VM Resumed (Lifecycle Event)
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.226 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.228 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 11:59:58 compute-0 podman[214662]: 2025-12-05 11:59:58.243319803 +0000 UTC m=+0.094887741 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.265 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.271 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Took 62.80 seconds to spawn the instance on the hypervisor.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.272 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.279 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Took 63.14 seconds to spawn the instance on the hypervisor.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.279 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.363 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Took 64.37 seconds to build instance.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.371 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Took 64.37 seconds to build instance.
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.386 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 64.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:58 compute-0 nova_compute[187208]: 2025-12-05 11:59:58.388 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 64.496s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 11:59:59 compute-0 nova_compute[187208]: 2025-12-05 11:59:59.027 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Successfully created port: 380c99a7-9480-45f8-b2f4-adfcdfa8576d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:00:00 compute-0 nova_compute[187208]: 2025-12-05 12:00:00.065 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Successfully updated port: 380c99a7-9480-45f8-b2f4-adfcdfa8576d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:00:00 compute-0 nova_compute[187208]: 2025-12-05 12:00:00.084 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:00 compute-0 nova_compute[187208]: 2025-12-05 12:00:00.085 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquired lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:00 compute-0 nova_compute[187208]: 2025-12-05 12:00:00.085 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:00 compute-0 nova_compute[187208]: 2025-12-05 12:00:00.601 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.063 187212 DEBUG nova.compute.manager [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-changed-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.063 187212 DEBUG nova.compute.manager [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Refreshing instance network info cache due to event network-changed-380c99a7-9480-45f8-b2f4-adfcdfa8576d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.064 187212 DEBUG oslo_concurrency.lockutils [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.162 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.163 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.185 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.305 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.306 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.316 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.318 187212 INFO nova.compute.claims [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.601 187212 DEBUG nova.compute.provider_tree [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.618 187212 DEBUG nova.scheduler.client.report [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.642 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.643 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.685 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.686 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.708 187212 INFO nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.727 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.810 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.812 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.812 187212 INFO nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Creating image(s)
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.813 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.813 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.814 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.830 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.909 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.910 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.911 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.923 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.987 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:01 compute-0 nova_compute[187208]: 2025-12-05 12:00:01.988 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.009 187212 DEBUG nova.policy [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.033 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.034 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.035 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.090 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.091 187212 DEBUG nova.virt.disk.api [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.093 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.159 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.167 187212 DEBUG nova.virt.disk.api [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.168 187212 DEBUG nova.objects.instance [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.198 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.200 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Ensure instance console log exists: /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.200 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.201 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.201 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.541 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.542 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.570 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.614 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updating instance_info_cache with network_info: [{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.616 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.634 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Releasing lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.635 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance network_info: |[{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.635 187212 DEBUG oslo_concurrency.lockutils [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.636 187212 DEBUG nova.network.neutron [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Refreshing network info cache for port 380c99a7-9480-45f8-b2f4-adfcdfa8576d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.641 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start _get_guest_xml network_info=[{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.643 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.644 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.652 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.653 187212 INFO nova.compute.claims [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.657 187212 WARNING nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.673 187212 DEBUG nova.virt.libvirt.host [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.676 187212 DEBUG nova.virt.libvirt.host [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.679 187212 DEBUG nova.virt.libvirt.host [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.682 187212 DEBUG nova.virt.libvirt.host [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.682 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.682 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.683 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.683 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.684 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.684 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.684 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.685 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.685 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.685 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.686 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.686 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.690 187212 DEBUG nova.virt.libvirt.vif [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947
304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:59:57Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.691 187212 DEBUG nova.network.os_vif_util [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.692 187212 DEBUG nova.network.os_vif_util [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.693 187212 DEBUG nova.objects.instance [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.712 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <uuid>982a8e69-5181-4847-bdfe-8d4de12bb2e4</uuid>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <name>instance-0000000c</name>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdminTestJSON-server-1785289561</nova:name>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:02</nova:creationTime>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:02 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:02 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:02 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:02 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:02 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:02 compute-0 nova_compute[187208]:         <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec 05 12:00:02 compute-0 nova_compute[187208]:         <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:00:02 compute-0 nova_compute[187208]:         <nova:port uuid="380c99a7-9480-45f8-b2f4-adfcdfa8576d">
Dec 05 12:00:02 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <entry name="serial">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <entry name="uuid">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:24:4f:38"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <target dev="tap380c99a7-94"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log" append="off"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:02 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:02 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:02 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:02 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:02 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.718 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Preparing to wait for external event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.718 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.719 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.719 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.720 187212 DEBUG nova.virt.libvirt.vif [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:59:57Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.720 187212 DEBUG nova.network.os_vif_util [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.721 187212 DEBUG nova.network.os_vif_util [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.722 187212 DEBUG os_vif [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.723 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.723 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.724 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.732 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.733 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap380c99a7-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.734 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap380c99a7-94, col_values=(('external_ids', {'iface-id': '380c99a7-9480-45f8-b2f4-adfcdfa8576d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:4f:38', 'vm-uuid': '982a8e69-5181-4847-bdfe-8d4de12bb2e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:02 compute-0 NetworkManager[55691]: <info>  [1764936002.7363] manager: (tap380c99a7-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.735 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.738 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.748 187212 INFO os_vif [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.805 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.816 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.817 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:24:4f:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.817 187212 INFO nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Using config drive
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.821 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Successfully created port: f194d74d-a9ec-4838-b35d-8393a2087ec5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.952 187212 DEBUG nova.compute.provider_tree [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.968 187212 DEBUG nova.scheduler.client.report [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.991 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:02 compute-0 nova_compute[187208]: 2025-12-05 12:00:02.992 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:00:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:03.006 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:03.006 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:03.007 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.055 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.056 187212 DEBUG nova.network.neutron [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.083 187212 INFO nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.105 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.216 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.217 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.218 187212 INFO nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Creating image(s)
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.218 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.218 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.219 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.232 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.291 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.292 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.293 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.304 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.375 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.376 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.459 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk 1073741824" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.461 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.461 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.526 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.527 187212 DEBUG nova.virt.disk.api [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Checking if we can resize image /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.530 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.600 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.602 187212 DEBUG nova.virt.disk.api [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Cannot resize image /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.602 187212 DEBUG nova.objects.instance [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'migration_context' on Instance uuid 8c58d60e-b997-4eed-8cd4-33ac07d9727a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.626 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.627 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Ensure instance console log exists: /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.627 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.627 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:03 compute-0 nova_compute[187208]: 2025-12-05 12:00:03.628 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.223 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Successfully updated port: f194d74d-a9ec-4838-b35d-8393a2087ec5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.226 187212 INFO nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating config drive at /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.232 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1qu0cyew execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.256 187212 DEBUG nova.network.neutron [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.257 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.261 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.266 187212 WARNING nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.271 187212 DEBUG nova.virt.libvirt.host [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.272 187212 DEBUG nova.virt.libvirt.host [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.276 187212 DEBUG nova.virt.libvirt.host [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.276 187212 DEBUG nova.virt.libvirt.host [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.276 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.277 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.277 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.277 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.278 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.278 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.279 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.279 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.279 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.279 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.280 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.280 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.283 187212 DEBUG nova.objects.instance [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8c58d60e-b997-4eed-8cd4-33ac07d9727a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.286 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.286 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquired lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.287 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:04 compute-0 podman[214715]: 2025-12-05 12:00:04.303240396 +0000 UTC m=+0.064444613 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.312 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <uuid>8c58d60e-b997-4eed-8cd4-33ac07d9727a</uuid>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <name>instance-0000000e</name>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <nova:name>tempest-LiveMigrationNegativeTest-server-1525054618</nova:name>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:04</nova:creationTime>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:04 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:04 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:04 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:04 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:04 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:04 compute-0 nova_compute[187208]:         <nova:user uuid="28407300b110465d9748f60fa4ee4945">tempest-LiveMigrationNegativeTest-1771731310-project-member</nova:user>
Dec 05 12:00:04 compute-0 nova_compute[187208]:         <nova:project uuid="6592a6d983f44d9e94749f0e3e94c689">tempest-LiveMigrationNegativeTest-1771731310</nova:project>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <entry name="serial">8c58d60e-b997-4eed-8cd4-33ac07d9727a</entry>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <entry name="uuid">8c58d60e-b997-4eed-8cd4-33ac07d9727a</entry>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.config"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/console.log" append="off"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:04 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:04 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:04 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:04 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:04 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.359 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1qu0cyew" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.373 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.373 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.374 187212 INFO nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Using config drive
Dec 05 12:00:04 compute-0 NetworkManager[55691]: <info>  [1764936004.4055] manager: (tap380c99a7-94): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Dec 05 12:00:04 compute-0 kernel: tap380c99a7-94: entered promiscuous mode
Dec 05 12:00:04 compute-0 ovn_controller[95610]: 2025-12-05T12:00:04Z|00051|binding|INFO|Claiming lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d for this chassis.
Dec 05 12:00:04 compute-0 ovn_controller[95610]: 2025-12-05T12:00:04Z|00052|binding|INFO|380c99a7-9480-45f8-b2f4-adfcdfa8576d: Claiming fa:16:3e:24:4f:38 10.100.0.13
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.416 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.424 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.425 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.427 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.443 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3e028fba-1834-42f7-878f-6804fe7cdf62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.444 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24c61e5e-71 in ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.448 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24c61e5e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.448 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f05a197c-ec93-4cca-b7f6-53d8a4a12916]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fbae72d1-5a58-4cff-96a5-13413b115286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 systemd-machined[153543]: New machine qemu-12-instance-0000000c.
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:04 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Dec 05 12:00:04 compute-0 ovn_controller[95610]: 2025-12-05T12:00:04Z|00053|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d ovn-installed in OVS
Dec 05 12:00:04 compute-0 ovn_controller[95610]: 2025-12-05T12:00:04Z|00054|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d up in Southbound
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.473 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.477 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[2a518ba3-d129-458e-aeb4-5d58c3666f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 systemd-udevd[214758]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:00:04 compute-0 NetworkManager[55691]: <info>  [1764936004.4898] device (tap380c99a7-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:00:04 compute-0 NetworkManager[55691]: <info>  [1764936004.4907] device (tap380c99a7-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.498 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[27b07ad7-d2f4-4487-86ec-bd235d832564]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.540 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5a6f908e-c48d-4c44-8b69-87d70a016fab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.549 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[331f8097-091d-49e9-8c3a-4f72ea8dbcf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 NetworkManager[55691]: <info>  [1764936004.5506] manager: (tap24c61e5e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Dec 05 12:00:04 compute-0 systemd-udevd[214761]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.590 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f96ae67a-5e47-49f1-a267-636b7c9f700c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.593 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b087cb02-6658-49ca-956c-bc8bc3323fa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.614 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:04 compute-0 NetworkManager[55691]: <info>  [1764936004.6210] device (tap24c61e5e-70): carrier: link connected
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.637 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0e3b0ddc-5d4a-4eeb-a1dd-f7a21685cde2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.672 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0bdda9eb-f2b4-4db6-b61b-2dc5ed5aace9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214789, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.686 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[905b216b-01cc-4e38-98b9-dd0ef67aced1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:ede6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338518, 'tstamp': 338518}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214790, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.701 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e45034cd-7431-4a40-8907-c9f05556a516]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214791, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.731 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[167e386a-3c5e-4e2d-9a93-589280ba971d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.800 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1e556323-eab0-4e1f-afee-7d37489dad07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.802 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:04 compute-0 kernel: tap24c61e5e-70: entered promiscuous mode
Dec 05 12:00:04 compute-0 NetworkManager[55691]: <info>  [1764936004.8056] manager: (tap24c61e5e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.805 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.810 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:04 compute-0 ovn_controller[95610]: 2025-12-05T12:00:04Z|00055|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.812 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.816 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24c61e5e-7d15-4019-b1bd-d2e253f41aa5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24c61e5e-7d15-4019-b1bd-d2e253f41aa5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.817 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9acecc3f-e731-4485-a9f7-1af549f40b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.818 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/24c61e5e-7d15-4019-b1bd-d2e253f41aa5.pid.haproxy
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:00:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.820 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'env', 'PROCESS_TAG=haproxy-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24c61e5e-7d15-4019-b1bd-d2e253f41aa5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.885 187212 INFO nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Creating config drive at /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.config
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.891 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxlk2ui2b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.991 187212 DEBUG nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.992 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.993 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.993 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.994 187212 DEBUG nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Processing event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.994 187212 DEBUG nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.994 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.995 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.996 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.996 187212 DEBUG nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] No waiting events found dispatching network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.996 187212 WARNING nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received unexpected event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a for instance with vm_state building and task_state spawning.
Dec 05 12:00:04 compute-0 nova_compute[187208]: 2025-12-05 12:00:04.997 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.006 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936005.0048325, 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.006 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] VM Resumed (Lifecycle Event)
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.009 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.013 187212 INFO nova.virt.libvirt.driver [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance spawned successfully.
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.013 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.028 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.033 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxlk2ui2b" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.054 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.062 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.063 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.063 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.064 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.064 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.064 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.071 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:05 compute-0 systemd-machined[153543]: New machine qemu-13-instance-0000000e.
Dec 05 12:00:05 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000e.
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.132 187212 INFO nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Took 19.48 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.133 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.199 187212 INFO nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Took 20.09 seconds to build instance.
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.217 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:05 compute-0 podman[214840]: 2025-12-05 12:00:05.249467618 +0000 UTC m=+0.033085054 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:00:05 compute-0 podman[214840]: 2025-12-05 12:00:05.38981317 +0000 UTC m=+0.173430586 container create 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.449 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936005.4494874, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.452 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Started (Lifecycle Event)
Dec 05 12:00:05 compute-0 systemd[1]: Started libpod-conmon-37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0.scope.
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.482 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.488 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936005.4495957, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.489 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Paused (Lifecycle Event)
Dec 05 12:00:05 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.510 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.513 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38bc886dbbbea2769a63b04a9e8180064790337d80822dc7c2e5b30fc62aed96/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.529 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:05 compute-0 podman[214840]: 2025-12-05 12:00:05.532337894 +0000 UTC m=+0.315955340 container init 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 05 12:00:05 compute-0 podman[214840]: 2025-12-05 12:00:05.538127709 +0000 UTC m=+0.321745125 container start 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.561 187212 DEBUG nova.compute.manager [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-changed-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.561 187212 DEBUG nova.compute.manager [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Refreshing instance network info cache due to event network-changed-f194d74d-a9ec-4838-b35d-8393a2087ec5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.561 187212 DEBUG oslo_concurrency.lockutils [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.629 187212 DEBUG nova.network.neutron [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updated VIF entry in instance network info cache for port 380c99a7-9480-45f8-b2f4-adfcdfa8576d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.630 187212 DEBUG nova.network.neutron [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updating instance_info_cache with network_info: [{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.645 187212 DEBUG oslo_concurrency.lockutils [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:05 compute-0 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [NOTICE]   (214873) : New worker (214875) forked
Dec 05 12:00:05 compute-0 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [NOTICE]   (214873) : Loading success.
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.895 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Updating instance_info_cache with network_info: [{"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.920 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Releasing lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.921 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance network_info: |[{"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.921 187212 DEBUG oslo_concurrency.lockutils [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.922 187212 DEBUG nova.network.neutron [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Refreshing network info cache for port f194d74d-a9ec-4838-b35d-8393a2087ec5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.924 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start _get_guest_xml network_info=[{"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.927 187212 WARNING nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.932 187212 DEBUG nova.virt.libvirt.host [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.933 187212 DEBUG nova.virt.libvirt.host [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.939 187212 DEBUG nova.virt.libvirt.host [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.940 187212 DEBUG nova.virt.libvirt.host [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.940 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.940 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.940 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.942 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.942 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.942 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.942 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.946 187212 DEBUG nova.virt.libvirt.vif [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1562123791',display_name='tempest-ServersAdminTestJSON-server-1562123791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1562123791',id=13,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-vj86fqlt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947
304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:01Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.946 187212 DEBUG nova.network.os_vif_util [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.947 187212 DEBUG nova.network.os_vif_util [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.948 187212 DEBUG nova.objects.instance [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.984 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <uuid>3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa</uuid>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <name>instance-0000000d</name>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdminTestJSON-server-1562123791</nova:name>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:05</nova:creationTime>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:05 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:05 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:05 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:05 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:05 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:05 compute-0 nova_compute[187208]:         <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec 05 12:00:05 compute-0 nova_compute[187208]:         <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:00:05 compute-0 nova_compute[187208]:         <nova:port uuid="f194d74d-a9ec-4838-b35d-8393a2087ec5">
Dec 05 12:00:05 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <entry name="serial">3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa</entry>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <entry name="uuid">3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa</entry>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.config"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:d0:fa:14"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <target dev="tapf194d74d-a9"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/console.log" append="off"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:05 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:05 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:05 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:05 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:05 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.985 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Preparing to wait for external event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.985 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.985 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.985 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.986 187212 DEBUG nova.virt.libvirt.vif [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1562123791',display_name='tempest-ServersAdminTestJSON-server-1562123791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1562123791',id=13,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-vj86fqlt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:01Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.986 187212 DEBUG nova.network.os_vif_util [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.987 187212 DEBUG nova.network.os_vif_util [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.987 187212 DEBUG os_vif [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.987 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.988 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.988 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.992 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.992 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf194d74d-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.993 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf194d74d-a9, col_values=(('external_ids', {'iface-id': 'f194d74d-a9ec-4838-b35d-8393a2087ec5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:fa:14', 'vm-uuid': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.994 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:05 compute-0 NetworkManager[55691]: <info>  [1764936005.9954] manager: (tapf194d74d-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Dec 05 12:00:05 compute-0 nova_compute[187208]: 2025-12-05 12:00:05.996 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.003 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.004 187212 INFO os_vif [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9')
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.153 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.153 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.153 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:d0:fa:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.154 187212 INFO nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Using config drive
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.554 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.555 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.555 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936006.5551016, 8c58d60e-b997-4eed-8cd4-33ac07d9727a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.555 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] VM Resumed (Lifecycle Event)
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.559 187212 INFO nova.virt.libvirt.driver [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance spawned successfully.
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.559 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.602 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.605 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.605 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.606 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.606 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.606 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.607 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.612 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.643 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.644 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936006.5555263, 8c58d60e-b997-4eed-8cd4-33ac07d9727a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.644 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] VM Started (Lifecycle Event)
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.672 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.674 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.693 187212 INFO nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Took 3.48 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.693 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.701 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.763 187212 INFO nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Took 4.14 seconds to build instance.
Dec 05 12:00:06 compute-0 nova_compute[187208]: 2025-12-05 12:00:06.792 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.223 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.491 187212 INFO nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Creating config drive at /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.config
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.496 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxuftx3q6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.657 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxuftx3q6" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:07 compute-0 kernel: tapf194d74d-a9: entered promiscuous mode
Dec 05 12:00:07 compute-0 NetworkManager[55691]: <info>  [1764936007.7124] manager: (tapf194d74d-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.712 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:07 compute-0 ovn_controller[95610]: 2025-12-05T12:00:07Z|00056|binding|INFO|Claiming lport f194d74d-a9ec-4838-b35d-8393a2087ec5 for this chassis.
Dec 05 12:00:07 compute-0 ovn_controller[95610]: 2025-12-05T12:00:07Z|00057|binding|INFO|f194d74d-a9ec-4838-b35d-8393a2087ec5: Claiming fa:16:3e:d0:fa:14 10.100.0.14
Dec 05 12:00:07 compute-0 ovn_controller[95610]: 2025-12-05T12:00:07Z|00058|binding|INFO|Setting lport f194d74d-a9ec-4838-b35d-8393a2087ec5 ovn-installed in OVS
Dec 05 12:00:07 compute-0 ovn_controller[95610]: 2025-12-05T12:00:07Z|00059|binding|INFO|Setting lport f194d74d-a9ec-4838-b35d-8393a2087ec5 up in Southbound
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.734 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.736 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.734 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:fa:14 10.100.0.14'], port_security=['fa:16:3e:d0:fa:14 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f194d74d-a9ec-4838-b35d-8393a2087ec5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.740 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f194d74d-a9ec-4838-b35d-8393a2087ec5 in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.744 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:00:07 compute-0 systemd-machined[153543]: New machine qemu-14-instance-0000000d.
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.769 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af48df2a-8001-40bc-ad0b-0834be9f8f88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:07 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-0000000d.
Dec 05 12:00:07 compute-0 systemd-udevd[214933]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:00:07 compute-0 NetworkManager[55691]: <info>  [1764936007.8205] device (tapf194d74d-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:00:07 compute-0 NetworkManager[55691]: <info>  [1764936007.8212] device (tapf194d74d-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.828 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[671469c2-dce3-4090-9b63-c972349804f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.836 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[57f3c232-cc9c-4cd6-86f9-9d0f3dfe84ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.868 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[17044681-77d0-4026-9d2b-54afebc4b21f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.868 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "48f123c5-f925-4f6f-94e5-d109e25ef206" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.868 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.889 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.891 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[addf6894-5e10-4e17-8c03-fd5274a67225]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214946, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.913 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[de1b78eb-1e2d-4ca3-94f8-7ed01ff7d616]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214950, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214950, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.915 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:07 compute-0 nova_compute[187208]: 2025-12-05 12:00:07.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.918 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.919 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.919 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.920 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.089 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.089 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.100 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.101 187212 INFO nova.compute.claims [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.299 187212 DEBUG nova.network.neutron [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Updated VIF entry in instance network info cache for port f194d74d-a9ec-4838-b35d-8393a2087ec5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.299 187212 DEBUG nova.network.neutron [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Updating instance_info_cache with network_info: [{"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.317 187212 DEBUG oslo_concurrency.lockutils [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.378 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936008.3781261, 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.378 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] VM Started (Lifecycle Event)
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.407 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.413 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936008.3800335, 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.414 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] VM Paused (Lifecycle Event)
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.434 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.438 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.457 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.471 187212 DEBUG nova.compute.provider_tree [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.493 187212 DEBUG nova.scheduler.client.report [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.519 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.520 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:00:08 compute-0 ovn_controller[95610]: 2025-12-05T12:00:08Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:76:46 10.1.0.55
Dec 05 12:00:08 compute-0 ovn_controller[95610]: 2025-12-05T12:00:08Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:76:46 10.1.0.55
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.722 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.723 187212 DEBUG nova.network.neutron [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.835 187212 DEBUG nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.835 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.836 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.836 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.836 187212 DEBUG nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Processing event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.837 187212 DEBUG nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.837 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.837 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.837 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.838 187212 DEBUG nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.838 187212 WARNING nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state building and task_state spawning.
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.839 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.844 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.846 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936008.8457544, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.846 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Resumed (Lifecycle Event)
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.866 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance spawned successfully.
Dec 05 12:00:08 compute-0 nova_compute[187208]: 2025-12-05 12:00:08.866 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.197 187212 INFO nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.204 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.204 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.204 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.204 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.205 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.205 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.208 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.210 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.233 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.257 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.309 187212 INFO nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Took 11.92 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.309 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.369 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.370 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.371 187212 INFO nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Creating image(s)
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.372 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.372 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.372 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.394 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.422 187212 INFO nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Took 12.55 seconds to build instance.
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.443 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.512 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.513 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.513 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.524 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.547 187212 DEBUG nova.network.neutron [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.547 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.601 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.601 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.653 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.655 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.655 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.733 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.734 187212 DEBUG nova.virt.disk.api [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Checking if we can resize image /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.735 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.829 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.830 187212 DEBUG nova.virt.disk.api [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Cannot resize image /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.831 187212 DEBUG nova.objects.instance [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lazy-loading 'migration_context' on Instance uuid 48f123c5-f925-4f6f-94e5-d109e25ef206 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.845 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.846 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Ensure instance console log exists: /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.846 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.846 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.847 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.848 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.853 187212 WARNING nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.863 187212 DEBUG nova.virt.libvirt.host [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.864 187212 DEBUG nova.virt.libvirt.host [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.869 187212 DEBUG nova.virt.libvirt.host [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.870 187212 DEBUG nova.virt.libvirt.host [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.871 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.871 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.878 187212 DEBUG nova.objects.instance [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lazy-loading 'pci_devices' on Instance uuid 48f123c5-f925-4f6f-94e5-d109e25ef206 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.897 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <uuid>48f123c5-f925-4f6f-94e5-d109e25ef206</uuid>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <name>instance-0000000f</name>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerDiagnosticsTest-server-464968639</nova:name>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:09</nova:creationTime>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:09 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:09 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:09 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:09 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:09 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:09 compute-0 nova_compute[187208]:         <nova:user uuid="b90e703b69ae4296bdb7708c3a32bb96">tempest-ServerDiagnosticsTest-765338568-project-member</nova:user>
Dec 05 12:00:09 compute-0 nova_compute[187208]:         <nova:project uuid="be4874dd3f38484aa6f1bf8ba69c451f">tempest-ServerDiagnosticsTest-765338568</nova:project>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <entry name="serial">48f123c5-f925-4f6f-94e5-d109e25ef206</entry>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <entry name="uuid">48f123c5-f925-4f6f-94e5-d109e25ef206</entry>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.config"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/console.log" append="off"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:09 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:09 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:09 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:09 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:09 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.941 187212 DEBUG nova.objects.instance [None req-a771051f-9d8e-46d7-8a79-60d4e88701d7 f7c1f6297b534089b496cf7a88d8731e 8dd78283a39d4967be13c14c9c55054a - - default default] Lazy-loading 'pci_devices' on Instance uuid 8c58d60e-b997-4eed-8cd4-33ac07d9727a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.965 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.965 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.966 187212 INFO nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Using config drive
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.981 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936009.9808989, 8c58d60e-b997-4eed-8cd4-33ac07d9727a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.981 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] VM Paused (Lifecycle Event)
Dec 05 12:00:09 compute-0 nova_compute[187208]: 2025-12-05 12:00:09.997 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.003 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.020 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] During sync_power_state the instance has a pending task (suspending). Skip.
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.395 187212 INFO nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Creating config drive at /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.config
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.401 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp44jfpx8h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.532 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp44jfpx8h" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:10 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Dec 05 12:00:10 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000e.scope: Consumed 4.660s CPU time.
Dec 05 12:00:10 compute-0 systemd-machined[153543]: Machine qemu-13-instance-0000000e terminated.
Dec 05 12:00:10 compute-0 systemd-machined[153543]: New machine qemu-15-instance-0000000f.
Dec 05 12:00:10 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-0000000f.
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.760 187212 DEBUG nova.compute.manager [None req-a771051f-9d8e-46d7-8a79-60d4e88701d7 f7c1f6297b534089b496cf7a88d8731e 8dd78283a39d4967be13c14c9c55054a - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.983 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936010.9827046, 48f123c5-f925-4f6f-94e5-d109e25ef206 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.984 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] VM Resumed (Lifecycle Event)
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.986 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.986 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.990 187212 INFO nova.virt.libvirt.driver [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance spawned successfully.
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.990 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:10 compute-0 nova_compute[187208]: 2025-12-05 12:00:10.996 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.034 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.034 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.035 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.035 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.035 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.036 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.039 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.043 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.086 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.086 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936010.9844227, 48f123c5-f925-4f6f-94e5-d109e25ef206 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.086 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] VM Started (Lifecycle Event)
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.133 187212 INFO nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Took 1.76 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.133 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.144 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.147 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.171 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.199 187212 INFO nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Took 3.15 seconds to build instance.
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.216 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.686 187212 DEBUG nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.686 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.686 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.687 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.687 187212 DEBUG nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Processing event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.687 187212 DEBUG nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.687 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.688 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.688 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.688 187212 DEBUG nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] No waiting events found dispatching network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.688 187212 WARNING nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received unexpected event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 for instance with vm_state building and task_state spawning.
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.689 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.698 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936011.6981196, 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.698 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] VM Resumed (Lifecycle Event)
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.701 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.705 187212 INFO nova.virt.libvirt.driver [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance spawned successfully.
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.706 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.723 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.729 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.732 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.732 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.732 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.733 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.733 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.734 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.781 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.831 187212 INFO nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Took 10.02 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.832 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.901 187212 INFO nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Took 10.65 seconds to build instance.
Dec 05 12:00:11 compute-0 nova_compute[187208]: 2025-12-05 12:00:11.923 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.225 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.236 187212 DEBUG nova.compute.manager [None req-b3dadc14-99f6-48a1-b77e-867c9113c725 0a7c1fec28ba47a491ffab0046222160 4b63a617d21d4836b40a81129fab3990 - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.239 187212 INFO nova.compute.manager [None req-b3dadc14-99f6-48a1-b77e-867c9113c725 0a7c1fec28ba47a491ffab0046222160 4b63a617d21d4836b40a81129fab3990 - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Retrieving diagnostics
Dec 05 12:00:12 compute-0 podman[215054]: 2025-12-05 12:00:12.249633016 +0000 UTC m=+0.100046854 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:00:12 compute-0 ovn_controller[95610]: 2025-12-05T12:00:12Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:61:58:b9 10.1.0.8
Dec 05 12:00:12 compute-0 ovn_controller[95610]: 2025-12-05T12:00:12Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:61:58:b9 10.1.0.8
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.618 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.619 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.619 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.619 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.619 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.621 187212 INFO nova.compute.manager [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Terminating instance
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.621 187212 DEBUG nova.compute.manager [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:00:12 compute-0 kernel: tapa5ad03eb-19 (unregistering): left promiscuous mode
Dec 05 12:00:12 compute-0 NetworkManager[55691]: <info>  [1764936012.6497] device (tapa5ad03eb-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:00:12 compute-0 ovn_controller[95610]: 2025-12-05T12:00:12Z|00060|binding|INFO|Releasing lport a5ad03eb-1959-4b2d-a437-979506e6b988 from this chassis (sb_readonly=0)
Dec 05 12:00:12 compute-0 ovn_controller[95610]: 2025-12-05T12:00:12Z|00061|binding|INFO|Setting lport a5ad03eb-1959-4b2d-a437-979506e6b988 down in Southbound
Dec 05 12:00:12 compute-0 ovn_controller[95610]: 2025-12-05T12:00:12Z|00062|binding|INFO|Removing iface tapa5ad03eb-19 ovn-installed in OVS
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.660 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.668 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.668 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b'], port_security=['fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.55/26 fdfe:381f:8400::38b/64', 'neutron:device_id': 'e83b5d7d-04a7-44d9-a6fe-580f1cfa5838', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a5ad03eb-1959-4b2d-a437-979506e6b988) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.670 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a5ad03eb-1959-4b2d-a437-979506e6b988 in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 unbound from our chassis
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.672 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.688 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eca578b3-1caf-4b90-9e12-c6bbe4ba22cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:12 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec 05 12:00:12 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000003.scope: Consumed 14.231s CPU time.
Dec 05 12:00:12 compute-0 systemd-machined[153543]: Machine qemu-7-instance-00000003 terminated.
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.733 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[202e0d91-9d29-44d1-b86e-1784ff4561a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.741 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[35362dd8-27cd-4e49-83af-ddbf06841efa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:12 compute-0 ovn_controller[95610]: 2025-12-05T12:00:12Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:76:3a 10.1.0.6
Dec 05 12:00:12 compute-0 ovn_controller[95610]: 2025-12-05T12:00:12Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:76:3a 10.1.0.6
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.768 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[92f20bcb-b19a-4952-bb54-bac456da9091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.786 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa08e6d0-895a-4af4-99c9-718bc1c56253]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 18, 'tx_packets': 9, 'rx_bytes': 1580, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 18, 'tx_packets': 9, 'rx_bytes': 1580, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 18, 'inoctets': 1328, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 18, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1328, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 18, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215083, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.801 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db2a05b2-f251-47bf-8735-e9809a7b047b]: (4, ({'family': 2, 'prefixlen': 26, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.0.2'], ['IFA_LOCAL', '10.1.0.2'], ['IFA_BROADCAST', '10.1.0.63'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336400, 'tstamp': 336400}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215085, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336403, 'tstamp': 336403}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215085, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.810 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.811 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.811 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.812 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.845 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.889 187212 INFO nova.virt.libvirt.driver [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance destroyed successfully.
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.889 187212 DEBUG nova.objects.instance [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'resources' on Instance uuid e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.901 187212 DEBUG nova.virt.libvirt.vif [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-1',id=3,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T11:59:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_proj
ect_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T11:59:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=e83b5d7d-04a7-44d9-a6fe-580f1cfa5838,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.901 187212 DEBUG nova.network.os_vif_util [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.902 187212 DEBUG nova.network.os_vif_util [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.903 187212 DEBUG os_vif [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.905 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.906 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5ad03eb-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.910 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.912 187212 INFO os_vif [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19')
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.912 187212 INFO nova.virt.libvirt.driver [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Deleting instance files /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838_del
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.913 187212 INFO nova.virt.libvirt.driver [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Deletion of /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838_del complete
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.966 187212 INFO nova.compute.manager [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.966 187212 DEBUG oslo.service.loopingcall [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.967 187212 DEBUG nova.compute.manager [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:00:12 compute-0 nova_compute[187208]: 2025-12-05 12:00:12.967 187212 DEBUG nova.network.neutron [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.114 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "48f123c5-f925-4f6f-94e5-d109e25ef206" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.115 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.115 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "48f123c5-f925-4f6f-94e5-d109e25ef206-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.115 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.116 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.117 187212 INFO nova.compute.manager [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Terminating instance
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.118 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "refresh_cache-48f123c5-f925-4f6f-94e5-d109e25ef206" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.118 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquired lock "refresh_cache-48f123c5-f925-4f6f-94e5-d109e25ef206" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.118 187212 DEBUG nova.network.neutron [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.485 187212 DEBUG nova.network.neutron [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.776 187212 DEBUG nova.network.neutron [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.793 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Releasing lock "refresh_cache-48f123c5-f925-4f6f-94e5-d109e25ef206" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.794 187212 DEBUG nova.compute.manager [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:00:13 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Dec 05 12:00:13 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000000f.scope: Consumed 3.148s CPU time.
Dec 05 12:00:13 compute-0 systemd-machined[153543]: Machine qemu-15-instance-0000000f terminated.
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.938 187212 DEBUG nova.network.neutron [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:13 compute-0 nova_compute[187208]: 2025-12-05 12:00:13.958 187212 INFO nova.compute.manager [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Took 0.99 seconds to deallocate network for instance.
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.011 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.015 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.045 187212 INFO nova.virt.libvirt.driver [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance destroyed successfully.
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.046 187212 DEBUG nova.objects.instance [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lazy-loading 'resources' on Instance uuid 48f123c5-f925-4f6f-94e5-d109e25ef206 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.066 187212 INFO nova.virt.libvirt.driver [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Deleting instance files /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206_del
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.067 187212 INFO nova.virt.libvirt.driver [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Deletion of /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206_del complete
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.122 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:14 compute-0 NetworkManager[55691]: <info>  [1764936014.1244] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/37)
Dec 05 12:00:14 compute-0 NetworkManager[55691]: <info>  [1764936014.1254] device (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 12:00:14 compute-0 NetworkManager[55691]: <info>  [1764936014.1264] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/38)
Dec 05 12:00:14 compute-0 NetworkManager[55691]: <info>  [1764936014.1267] device (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 12:00:14 compute-0 NetworkManager[55691]: <info>  [1764936014.1274] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Dec 05 12:00:14 compute-0 NetworkManager[55691]: <info>  [1764936014.1280] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Dec 05 12:00:14 compute-0 NetworkManager[55691]: <info>  [1764936014.1284] device (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 05 12:00:14 compute-0 NetworkManager[55691]: <info>  [1764936014.1286] device (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.139 187212 INFO nova.compute.manager [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.139 187212 DEBUG oslo.service.loopingcall [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.140 187212 DEBUG nova.compute.manager [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.140 187212 DEBUG nova.network.neutron [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:14 compute-0 ovn_controller[95610]: 2025-12-05T12:00:14Z|00063|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:00:14 compute-0 ovn_controller[95610]: 2025-12-05T12:00:14Z|00064|binding|INFO|Releasing lport 4248cb8a-d980-4682-8c47-d6faac0a26bc from this chassis (sb_readonly=0)
Dec 05 12:00:14 compute-0 ovn_controller[95610]: 2025-12-05T12:00:14Z|00065|binding|INFO|Releasing lport 79bf1a96-6e90-41b7-8356-9756185de59f from this chassis (sb_readonly=0)
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.201 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.292 187212 DEBUG nova.compute.provider_tree [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.308 187212 DEBUG nova.scheduler.client.report [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.343 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.372 187212 INFO nova.scheduler.client.report [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Deleted allocations for instance e83b5d7d-04a7-44d9-a6fe-580f1cfa5838
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.391 187212 DEBUG nova.network.neutron [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.423 187212 DEBUG nova.network.neutron [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.451 187212 INFO nova.compute.manager [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Took 0.31 seconds to deallocate network for instance.
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.458 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.495 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.496 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.701 187212 DEBUG nova.compute.provider_tree [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.718 187212 DEBUG nova.scheduler.client.report [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.750 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.780 187212 INFO nova.scheduler.client.report [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Deleted allocations for instance 48f123c5-f925-4f6f-94e5-d109e25ef206
Dec 05 12:00:14 compute-0 nova_compute[187208]: 2025-12-05 12:00:14.857 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:15 compute-0 nova_compute[187208]: 2025-12-05 12:00:15.890 187212 DEBUG nova.compute.manager [req-4d496ded-9450-40f6-9b12-d759fb051ee8 req-0deaac7f-360a-4128-9959-7bc942d1ed29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received event network-vif-deleted-a5ad03eb-1959-4b2d-a437-979506e6b988 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:16 compute-0 nova_compute[187208]: 2025-12-05 12:00:16.708 187212 DEBUG nova.compute.manager [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:16 compute-0 nova_compute[187208]: 2025-12-05 12:00:16.708 187212 DEBUG nova.compute.manager [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing instance network info cache due to event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:00:16 compute-0 nova_compute[187208]: 2025-12-05 12:00:16.708 187212 DEBUG oslo_concurrency.lockutils [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:16 compute-0 nova_compute[187208]: 2025-12-05 12:00:16.709 187212 DEBUG oslo_concurrency.lockutils [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:16 compute-0 nova_compute[187208]: 2025-12-05 12:00:16.709 187212 DEBUG nova.network.neutron [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.214 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.214 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.215 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.215 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.215 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.216 187212 INFO nova.compute.manager [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Terminating instance
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.217 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "refresh_cache-8c58d60e-b997-4eed-8cd4-33ac07d9727a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.218 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquired lock "refresh_cache-8c58d60e-b997-4eed-8cd4-33ac07d9727a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.218 187212 DEBUG nova.network.neutron [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:17 compute-0 podman[215112]: 2025-12-05 12:00:17.219107922 +0000 UTC m=+0.061901286 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.227 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.365 187212 DEBUG nova.network.neutron [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.769 187212 DEBUG nova.network.neutron [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.793 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Releasing lock "refresh_cache-8c58d60e-b997-4eed-8cd4-33ac07d9727a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.794 187212 DEBUG nova.compute.manager [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.801 187212 INFO nova.virt.libvirt.driver [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance destroyed successfully.
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.801 187212 DEBUG nova.objects.instance [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'resources' on Instance uuid 8c58d60e-b997-4eed-8cd4-33ac07d9727a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.814 187212 INFO nova.virt.libvirt.driver [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Deleting instance files /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a_del
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.815 187212 INFO nova.virt.libvirt.driver [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Deletion of /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a_del complete
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.909 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.930 187212 INFO nova.compute.manager [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Took 0.14 seconds to destroy the instance on the hypervisor.
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.931 187212 DEBUG oslo.service.loopingcall [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.931 187212 DEBUG nova.compute.manager [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:00:17 compute-0 nova_compute[187208]: 2025-12-05 12:00:17.932 187212 DEBUG nova.network.neutron [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.105 187212 DEBUG nova.network.neutron [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.359 187212 DEBUG nova.network.neutron [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.372 187212 INFO nova.compute.manager [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Took 0.44 seconds to deallocate network for instance.
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.417 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.418 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.610 187212 DEBUG nova.compute.provider_tree [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.684 187212 DEBUG nova.scheduler.client.report [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.711 187212 DEBUG nova.network.neutron [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updated VIF entry in instance network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.711 187212 DEBUG nova.network.neutron [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.752 187212 DEBUG oslo_concurrency.lockutils [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.755 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.759 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.759 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.760 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.760 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.760 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.762 187212 INFO nova.compute.manager [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Terminating instance
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.763 187212 DEBUG nova.compute.manager [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:00:18 compute-0 kernel: tap06886ab7-aa (unregistering): left promiscuous mode
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.790 187212 INFO nova.scheduler.client.report [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Deleted allocations for instance 8c58d60e-b997-4eed-8cd4-33ac07d9727a
Dec 05 12:00:18 compute-0 NetworkManager[55691]: <info>  [1764936018.8086] device (tap06886ab7-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:00:18 compute-0 ovn_controller[95610]: 2025-12-05T12:00:18Z|00066|binding|INFO|Releasing lport 06886ab7-aa74-4f44-b509-94e27d585818 from this chassis (sb_readonly=0)
Dec 05 12:00:18 compute-0 ovn_controller[95610]: 2025-12-05T12:00:18Z|00067|binding|INFO|Setting lport 06886ab7-aa74-4f44-b509-94e27d585818 down in Southbound
Dec 05 12:00:18 compute-0 ovn_controller[95610]: 2025-12-05T12:00:18Z|00068|binding|INFO|Removing iface tap06886ab7-aa ovn-installed in OVS
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.815 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.827 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241'], port_security=['fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.8/26 fdfe:381f:8400::241/64', 'neutron:device_id': '04518502-62f1-44c3-8c57-b3404958536f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=06886ab7-aa74-4f44-b509-94e27d585818) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.828 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 06886ab7-aa74-4f44-b509-94e27d585818 in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 unbound from our chassis
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.830 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.831 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.855 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd5a96d-4f2d-4e7f-bc5f-1bb3b89b1ea2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.866 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:18 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec 05 12:00:18 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000005.scope: Consumed 13.034s CPU time.
Dec 05 12:00:18 compute-0 systemd-machined[153543]: Machine qemu-8-instance-00000005 terminated.
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.896 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[346b9479-b062-49a9-933d-2c9d85c1ecf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.901 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8a8485-ac1e-44a1-80cb-71d6b95b3f7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:18 compute-0 podman[215152]: 2025-12-05 12:00:18.931000644 +0000 UTC m=+0.089342869 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public)
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.936 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce6a7aa-5d58-4d0d-85d8-4cc3df3aace9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.953 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[718e748f-3adf-4b8e-9af7-3656beb3f0a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 11, 'rx_bytes': 2304, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 11, 'rx_bytes': 2304, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 26, 'inoctets': 1856, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 26, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1856, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 26, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215181, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.974 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e22423eb-6896-4b83-8b46-299274ef57bb]: (4, ({'family': 2, 'prefixlen': 26, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.0.2'], ['IFA_LOCAL', '10.1.0.2'], ['IFA_BROADCAST', '10.1.0.63'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336400, 'tstamp': 336400}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215182, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336403, 'tstamp': 336403}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215182, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.976 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.977 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:18 compute-0 nova_compute[187208]: 2025-12-05 12:00:18.981 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.982 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.982 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.982 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.983 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.041 187212 INFO nova.virt.libvirt.driver [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance destroyed successfully.
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.041 187212 DEBUG nova.objects.instance [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'resources' on Instance uuid 04518502-62f1-44c3-8c57-b3404958536f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.092 187212 DEBUG nova.virt.libvirt.vif [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-2',id=5,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-12-05T11:59:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_proj
ect_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T11:59:58Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=04518502-62f1-44c3-8c57-b3404958536f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.093 187212 DEBUG nova.network.os_vif_util [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.094 187212 DEBUG nova.network.os_vif_util [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.094 187212 DEBUG os_vif [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.095 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.096 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06886ab7-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.098 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.102 187212 INFO os_vif [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa')
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.103 187212 INFO nova.virt.libvirt.driver [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Deleting instance files /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f_del
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.103 187212 INFO nova.virt.libvirt.driver [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Deletion of /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f_del complete
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.169 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.170 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.189 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.195 187212 INFO nova.compute.manager [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Took 0.43 seconds to destroy the instance on the hypervisor.
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.196 187212 DEBUG oslo.service.loopingcall [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.196 187212 DEBUG nova.compute.manager [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.196 187212 DEBUG nova.network.neutron [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.294 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.294 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.302 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.302 187212 INFO nova.compute.claims [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.540 187212 DEBUG nova.compute.provider_tree [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.556 187212 DEBUG nova.scheduler.client.report [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.578 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.579 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:00:19 compute-0 ovn_controller[95610]: 2025-12-05T12:00:19Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f5:93:9d 10.100.0.7
Dec 05 12:00:19 compute-0 ovn_controller[95610]: 2025-12-05T12:00:19Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f5:93:9d 10.100.0.7
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.621 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.622 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.643 187212 INFO nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.660 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.746 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.748 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.749 187212 INFO nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Creating image(s)
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.751 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.753 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.754 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.774 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.844 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.845 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.846 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.860 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.883 187212 DEBUG nova.policy [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.921 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.922 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.956 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.957 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:19 compute-0 nova_compute[187208]: 2025-12-05 12:00:19.958 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.019 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.020 187212 DEBUG nova.virt.disk.api [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.021 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.078 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.079 187212 DEBUG nova.virt.disk.api [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.079 187212 DEBUG nova.objects.instance [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e7aec76-673e-48b5-b183-cc9c7a95fd37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.093 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.094 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Ensure instance console log exists: /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.095 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.097 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.097 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.518 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Successfully created port: 75a214ef-2b9f-4c81-bdad-de5791244b85 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.546 187212 DEBUG nova.network.neutron [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.567 187212 INFO nova.compute.manager [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Took 1.37 seconds to deallocate network for instance.
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.582 187212 DEBUG nova.compute.manager [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-unplugged-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.583 187212 DEBUG oslo_concurrency.lockutils [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.583 187212 DEBUG oslo_concurrency.lockutils [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.583 187212 DEBUG oslo_concurrency.lockutils [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.583 187212 DEBUG nova.compute.manager [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] No waiting events found dispatching network-vif-unplugged-06886ab7-aa74-4f44-b509-94e27d585818 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.584 187212 DEBUG nova.compute.manager [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-unplugged-06886ab7-aa74-4f44-b509-94e27d585818 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.620 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.621 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.779 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.782 187212 INFO nova.compute.manager [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Terminating instance
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.783 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "refresh_cache-5150eaf5-c0ca-48ab-9045-af5a1c785c8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.783 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquired lock "refresh_cache-5150eaf5-c0ca-48ab-9045-af5a1c785c8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.783 187212 DEBUG nova.network.neutron [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.819 187212 DEBUG nova.compute.provider_tree [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.836 187212 DEBUG nova.scheduler.client.report [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:20 compute-0 nova_compute[187208]: 2025-12-05 12:00:20.857 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:21 compute-0 nova_compute[187208]: 2025-12-05 12:00:21.052 187212 DEBUG nova.network.neutron [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:21 compute-0 nova_compute[187208]: 2025-12-05 12:00:21.123 187212 INFO nova.scheduler.client.report [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Deleted allocations for instance 04518502-62f1-44c3-8c57-b3404958536f
Dec 05 12:00:21 compute-0 nova_compute[187208]: 2025-12-05 12:00:21.381 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:21 compute-0 nova_compute[187208]: 2025-12-05 12:00:21.632 187212 DEBUG nova.network.neutron [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:21 compute-0 nova_compute[187208]: 2025-12-05 12:00:21.652 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Releasing lock "refresh_cache-5150eaf5-c0ca-48ab-9045-af5a1c785c8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:21 compute-0 nova_compute[187208]: 2025-12-05 12:00:21.653 187212 DEBUG nova.compute.manager [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:00:21 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec 05 12:00:21 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Consumed 14.535s CPU time.
Dec 05 12:00:21 compute-0 systemd-machined[153543]: Machine qemu-10-instance-0000000b terminated.
Dec 05 12:00:21 compute-0 nova_compute[187208]: 2025-12-05 12:00:21.893 187212 INFO nova.virt.libvirt.driver [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance destroyed successfully.
Dec 05 12:00:21 compute-0 nova_compute[187208]: 2025-12-05 12:00:21.894 187212 DEBUG nova.objects.instance [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'resources' on Instance uuid 5150eaf5-c0ca-48ab-9045-af5a1c785c8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:21 compute-0 nova_compute[187208]: 2025-12-05 12:00:21.990 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Successfully updated port: 75a214ef-2b9f-4c81-bdad-de5791244b85 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.019 187212 INFO nova.virt.libvirt.driver [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Deleting instance files /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e_del
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.020 187212 INFO nova.virt.libvirt.driver [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Deletion of /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e_del complete
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.035 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.036 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquired lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.036 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.163 187212 INFO nova.compute.manager [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Took 0.51 seconds to destroy the instance on the hypervisor.
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.165 187212 DEBUG oslo.service.loopingcall [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.165 187212 DEBUG nova.compute.manager [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.165 187212 DEBUG nova.network.neutron [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.229 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.248 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.380 187212 DEBUG nova.network.neutron [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.420 187212 DEBUG nova.network.neutron [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.452 187212 INFO nova.compute.manager [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Took 0.29 seconds to deallocate network for instance.
Dec 05 12:00:22 compute-0 ovn_controller[95610]: 2025-12-05T12:00:22Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:4f:38 10.100.0.13
Dec 05 12:00:22 compute-0 ovn_controller[95610]: 2025-12-05T12:00:22Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:4f:38 10.100.0.13
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.588 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.588 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.731 187212 DEBUG nova.compute.provider_tree [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.747 187212 DEBUG nova.scheduler.client.report [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.768 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.791 187212 INFO nova.scheduler.client.report [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Deleted allocations for instance 5150eaf5-c0ca-48ab-9045-af5a1c785c8e
Dec 05 12:00:22 compute-0 nova_compute[187208]: 2025-12-05 12:00:22.881 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:23 compute-0 podman[215231]: 2025-12-05 12:00:23.205797862 +0000 UTC m=+0.053328711 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 12:00:23 compute-0 podman[215232]: 2025-12-05 12:00:23.23795842 +0000 UTC m=+0.083464591 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.251 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Updating instance_info_cache with network_info: [{"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.269 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Releasing lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.269 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance network_info: |[{"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.271 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start _get_guest_xml network_info=[{"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.275 187212 WARNING nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.279 187212 DEBUG nova.virt.libvirt.host [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.280 187212 DEBUG nova.virt.libvirt.host [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.284 187212 DEBUG nova.virt.libvirt.host [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.284 187212 DEBUG nova.virt.libvirt.host [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.285 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.285 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.285 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.286 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.286 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.286 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.286 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.291 187212 DEBUG nova.virt.libvirt.vif [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-720093205',display_name='tempest-ServersAdminTestJSON-server-720093205',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-720093205',id=16,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-00wbi3mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:19Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=4e7aec76-673e-48b5-b183-cc9c7a95fd37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.291 187212 DEBUG nova.network.os_vif_util [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.292 187212 DEBUG nova.network.os_vif_util [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.293 187212 DEBUG nova.objects.instance [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e7aec76-673e-48b5-b183-cc9c7a95fd37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG nova.compute.manager [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-changed-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG nova.compute.manager [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Refreshing instance network info cache due to event network-changed-75a214ef-2b9f-4c81-bdad-de5791244b85. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG oslo_concurrency.lockutils [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG oslo_concurrency.lockutils [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG nova.network.neutron [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Refreshing network info cache for port 75a214ef-2b9f-4c81-bdad-de5791244b85 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.314 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <uuid>4e7aec76-673e-48b5-b183-cc9c7a95fd37</uuid>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <name>instance-00000010</name>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdminTestJSON-server-720093205</nova:name>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:23</nova:creationTime>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:23 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:23 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:23 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:23 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:23 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:23 compute-0 nova_compute[187208]:         <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec 05 12:00:23 compute-0 nova_compute[187208]:         <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:00:23 compute-0 nova_compute[187208]:         <nova:port uuid="75a214ef-2b9f-4c81-bdad-de5791244b85">
Dec 05 12:00:23 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <entry name="serial">4e7aec76-673e-48b5-b183-cc9c7a95fd37</entry>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <entry name="uuid">4e7aec76-673e-48b5-b183-cc9c7a95fd37</entry>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.config"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:d9:46:fb"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <target dev="tap75a214ef-2b"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/console.log" append="off"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:23 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:23 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:23 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:23 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:23 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.315 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Preparing to wait for external event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.315 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.315 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.315 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.316 187212 DEBUG nova.virt.libvirt.vif [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-720093205',display_name='tempest-ServersAdminTestJSON-server-720093205',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-720093205',id=16,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-00wbi3mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON
-715947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:19Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=4e7aec76-673e-48b5-b183-cc9c7a95fd37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.316 187212 DEBUG nova.network.os_vif_util [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.317 187212 DEBUG nova.network.os_vif_util [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.317 187212 DEBUG os_vif [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.317 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.318 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.322 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.322 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75a214ef-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.322 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap75a214ef-2b, col_values=(('external_ids', {'iface-id': '75a214ef-2b9f-4c81-bdad-de5791244b85', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:46:fb', 'vm-uuid': '4e7aec76-673e-48b5-b183-cc9c7a95fd37'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.323 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:23 compute-0 NetworkManager[55691]: <info>  [1764936023.3248] manager: (tap75a214ef-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.326 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.329 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.329 187212 INFO os_vif [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b')
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.406 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.407 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.407 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:d9:46:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.407 187212 INFO nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Using config drive
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.625 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.625 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.648 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.717 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.718 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.727 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.727 187212 INFO nova.compute.claims [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.883 187212 INFO nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Creating config drive at /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.config
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.889 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3mc2egvw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.943 187212 DEBUG nova.compute.provider_tree [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.964 187212 DEBUG nova.scheduler.client.report [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.992 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:23 compute-0 nova_compute[187208]: 2025-12-05 12:00:23.993 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.015 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3mc2egvw" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.056 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.057 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:00:24 compute-0 kernel: tap75a214ef-2b: entered promiscuous mode
Dec 05 12:00:24 compute-0 NetworkManager[55691]: <info>  [1764936024.0741] manager: (tap75a214ef-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.137 187212 INFO nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:00:24 compute-0 ovn_controller[95610]: 2025-12-05T12:00:24Z|00069|binding|INFO|Claiming lport 75a214ef-2b9f-4c81-bdad-de5791244b85 for this chassis.
Dec 05 12:00:24 compute-0 ovn_controller[95610]: 2025-12-05T12:00:24Z|00070|binding|INFO|75a214ef-2b9f-4c81-bdad-de5791244b85: Claiming fa:16:3e:d9:46:fb 10.100.0.5
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.140 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.144 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:46:fb 10.100.0.5'], port_security=['fa:16:3e:d9:46:fb 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=75a214ef-2b9f-4c81-bdad-de5791244b85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.145 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 75a214ef-2b9f-4c81-bdad-de5791244b85 in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.147 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:00:24 compute-0 ovn_controller[95610]: 2025-12-05T12:00:24Z|00071|binding|INFO|Setting lport 75a214ef-2b9f-4c81-bdad-de5791244b85 ovn-installed in OVS
Dec 05 12:00:24 compute-0 ovn_controller[95610]: 2025-12-05T12:00:24Z|00072|binding|INFO|Setting lport 75a214ef-2b9f-4c81-bdad-de5791244b85 up in Southbound
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.162 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.164 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6058b3e2-1e8c-4fff-a777-5cc57e951db8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:24 compute-0 systemd-machined[153543]: New machine qemu-16-instance-00000010.
Dec 05 12:00:24 compute-0 systemd-udevd[215320]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:00:24 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000010.
Dec 05 12:00:24 compute-0 NetworkManager[55691]: <info>  [1764936024.1891] device (tap75a214ef-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:00:24 compute-0 NetworkManager[55691]: <info>  [1764936024.1909] device (tap75a214ef-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.195 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[db02f6d0-04e1-4db1-a3a0-e09187c6c966]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.198 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a89eadf8-a4c1-4f1b-a1f6-ca77481e74a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.227 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4056c9-c4aa-4e06-82df-7819d195deed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.245 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b0565f3d-2206-4af3-a770-190d8180a4bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215332, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.263 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a3120b14-99a1-485b-a02e-16aa927c68fa]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215334, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215334, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.265 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.266 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.268 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.268 187212 INFO nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Creating image(s)
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.268 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.269 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.269 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.269 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.270 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.270 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.270 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.285 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.286 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.314 187212 DEBUG nova.compute.manager [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.314 187212 DEBUG oslo_concurrency.lockutils [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 DEBUG oslo_concurrency.lockutils [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 DEBUG oslo_concurrency.lockutils [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 DEBUG nova.compute.manager [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] No waiting events found dispatching network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 WARNING nova.compute.manager [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received unexpected event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 for instance with vm_state deleted and task_state None.
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 DEBUG nova.compute.manager [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-deleted-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.330 187212 DEBUG nova.policy [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.343 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.343 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.344 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.356 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.420 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.422 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.486 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936024.4862533, 4e7aec76-673e-48b5-b183-cc9c7a95fd37 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.487 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] VM Started (Lifecycle Event)
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.508 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.512 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936024.4910607, 4e7aec76-673e-48b5-b183-cc9c7a95fd37 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.513 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] VM Paused (Lifecycle Event)
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.539 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.541 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.576 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.731 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.731 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.732 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.732 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.732 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.734 187212 INFO nova.compute.manager [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Terminating instance
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.735 187212 DEBUG nova.compute.manager [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:00:24 compute-0 kernel: tap0d1b5558-65 (unregistering): left promiscuous mode
Dec 05 12:00:24 compute-0 NetworkManager[55691]: <info>  [1764936024.7563] device (tap0d1b5558-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:00:24 compute-0 ovn_controller[95610]: 2025-12-05T12:00:24Z|00073|binding|INFO|Releasing lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b from this chassis (sb_readonly=0)
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.761 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:24 compute-0 ovn_controller[95610]: 2025-12-05T12:00:24Z|00074|binding|INFO|Setting lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b down in Southbound
Dec 05 12:00:24 compute-0 ovn_controller[95610]: 2025-12-05T12:00:24Z|00075|binding|INFO|Removing iface tap0d1b5558-65 ovn-installed in OVS
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.764 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.769 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100'], port_security=['fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.6/26 fdfe:381f:8400::100/64', 'neutron:device_id': 'b2e8212c-084c-4a4f-b930-56560ae4da12', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0d1b5558-6557-43e9-8cac-a00b4e97ea8b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.770 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0d1b5558-6557-43e9-8cac-a00b4e97ea8b in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 unbound from our chassis
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.772 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca5a0748-2268-4f31-a673-9ef2606c4273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.773 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[748533e2-e0df-4073-86f2-188794e5c4e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.773 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 namespace which is not needed anymore
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.789 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:24 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec 05 12:00:24 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000006.scope: Consumed 14.687s CPU time.
Dec 05 12:00:24 compute-0 systemd-machined[153543]: Machine qemu-9-instance-00000006 terminated.
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.847 187212 DEBUG nova.network.neutron [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Updated VIF entry in instance network info cache for port 75a214ef-2b9f-4c81-bdad-de5791244b85. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.848 187212 DEBUG nova.network.neutron [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Updating instance_info_cache with network_info: [{"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:24 compute-0 nova_compute[187208]: 2025-12-05 12:00:24.875 187212 DEBUG oslo_concurrency.lockutils [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.000 187212 INFO nova.virt.libvirt.driver [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance destroyed successfully.
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.000 187212 DEBUG nova.objects.instance [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'resources' on Instance uuid b2e8212c-084c-4a4f-b930-56560ae4da12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.015 187212 DEBUG nova.virt.libvirt.vif [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-3',id=6,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-12-05T11:59:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T11:59:58Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=b2e8212c-084c-4a4f-b930-56560ae4da12,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.015 187212 DEBUG nova.network.os_vif_util [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.016 187212 DEBUG nova.network.os_vif_util [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.017 187212 DEBUG os_vif [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.018 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.019 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d1b5558-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.023 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.025 187212 INFO os_vif [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65')
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.026 187212 INFO nova.virt.libvirt.driver [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Deleting instance files /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12_del
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.026 187212 INFO nova.virt.libvirt.driver [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Deletion of /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12_del complete
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.074 187212 INFO nova.compute.manager [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.075 187212 DEBUG oslo.service.loopingcall [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.075 187212 DEBUG nova.compute.manager [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.075 187212 DEBUG nova.network.neutron [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:00:25 compute-0 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [NOTICE]   (214277) : haproxy version is 2.8.14-c23fe91
Dec 05 12:00:25 compute-0 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [NOTICE]   (214277) : path to executable is /usr/sbin/haproxy
Dec 05 12:00:25 compute-0 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [WARNING]  (214277) : Exiting Master process...
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.171 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk 1073741824" returned: 0 in 0.749s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:25 compute-0 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [ALERT]    (214277) : Current worker (214279) exited with code 143 (Terminated)
Dec 05 12:00:25 compute-0 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [WARNING]  (214277) : All workers exited. Exiting... (0)
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.172 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.173 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:25 compute-0 systemd[1]: libpod-99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31.scope: Deactivated successfully.
Dec 05 12:00:25 compute-0 podman[215370]: 2025-12-05 12:00:25.180848987 +0000 UTC m=+0.298921314 container died 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:00:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31-userdata-shm.mount: Deactivated successfully.
Dec 05 12:00:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-77b012e43fc6df7609492b693cc8452628271339171d87e515263e44dc855891-merged.mount: Deactivated successfully.
Dec 05 12:00:25 compute-0 podman[215370]: 2025-12-05 12:00:25.224501212 +0000 UTC m=+0.342573539 container cleanup 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:00:25 compute-0 systemd[1]: libpod-conmon-99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31.scope: Deactivated successfully.
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.244 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.245 187212 DEBUG nova.virt.disk.api [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Checking if we can resize image /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.245 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:25 compute-0 podman[215417]: 2025-12-05 12:00:25.289595068 +0000 UTC m=+0.042749200 container remove 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:00:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.295 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7b0414-9669-4147-a119-d45322cd9bdb]: (4, ('Fri Dec  5 12:00:24 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 (99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31)\n99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31\nFri Dec  5 12:00:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 (99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31)\n99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.297 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e8e07b-4d8e-4922-8d4b-651e67212476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.298 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:25 compute-0 kernel: tapca5a0748-20: left promiscuous mode
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.300 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.306 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.306 187212 DEBUG nova.virt.disk.api [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Cannot resize image /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.306 187212 DEBUG nova.objects.instance [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lazy-loading 'migration_context' on Instance uuid bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.311 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf61bf0-2eaf-4d0a-bce5-19a92d1c7506]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.324 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.324 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Ensure instance console log exists: /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.325 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.325 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.325 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.328 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9663e937-ec5b-457a-9179-965484e70ca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.330 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5de29532-7e75-476e-ada1-8b087c8be600]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.344 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db62b7fe-4687-4c09-8a7c-cdf92f83a25f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336377, 'reachable_time': 23474, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215434, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.357 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:00:25 compute-0 systemd[1]: run-netns-ovnmeta\x2dca5a0748\x2d2268\x2d4f31\x2da673\x2d9ef2606c4273.mount: Deactivated successfully.
Dec 05 12:00:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.358 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[83441355-1a82-4721-b076-88191f048d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:25 compute-0 ovn_controller[95610]: 2025-12-05T12:00:25Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d0:fa:14 10.100.0.14
Dec 05 12:00:25 compute-0 ovn_controller[95610]: 2025-12-05T12:00:25Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d0:fa:14 10.100.0.14
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.762 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936010.76137, 8c58d60e-b997-4eed-8cd4-33ac07d9727a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.763 187212 INFO nova.compute.manager [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] VM Stopped (Lifecycle Event)
Dec 05 12:00:25 compute-0 nova_compute[187208]: 2025-12-05 12:00:25.787 187212 DEBUG nova.compute.manager [None req-345c7b66-fc81-463b-bfb6-41120eb76685 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.024 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Successfully created port: e56fa29b-453e-4140-997d-96c0de8ed4bb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.027 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "52d63666-4caa-4eaa-9128-6e21189b0932" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.027 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.047 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.112 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.113 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.121 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.121 187212 INFO nova.compute.claims [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.250 187212 DEBUG nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.250 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Processing event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.252 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.252 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.252 187212 DEBUG nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] No waiting events found dispatching network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.252 187212 WARNING nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received unexpected event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 for instance with vm_state building and task_state spawning.
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.253 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.256 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936026.2559118, 4e7aec76-673e-48b5-b183-cc9c7a95fd37 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.256 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] VM Resumed (Lifecycle Event)
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.257 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.260 187212 INFO nova.virt.libvirt.driver [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance spawned successfully.
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.260 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.284 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.290 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.294 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.295 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.295 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.296 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.296 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.297 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.323 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.343 187212 DEBUG nova.compute.provider_tree [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.360 187212 INFO nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Took 6.61 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.360 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.363 187212 DEBUG nova.scheduler.client.report [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.405 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.406 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.451 187212 INFO nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Took 7.19 seconds to build instance.
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.456 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.484 187212 INFO nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.488 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.507 187212 DEBUG nova.network.neutron [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.509 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.539 187212 INFO nova.compute.manager [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Took 1.46 seconds to deallocate network for instance.
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.605 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.605 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.608 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.610 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.610 187212 INFO nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating image(s)
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.611 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.611 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.612 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.631 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.697 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.698 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.699 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.709 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.767 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.768 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.806 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.807 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.808 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.867 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.868 187212 DEBUG nova.virt.disk.api [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Checking if we can resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.869 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.890 187212 DEBUG nova.compute.provider_tree [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.909 187212 DEBUG nova.scheduler.client.report [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.927 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.928 187212 DEBUG nova.virt.disk.api [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Cannot resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.929 187212 DEBUG nova.objects.instance [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'migration_context' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.935 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.941 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.942 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Ensure instance console log exists: /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.942 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.943 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.943 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.945 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.949 187212 WARNING nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.955 187212 DEBUG nova.virt.libvirt.host [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.956 187212 DEBUG nova.virt.libvirt.host [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.961 187212 DEBUG nova.virt.libvirt.host [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.962 187212 DEBUG nova.virt.libvirt.host [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.962 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.962 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.963 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.963 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.963 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.964 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.964 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.964 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.964 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.965 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.965 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.965 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.969 187212 DEBUG nova.objects.instance [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'pci_devices' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.971 187212 INFO nova.scheduler.client.report [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Deleted allocations for instance b2e8212c-084c-4a4f-b930-56560ae4da12
Dec 05 12:00:26 compute-0 nova_compute[187208]: 2025-12-05 12:00:26.991 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <uuid>52d63666-4caa-4eaa-9128-6e21189b0932</uuid>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <name>instance-00000012</name>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdmin275Test-server-1823558123</nova:name>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:26</nova:creationTime>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:26 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:26 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:26 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:26 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:26 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:26 compute-0 nova_compute[187208]:         <nova:user uuid="3a90749503e34bda87974b2c22626de0">tempest-ServersAdmin275Test-1624449796-project-member</nova:user>
Dec 05 12:00:26 compute-0 nova_compute[187208]:         <nova:project uuid="6d28e47b844b47238fb8386dae6c546e">tempest-ServersAdmin275Test-1624449796</nova:project>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <entry name="serial">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <entry name="uuid">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log" append="off"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:26 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:26 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:26 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:26 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:26 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.140 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.142 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Successfully updated port: e56fa29b-453e-4140-997d-96c0de8ed4bb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.168 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.168 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquired lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.168 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.171 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.171 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.172 187212 INFO nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Using config drive
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.342 187212 INFO nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating config drive at /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.348 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2zxw75vb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.371 187212 DEBUG nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-unplugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.372 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.372 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 DEBUG nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] No waiting events found dispatching network-vif-unplugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 WARNING nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received unexpected event network-vif-unplugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b for instance with vm_state deleted and task_state None.
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 DEBUG nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.374 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.374 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.374 187212 DEBUG nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] No waiting events found dispatching network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.374 187212 WARNING nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received unexpected event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b for instance with vm_state deleted and task_state None.
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.376 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.476 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2zxw75vb" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:27 compute-0 systemd-machined[153543]: New machine qemu-17-instance-00000012.
Dec 05 12:00:27 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000012.
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.882 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936012.8819108, e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.883 187212 INFO nova.compute.manager [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] VM Stopped (Lifecycle Event)
Dec 05 12:00:27 compute-0 nova_compute[187208]: 2025-12-05 12:00:27.905 187212 DEBUG nova.compute.manager [None req-ee5eaabd-82f7-492e-82ad-29c7dedcfaf4 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.031 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936028.0302062, 52d63666-4caa-4eaa-9128-6e21189b0932 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.031 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Resumed (Lifecycle Event)
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.033 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.033 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.036 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance spawned successfully.
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.036 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.056 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.062 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.069 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.070 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.070 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.070 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.071 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.071 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.079 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.079 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936028.0306618, 52d63666-4caa-4eaa-9128-6e21189b0932 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.079 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Started (Lifecycle Event)
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.098 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.102 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.132 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.151 187212 INFO nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Took 1.54 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.151 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.216 187212 INFO nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Took 2.13 seconds to build instance.
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.247 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.372 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updating instance_info_cache with network_info: [{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.425 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Releasing lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.425 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance network_info: |[{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.429 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start _get_guest_xml network_info=[{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.432 187212 WARNING nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.436 187212 DEBUG nova.virt.libvirt.host [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.437 187212 DEBUG nova.virt.libvirt.host [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.441 187212 DEBUG nova.virt.libvirt.host [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.441 187212 DEBUG nova.virt.libvirt.host [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.442 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.442 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.443 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.443 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.443 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.443 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.444 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.444 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.444 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.445 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.445 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.445 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.448 187212 DEBUG nova.virt.libvirt.vif [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2038537603',display_name='tempest-ServersTestJSON-server-2038537603',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-2038537603',id=17,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMtzxCx5WYGaho1uT1vhlFLxIivbdWss7SksmXXR8og/kbnuLPZgB17Trvp/z6Y5aD5/yAlqaXubyiqNS0bESVauUglSuMwk6CT9qVsDlZeY1DXt7lCJ98WxGxUuXIYIrA==',key_name='tempest-keypair-1060401215',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a211e57445104139baeb5ca8fa933c58',ramdisk_id='',reservation_id='r-059q99bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-2138545093',owner_user_name='tempest-ServersTestJSON-2138545093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4aa579f9c54f43039ef96c870ed5e049',uuid=bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.449 187212 DEBUG nova.network.os_vif_util [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converting VIF {"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.449 187212 DEBUG nova.network.os_vif_util [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.451 187212 DEBUG nova.objects.instance [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lazy-loading 'pci_devices' on Instance uuid bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.573 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <uuid>bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d</uuid>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <name>instance-00000011</name>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestJSON-server-2038537603</nova:name>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:28</nova:creationTime>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:28 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:28 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:28 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:28 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:28 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:28 compute-0 nova_compute[187208]:         <nova:user uuid="4aa579f9c54f43039ef96c870ed5e049">tempest-ServersTestJSON-2138545093-project-member</nova:user>
Dec 05 12:00:28 compute-0 nova_compute[187208]:         <nova:project uuid="a211e57445104139baeb5ca8fa933c58">tempest-ServersTestJSON-2138545093</nova:project>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:00:28 compute-0 nova_compute[187208]:         <nova:port uuid="e56fa29b-453e-4140-997d-96c0de8ed4bb">
Dec 05 12:00:28 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <entry name="serial">bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d</entry>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <entry name="uuid">bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d</entry>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.config"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:37:62:a3"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <target dev="tape56fa29b-45"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/console.log" append="off"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:28 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:28 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:28 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:28 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:28 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.574 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Preparing to wait for external event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.574 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.575 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.575 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.575 187212 DEBUG nova.virt.libvirt.vif [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2038537603',display_name='tempest-ServersTestJSON-server-2038537603',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-2038537603',id=17,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMtzxCx5WYGaho1uT1vhlFLxIivbdWss7SksmXXR8og/kbnuLPZgB17Trvp/z6Y5aD5/yAlqaXubyiqNS0bESVauUglSuMwk6CT9qVsDlZeY1DXt7lCJ98WxGxUuXIYIrA==',key_name='tempest-keypair-1060401215',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a211e57445104139baeb5ca8fa933c58',ramdisk_id='',reservation_id='r-059q99bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-2138545093',owner_user_name='tempest-ServersTestJSON-2138545093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4aa579f9c54f43039ef96c870ed5e049',uuid=bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.576 187212 DEBUG nova.network.os_vif_util [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converting VIF {"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.576 187212 DEBUG nova.network.os_vif_util [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.577 187212 DEBUG os_vif [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.578 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.578 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.582 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.582 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape56fa29b-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.583 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape56fa29b-45, col_values=(('external_ids', {'iface-id': 'e56fa29b-453e-4140-997d-96c0de8ed4bb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:62:a3', 'vm-uuid': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.584 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:28 compute-0 NetworkManager[55691]: <info>  [1764936028.5856] manager: (tape56fa29b-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.587 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.591 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.592 187212 INFO os_vif [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45')
Dec 05 12:00:28 compute-0 podman[215482]: 2025-12-05 12:00:28.693720781 +0000 UTC m=+0.064026087 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.759 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.759 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.760 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] No VIF found with MAC fa:16:3e:37:62:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:00:28 compute-0 nova_compute[187208]: 2025-12-05 12:00:28.760 187212 INFO nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Using config drive
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.042 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936014.0413435, 48f123c5-f925-4f6f-94e5-d109e25ef206 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.042 187212 INFO nova.compute.manager [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] VM Stopped (Lifecycle Event)
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.067 187212 DEBUG nova.compute.manager [None req-0a6319e9-ff30-44f6-8c43-678eb17e7ac9 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.186 187212 INFO nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Creating config drive at /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.config
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.191 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8yr3tqos execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.316 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8yr3tqos" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:29 compute-0 kernel: tape56fa29b-45: entered promiscuous mode
Dec 05 12:00:29 compute-0 systemd-udevd[215477]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:00:29 compute-0 ovn_controller[95610]: 2025-12-05T12:00:29Z|00076|binding|INFO|Claiming lport e56fa29b-453e-4140-997d-96c0de8ed4bb for this chassis.
Dec 05 12:00:29 compute-0 ovn_controller[95610]: 2025-12-05T12:00:29Z|00077|binding|INFO|e56fa29b-453e-4140-997d-96c0de8ed4bb: Claiming fa:16:3e:37:62:a3 10.100.0.3
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.370 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:29 compute-0 NetworkManager[55691]: <info>  [1764936029.3747] manager: (tape56fa29b-45): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.377 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:62:a3 10.100.0.3'], port_security=['fa:16:3e:37:62:a3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a211e57445104139baeb5ca8fa933c58', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4dad292-a18a-4c80-b443-fe4ecc60c1b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eb759a3-016c-413a-81bd-572c3bccb661, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e56fa29b-453e-4140-997d-96c0de8ed4bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.380 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e56fa29b-453e-4140-997d-96c0de8ed4bb in datapath 16e72b69-f48e-48c4-b5b8-b2731e24f397 bound to our chassis
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.384 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16e72b69-f48e-48c4-b5b8-b2731e24f397
Dec 05 12:00:29 compute-0 NetworkManager[55691]: <info>  [1764936029.3922] device (tape56fa29b-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:00:29 compute-0 NetworkManager[55691]: <info>  [1764936029.3929] device (tape56fa29b-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:00:29 compute-0 ovn_controller[95610]: 2025-12-05T12:00:29Z|00078|binding|INFO|Setting lport e56fa29b-453e-4140-997d-96c0de8ed4bb ovn-installed in OVS
Dec 05 12:00:29 compute-0 ovn_controller[95610]: 2025-12-05T12:00:29Z|00079|binding|INFO|Setting lport e56fa29b-453e-4140-997d-96c0de8ed4bb up in Southbound
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.396 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b7064ee2-eb55-4991-b410-f517f1888bb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.397 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap16e72b69-f1 in ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.397 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.399 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap16e72b69-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.399 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92c28497-8ab6-4f20-8178-1cc027a4ef11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.401 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[024ab82c-3cfa-4e5b-9266-1ef25b0faa44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.418 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa8145f-6fc6-43b8-9470-c9565e31a8a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 systemd-machined[153543]: New machine qemu-18-instance-00000011.
Dec 05 12:00:29 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000011.
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.435 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[90f9faea-74be-41a9-8574-be3c481b2953]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.491 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[afb02701-4d8f-4517-853c-32ec860605ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.510 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc80bd8-133e-456c-8a3f-6e22055856d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 systemd-udevd[215519]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:00:29 compute-0 NetworkManager[55691]: <info>  [1764936029.5119] manager: (tap16e72b69-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.558 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3afc10b4-5f6c-4695-8e01-5b9564b2dd1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.561 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5b7d0b-5329-43c7-afdf-b4bdb31f9bf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 NetworkManager[55691]: <info>  [1764936029.5952] device (tap16e72b69-f0): carrier: link connected
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.597 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4ebf68-b402-4d64-911b-edf2bcac87c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.613 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e39357e-1d6d-4b9a-822c-1dc3d7f24d12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16e72b69-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:3e:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341015, 'reachable_time': 25571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215553, 'error': None, 'target': 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.629 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ee418151-51cb-4d6d-9e6f-54291ede2892]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:3ef0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341015, 'tstamp': 341015}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215554, 'error': None, 'target': 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.643 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4de32a91-f2e6-4db8-ae51-2be4d991817f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16e72b69-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:3e:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341015, 'reachable_time': 25571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215555, 'error': None, 'target': 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.671 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0d15c4-7bae-42a0-9e2f-716be73a4021]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.727 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[94c21476-cd3c-4f70-94b2-ae8f3f219525]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.729 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16e72b69-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.729 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.730 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16e72b69-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:29 compute-0 NetworkManager[55691]: <info>  [1764936029.7335] manager: (tap16e72b69-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.741 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:29 compute-0 kernel: tap16e72b69-f0: entered promiscuous mode
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.745 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16e72b69-f0, col_values=(('external_ids', {'iface-id': 'ed62467c-0aee-45a7-a6b0-252916dfc244'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:29 compute-0 ovn_controller[95610]: 2025-12-05T12:00:29Z|00080|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec 05 12:00:29 compute-0 nova_compute[187208]: 2025-12-05 12:00:29.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.759 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/16e72b69-f48e-48c4-b5b8-b2731e24f397.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/16e72b69-f48e-48c4-b5b8-b2731e24f397.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.760 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[79a59751-2087-4bd9-87bf-b2945b9ee770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.761 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-16e72b69-f48e-48c4-b5b8-b2731e24f397
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/16e72b69-f48e-48c4-b5b8-b2731e24f397.pid.haproxy
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 16e72b69-f48e-48c4-b5b8-b2731e24f397
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:00:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.762 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'env', 'PROCESS_TAG=haproxy-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/16e72b69-f48e-48c4-b5b8-b2731e24f397.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.023 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936030.0233805, bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.024 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] VM Started (Lifecycle Event)
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.046 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.050 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936030.023566, bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.050 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] VM Paused (Lifecycle Event)
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.078 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.082 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.102 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.299 187212 DEBUG nova.compute.manager [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.300 187212 DEBUG nova.compute.manager [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing instance network info cache due to event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.302 187212 DEBUG oslo_concurrency.lockutils [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.302 187212 DEBUG oslo_concurrency.lockutils [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:30 compute-0 nova_compute[187208]: 2025-12-05 12:00:30.302 187212 DEBUG nova.network.neutron [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:00:30 compute-0 podman[215594]: 2025-12-05 12:00:30.282949246 +0000 UTC m=+0.036033969 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:00:30 compute-0 podman[215594]: 2025-12-05 12:00:30.371462029 +0000 UTC m=+0.124546692 container create 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:00:30 compute-0 systemd[1]: Started libpod-conmon-7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8.scope.
Dec 05 12:00:30 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:00:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5b1d7643321fc89fc61c887a583f48ca73e0a371a4b5fe52022732576250580/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:00:30 compute-0 podman[215594]: 2025-12-05 12:00:30.449166535 +0000 UTC m=+0.202251228 container init 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:00:30 compute-0 podman[215594]: 2025-12-05 12:00:30.455413943 +0000 UTC m=+0.208498606 container start 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 12:00:30 compute-0 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [NOTICE]   (215613) : New worker (215615) forked
Dec 05 12:00:30 compute-0 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [NOTICE]   (215613) : Loading success.
Dec 05 12:00:31 compute-0 nova_compute[187208]: 2025-12-05 12:00:31.364 187212 DEBUG nova.compute.manager [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-deleted-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:31 compute-0 nova_compute[187208]: 2025-12-05 12:00:31.365 187212 DEBUG nova.compute.manager [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-changed-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:31 compute-0 nova_compute[187208]: 2025-12-05 12:00:31.365 187212 DEBUG nova.compute.manager [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Refreshing instance network info cache due to event network-changed-e56fa29b-453e-4140-997d-96c0de8ed4bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:00:31 compute-0 nova_compute[187208]: 2025-12-05 12:00:31.366 187212 DEBUG oslo_concurrency.lockutils [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:31 compute-0 nova_compute[187208]: 2025-12-05 12:00:31.366 187212 DEBUG oslo_concurrency.lockutils [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:31 compute-0 nova_compute[187208]: 2025-12-05 12:00:31.366 187212 DEBUG nova.network.neutron [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Refreshing network info cache for port e56fa29b-453e-4140-997d-96c0de8ed4bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:00:32 compute-0 nova_compute[187208]: 2025-12-05 12:00:32.236 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:32 compute-0 nova_compute[187208]: 2025-12-05 12:00:32.312 187212 DEBUG nova.network.neutron [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updated VIF entry in instance network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:00:32 compute-0 nova_compute[187208]: 2025-12-05 12:00:32.313 187212 DEBUG nova.network.neutron [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:32 compute-0 nova_compute[187208]: 2025-12-05 12:00:32.331 187212 DEBUG oslo_concurrency.lockutils [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:32 compute-0 nova_compute[187208]: 2025-12-05 12:00:32.446 187212 DEBUG nova.network.neutron [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updated VIF entry in instance network info cache for port e56fa29b-453e-4140-997d-96c0de8ed4bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:00:32 compute-0 nova_compute[187208]: 2025-12-05 12:00:32.446 187212 DEBUG nova.network.neutron [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updating instance_info_cache with network_info: [{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:32 compute-0 nova_compute[187208]: 2025-12-05 12:00:32.487 187212 DEBUG oslo_concurrency.lockutils [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:33 compute-0 nova_compute[187208]: 2025-12-05 12:00:33.585 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:33 compute-0 nova_compute[187208]: 2025-12-05 12:00:33.774 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:33 compute-0 nova_compute[187208]: 2025-12-05 12:00:33.775 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:33 compute-0 nova_compute[187208]: 2025-12-05 12:00:33.797 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:00:33 compute-0 nova_compute[187208]: 2025-12-05 12:00:33.878 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:33 compute-0 nova_compute[187208]: 2025-12-05 12:00:33.879 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:33 compute-0 nova_compute[187208]: 2025-12-05 12:00:33.887 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:00:33 compute-0 nova_compute[187208]: 2025-12-05 12:00:33.887 187212 INFO nova.compute.claims [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.039 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936019.0387871, 04518502-62f1-44c3-8c57-b3404958536f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.040 187212 INFO nova.compute.manager [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] VM Stopped (Lifecycle Event)
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.061 187212 DEBUG nova.compute.manager [None req-e8f0d9d6-36a8-4bda-b42b-a6593efea64c - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.090 187212 DEBUG nova.compute.provider_tree [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.104 187212 DEBUG nova.scheduler.client.report [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.128 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.129 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.169 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.170 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.192 187212 INFO nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.207 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.294 187212 INFO nova.compute.manager [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Rebuilding instance
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.308 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.310 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.310 187212 INFO nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Creating image(s)
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.310 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.311 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.311 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.328 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.401 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.402 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.403 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.414 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.473 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.474 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.513 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.513 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.514 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.530 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.553 187212 DEBUG nova.compute.manager [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.568 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.569 187212 DEBUG nova.virt.disk.api [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.569 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.602 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'pci_requests' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.616 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'pci_devices' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.621 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.622 187212 DEBUG nova.virt.disk.api [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.622 187212 DEBUG nova.objects.instance [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid d95c0324-d1d3-4960-9ab7-3a2a098a9f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.635 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'resources' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.636 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.636 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Ensure instance console log exists: /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.637 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.637 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.637 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.645 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'migration_context' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.656 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.660 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.886 187212 DEBUG nova.policy [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:00:34 compute-0 ovn_controller[95610]: 2025-12-05T12:00:34Z|00081|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:00:34 compute-0 ovn_controller[95610]: 2025-12-05T12:00:34Z|00082|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec 05 12:00:34 compute-0 ovn_controller[95610]: 2025-12-05T12:00:34Z|00083|binding|INFO|Releasing lport 79bf1a96-6e90-41b7-8356-9756185de59f from this chassis (sb_readonly=0)
Dec 05 12:00:34 compute-0 nova_compute[187208]: 2025-12-05 12:00:34.946 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:35 compute-0 ovn_controller[95610]: 2025-12-05T12:00:35Z|00084|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:00:35 compute-0 ovn_controller[95610]: 2025-12-05T12:00:35Z|00085|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec 05 12:00:35 compute-0 ovn_controller[95610]: 2025-12-05T12:00:35Z|00086|binding|INFO|Releasing lport 79bf1a96-6e90-41b7-8356-9756185de59f from this chassis (sb_readonly=0)
Dec 05 12:00:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:35.053 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:35.057 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.068 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.105 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.106 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.155 187212 DEBUG nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.155 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.155 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Processing event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.157 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.157 187212 DEBUG nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] No waiting events found dispatching network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.157 187212 WARNING nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received unexpected event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb for instance with vm_state building and task_state spawning.
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.158 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.164 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936035.1645408, bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.165 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] VM Resumed (Lifecycle Event)
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.167 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.172 187212 INFO nova.virt.libvirt.driver [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance spawned successfully.
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.174 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.185 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:35 compute-0 podman[215640]: 2025-12-05 12:00:35.196936379 +0000 UTC m=+0.052133476 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.194 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.208 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.209 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.210 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.210 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.210 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.211 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.275 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.310 187212 INFO nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Took 11.04 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.311 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.373 187212 INFO nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Took 11.67 seconds to build instance.
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.391 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.430 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.430 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.431 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.431 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.678 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:35.763 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0d7840929940419ee8ea88b703037ee31cfdee552fea31f4c91af9a9732801d7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 05 12:00:35 compute-0 nova_compute[187208]: 2025-12-05 12:00:35.796 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Successfully created port: 47612a1a-e470-434b-927c-8fcd6c2fbe4e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:00:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:36.059 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:36 compute-0 nova_compute[187208]: 2025-12-05 12:00:36.891 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936021.8897574, 5150eaf5-c0ca-48ab-9045-af5a1c785c8e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:36 compute-0 nova_compute[187208]: 2025-12-05 12:00:36.892 187212 INFO nova.compute.manager [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] VM Stopped (Lifecycle Event)
Dec 05 12:00:36 compute-0 nova_compute[187208]: 2025-12-05 12:00:36.947 187212 DEBUG nova.compute.manager [None req-3d62cb01-e3b2-4923-bd48-a2cb40bdfdd6 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:37.230 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 05 Dec 2025 12:00:35 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-0db9964f-1820-4db9-887c-2ed75b418e20 x-openstack-request-id: req-0db9964f-1820-4db9-887c-2ed75b418e20 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 05 12:00:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:37.230 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "09233d41-3279-4f39-ac6e-a21662b4f176", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}]}, {"id": "dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 05 12:00:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:37.230 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-0db9964f-1820-4db9-887c-2ed75b418e20 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 05 12:00:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:37.232 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0d7840929940419ee8ea88b703037ee31cfdee552fea31f4c91af9a9732801d7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.537 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.601 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.602 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.603 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.603 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.603 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.604 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.604 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.629 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.629 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.630 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.630 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.745 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.808 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.809 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.829 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Successfully updated port: 47612a1a-e470-434b-927c-8fcd6c2fbe4e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.850 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.850 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquired lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.851 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.867 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.874 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.930 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.931 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:37 compute-0 nova_compute[187208]: 2025-12-05 12:00:37.995 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.001 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.057 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.059 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.120 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.127 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.155 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.190 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.191 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.252 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.258 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.262 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Fri, 05 Dec 2025 12:00:37 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-09d873fe-7b46-42b6-a563-314e2b893c8c x-openstack-request-id: req-09d873fe-7b46-42b6-a563-314e2b893c8c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.262 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.262 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f used request id req-09d873fe-7b46-42b6-a563-314e2b893c8c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.263 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'name': 'tempest-ServersAdmin275Test-server-1823558123', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000012', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6d28e47b844b47238fb8386dae6c546e', 'user_id': '3a90749503e34bda87974b2c22626de0', 'hostId': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.265 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'name': 'tempest-ServersAdminTestJSON-server-1562123791', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98815fe6b9ea4988abc2cccd9726dc86', 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'hostId': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.267 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'name': 'tempest-ServersAdminTestJSON-server-1785289561', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98815fe6b9ea4988abc2cccd9726dc86', 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'hostId': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.268 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'name': 'tempest-ServersTestJSON-server-2038537603', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000011', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a211e57445104139baeb5ca8fa933c58', 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'hostId': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.270 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'name': 'tempest-ServersAdminTestJSON-server-720093205', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000010', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98815fe6b9ea4988abc2cccd9726dc86', 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'hostId': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.271 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'hostId': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.272 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '16d2f26b00364f84b1702bb7219b8d31', 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'hostId': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.272 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.296 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.latency volume: 275954440 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.297 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.latency volume: 5231646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.319 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.latency volume: 548477667 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.320 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.latency volume: 25050026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.351 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.latency volume: 293495470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.351 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.latency volume: 24220875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.355 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.358 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.387 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.latency volume: 191881453 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.387 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.latency volume: 718160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.413 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.latency volume: 203180993 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.413 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.latency volume: 23551307 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.421 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.429 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.442 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.latency volume: 275895128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.443 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.latency volume: 29196254 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.474 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.latency volume: 572759749 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.474 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.latency volume: 39616245 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea25c099-7922-43c0-a91f-fb1c4f8e463b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 275954440, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03a51cf4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': 'ef21f6c4907a28df102b4b97d7a358af966b451295ec63b9fa5f24ab4d1a30a8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5231646, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 
'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03a530d6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '56d938fc0ba8814f40efb00999f63007d48ef23ec5b1bc15b92db5d1bdca7f34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 548477667, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03a887ea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': 'af64c65b95a50fb0f9dc4586814e9aebe47381570eb68579d6fcd565fc5651e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25050026, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03a893ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': 'b225f25d367bd57b362d84b5e75be71c309dadab4de4e7cf1f306353c13f2fd0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 293495470, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 
'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03ad56da-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': 'f69e3deed95380c9d6d91b67868f15efbca27fb1ee189b7d3fa4c33aa40b95b7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24220875, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03ad61a2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '79a5529c3387874e476d9f8867d7f7a9112df14ca5751fb9c711be1ca26053ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 191881453, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03b2dbaa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '3b7de2bd0daa8c1701b34a61a952c355c56ff6ae11c853adb29c39491fdae355'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 718160, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 
'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03b2e85c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '1f37671353e101af48cb8f806b628ee5e4c4b52a6da37bf86a174af28ae64f6f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 203180993, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03b6d03e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'b1782155c1f6742db0b0c00dc6570aa571fd5061ae65a267cbac88cd44036726'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23551307, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03b6db6a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '5e92a8f476e006504d314c339a3b8fa734ef77f708cae7ad0c6d96cbff5ab137'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 275895128, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': 
None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03bb5ec4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '2fb5723ed3563d1fe9477acc98534b5f5130e076e5b44ae0fcba07796536ba57'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 29196254, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name'
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: : 'sda'}, 'message_id': '03bb6ef0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '93da2b4b5ee2d712e395817e9ee93ad801666932a3529c86480daaca1be84def'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 572759749, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c01a86-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '3a91b488dd74af71f02d7e9db6875642d72d468619e1384fb6b0778927d64b2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39616245, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': 
'2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c02544-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '3fd2055d2ca0c7c968212a1df59aee57793d9cf14008e96f4a948f1b68b1d88d'}]}, 'timestamp': '2025-12-05 12:00:38.474809', '_unique_id': '5af8f6dfd34a4eccb4671312eaa75b8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.487 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.492 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa / tapf194d74d-a9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.492 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.495 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 982a8e69-5181-4847-bdfe-8d4de12bb2e4 / tap380c99a7-94 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.495 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.497 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d / tape56fa29b-45 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.497 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.500 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4e7aec76-673e-48b5-b183-cc9c7a95fd37 / tap75a214ef-2b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.500 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.504 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 / tap9275d01b-3e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.504 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b112dc6-48b4-44ac-8fb0-122210cbf963', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03c2ecca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': 'f1c7057e348ef540e88bea30ba2f0effff592bc9e6b23358f145443dabba9eb9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03c35886-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': 'a74ad0ec201c7b59ac71d1bd863660a32c139f45b3e4ad48ea3dbf117f3e575e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03c3afca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '51c17405c369a67834b41f925a3656bcc9598856e26ebbce5702aaa365b50415'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03c4220c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': 'd15a5e2c6f50ca844e26dc7622f5de456dcd4c7caba4121e0ddde0759b7b4247'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03c4b1ea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '78cdd9cec6479440979e03a1d853c9e29c6ffd7c61f23bb13c3a182ed92d76e8'}]}, 'timestamp': '2025-12-05 12:00:38.504652', '_unique_id': '00237ce6c1a848219e3878b7fb8e4b4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.506 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.506 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.506 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.507 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.507 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.507 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.508 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.508 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.508 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e83a61a7-86b2-4b18-9a43-3bcc8a320413', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03c530fc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '7933392c58636ee107e36747e90a31667b4d1a4881174f95c76dbe61f220e370'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03c53da4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': 'dc10f64ceb2eac437985f27f0cd7913289195f99059b8a1c4d6fdb58162944f7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03c54b1e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '8aae540af3cbadb289ca239d634bd69b05ca3ff82ae1d9d97eae73cff9486560'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03c5541a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '0d8f44693d5daf5a0b3e31c37be30151abadd5c003b023efb767427c3f94aaed'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03c55cda-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': 'b199710f178da0374535ed18a6611187de2d54a8d1708311228c2055e9865a9e'}]}, 'timestamp': '2025-12-05 12:00:38.508981', '_unique_id': 'c37d91822d7c43ad8cae2f001e050bb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.bytes volume: 17154048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.511 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.511 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.bytes volume: 72876032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.511 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.512 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.bytes volume: 72859648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.512 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.512 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.512 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.513 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.bytes volume: 26034176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.513 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.513 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.bytes volume: 72953856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.514 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.514 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.bytes volume: 72892416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.514 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18921d62-561f-48cb-a7fe-924640409c51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 17154048, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c5b388-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '4873610171500068da1f62399948bfd270f98eaa33e1a62b7bb46cc1ef303cb6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 
'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c5bfae-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': 'ba8f5911e4b74e078dfe6fd511c35230a07eda87d15b5308f223bcbd145350d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72876032, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c5ca58-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '0ce6d46dd2724f5ad84d987bcd3a00bbf8aa90b877e127a6b8d747ddcc8bd20f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c5d520-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '481b8eb02c0d44662f7cbcfbdbbb7c4be1487d0071469d82714070019cac6a60'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72859648, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 
'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c5dfac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '22d9ab40ac577cf3e5b2bbc857663886fbd2040e3699c72a947b9134291633b2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'ro
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: ot_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c5ee2a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '5a751fdde318b1ecd332cca560de7f5788d961d956569f150b2969c0a103a55e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c5f7ee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': 'e849937f2abb914342e0679b6ec10ad5208186c8f55531fd6d33b33ae646eccc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.510946', 
'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c60266-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '64ca3d254707b42a45173f7b87a34ce686b919f9505f7cb658f4c2f5f99bdbef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 26034176, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c61134-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'ce58c46054dfa41dbcb4025c42ca6e6ea538ecf9c97ee6df4cebdd6b3e296b15'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c61b0c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '5ba76c438f159fa6b80a0591d1ac5da3f69f529388dcfcfa435493e8e1ab65a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72953856, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 
'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c62552-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '4cadaa9715b8968ffd83ed1887655d9e236a81085d0b7be00b788ff87c491cc7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c62f2a-d1d2-11f0-8572-fa163e006c52', 'monotonic_t
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: ime': 3419.034940067, 'message_signature': 'c7f9960c08eef33e48cd48124739bc968d2346d781bfc121afe84e36fc41bbc4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72892416, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c63d4e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'e1b45f8b562cd9aa9520a658c4368dd6356f6aa5c2388a7f0a675e0208f1c1a8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 
'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c64744-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '03b0d5a6ae38733b6f8619130f26f4597eba37865c448a2ba908dfc0a715d2e5'}]}, 'timestamp': '2025-12-05 12:00:38.515011', '_unique_id': 'be7233566b93449abe82a524eba1a090'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.517 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.517 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.requests volume: 838 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.518 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.518 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.requests volume: 1088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.518 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.519 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.requests volume: 1054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.519 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.519 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.519 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.520 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.requests volume: 1034 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.520 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.520 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.requests volume: 1104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.521 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.521 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.requests volume: 1133 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.521 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.521 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.522 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'accc251c-b1c8-4236-bffb-257ae26c221d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 838, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c6c160-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '857543cfcefd636e7e2b1f9d6d9df079f7e55fb20406c7bd0c15322a1b717a47'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 
'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c6cdb8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '862e83ae44944604bb7444bde126875d220c812c961ffbdceb75f5adfd530db3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1088, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c6db64-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '9b98fcf10cbcfe0d1d31263d2d542f457e93813eb36907a345991f060734db70'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c6e55a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '2e6ca7e5c2f70859218afcdc99dfe8a405196aac19a5330168902918ddd65633'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1054, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 
'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c6efdc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '48a9cb4e564c5d7f62b32d10dd21c1052af4e48a49dd0ad8d342b824a01572cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 28, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c6f900-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': 'bc840329355ca347a04d560f06e0e1fc0e108f547480138504ae3919e7264ae4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c706c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '3fd8681c2cbf4ab1ad46529514f5c6601b15447c7a692ff31f8591dbaebf8ac8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 
'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c70fee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '7502c8be3c9d32b2b32e0f00ff08857da6f4c2b8df433089068611d58374a513'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1034, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c71a34-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'ea8710599d6ece4430575a69b44d9c8ee79e36e4d8cfe9aa84179a1fa33eeff3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c72402-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'fd92f7058704174e2961dabea841f4213fb8fd4a5f7ab97aaa80ac55fba0bff1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1104, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 
'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c7330c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': 'e879040e864430780336a8665b00a18d447dee86189b1438570b756ddebca434'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 
0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'd
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: isk_name': 'sda'}, 'message_id': '03c73fe6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '6bc3bc48441d65ff24c5587886a1e3c584fdcfdacf1e6bce8ed449d85a6c0797'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1133, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c74a0e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'c738b231a290d34d24e027436894eb1a44964adb3cccf93a84718e9a9a26be2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': 
'2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c758aa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'ae7560d975a82772c3c137962a6022cd09ce64fa1ef6b0cb6e570c595db2fb2d'}]}, 'timestamp': '2025-12-05 12:00:38.522040', '_unique_id': 'bc157b52d87240f494ae4bbd5ac84d74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.545 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.546 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.555 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.556 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.570 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.570 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.582 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.584 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.585 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.595 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.596 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.596 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.611 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.612 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.622 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.622 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0ac272a-e50f-4798-a216-326340bf6bd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03cb0ab8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': 'f8ec4601230c08f066db4e62f4a9752e0bc9a7cc324b631d7bd356e20e2d8765'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': 
'52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03cb1e18-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': 'ef9a7c83a662a05bf67e1a921bae72c786a2f3999ef7945918294a42a0381125'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03cc9b62-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': 'ad6255f7e12c6782f4619dbfe4766f9618bfef60a491614abc520d26b79e07ba'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03ccace2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': '706274286121935611b6a19190f5e359748b9aa5cdcc2246479c1b4cda60920f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 
'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03cece00-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': 'f25bdca3a92f09fdba9360cdd14519bb4c97fff94d653356b60649b5d6be1ba9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_na
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: me': 'sda'}, 'message_id': '03ced8dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': 'ae6f86bd744b410c14e2f53f00cf9c32f4a9c58c7e531df439cee65dcf02035c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03d0fa72-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '80a711a346909825eb8d7796c6b41fce1c78c08445b74cfc34c6d383954358f4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': 
{'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03d10c6a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '2faf34bddea40138e02820499a773659a216971eb114732bec035f402b2d9cf8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03d2b5ec-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': '787fee2049f6bdf55f93963cbbd18d5f77636483464078abeed87ef4ceede660'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03d2c1c2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': '5a919137c299f63787c64f4ab9de8e24adbaafc3747f3caa07fbbefa8ccb2720'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': 
'2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03d5270a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.217814101, 'message_signature': '5d7c8f6e3a4587100c30295e3219a20edd6fc028d1f1d2f44601e4b3fbafa3fa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03d53272-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.217814101, 'message_si
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: gnature': 'dd64bf683cbb3d781c1720467944b93b3afd6ba48a0b027cf016c108073e9dae'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03d6b5d4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '6e65ad472f2ef89560fc0f31378421aaa804e6a7449543eacea59e539f9938e6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 
'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03d6c1e6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '997e4caa9f22a21a1d0501d53d92e10ac87081f475e7292a71625bb47e3812b2'}]}, 'timestamp': '2025-12-05 12:00:38.622985', '_unique_id': '48054cd6e19f405bb96e7c594cc0ff29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.624 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.627 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.627 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.627 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.628 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.628 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ffd8d66-2a9c-4a79-9e06-f7e8c3cee6a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03d771f4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '6602d44df4996678c1e969bbd1487e59500d0bc384cd01c253ee5d0dd207558d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03d77b18-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': 'dfdec2f65e437bef52017dc8d3962fd592af796ab6c602f69765f2432b2b9f64'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03d78a36-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '404b67b878a1d43617849b741d57106728b2c2b6d806266b091ffe2b77b9f2d9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03d795da-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': 'c50e87c9bf3d11f4fa9a7dce93c1ddc90c7de66259956ad7aa1cdf632781cd64'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03d7a0ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': 'b1b607a4c77a9c9fe956fb05eda597387e8f14975af75a25b881b159ee48cfa3'}]}, 'timestamp': '2025-12-05 12:00:38.628685', '_unique_id': 'faf7fb3884fd4e139832d657ed2722eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.631 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.632 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.632 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.632 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.633 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f15334b-45d7-4c2a-b87a-11b5d4844f7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03d82a72-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '238dc83090981537d84c5c9f558c448600b5049f7221cc2a4f184a9d81e911a5'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03d83620-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': 'e21f16743172c92d6290eab70b2931b6b1caa84fb8911b628c38c7a1cfe77cd8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03d84372-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '84f933b65d16b945250dde57d67824db883bd04d92e4c8fc13d56d3d2114b029'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03d84ebc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': 'a5cf264ffcdc56a22594e79615aa6f2ca29568415ca742d91193c7f6d10c69d2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03d85c5e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '62763996172660ac496adfd692739cb5b4e7588c5fee22434b2a4d7a1542c74d'}]}, 'timestamp': '2025-12-05 12:00:38.633485', '_unique_id': '3e95d1448aa54385865eee4a657e4c22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.636 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.636 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.637 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.637 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.637 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '132d3a01-37bb-4254-8cf6-217b96d30908', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03d8e11a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': 'e975572e7b159b8ce5d9a879aa422d97322d957ed98355e657af96f06144b7a0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03d8ee26-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '2e88116828c703ff058d29265a4c77f1b9a58f7a43d69e72fe67328ce9864039'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03d8f92a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '7ba1d5d9e7937ba2d7fe237801894745abee68bcb6d63e7f3f79d91c0e9a39ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03d905c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '69791f2c0873951a1ce3af0261fc12bf193496a67dd9959264d8ac6a84a3ac4c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03d91676-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '7f1c85c1546c5422073befc6d61a586b51f7e149fbd46d0812716aeb4057cfc2'}]}, 'timestamp': '2025-12-05 12:00:38.638256', '_unique_id': '89a4dba561dc49af84fc71e511b37d0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.639 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.642 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.642 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.642 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.643 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.643 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aee12e8-e692-4e71-9c51-cbc43772437c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03d9bfb8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': 'e92ea10d2ab9817ec8abc33aa549fe970489190d7331a52c026ee1293965ea83'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03d9cb0c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '33b870e2ea91f128328b472eca2c6ddcf011dc62cacf245b87c0ddc3030f49c0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03d9d7be-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': 'db0d01415042e9bb28a7c347c3e7c084a9d598ac5f369c9bb1e793686562e908'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03d9e038-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '3bc735a3e05bfeb1490d3552cea1cab587763258b7237cbad528f9063346ce0c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03d9f028-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '4ac6a2b7e8bb3e761fc1c7d70bef3cffcf3147c9dd7f5c8a8b9b637ec887b066'}]}, 'timestamp': '2025-12-05 12:00:38.643830', '_unique_id': '44e9336d3f284a2da80b83e5f5bc0a4c'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.647 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.648 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.648 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.649 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.649 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37effe1e-806b-4a55-8176-8aac73a44e44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03da9ba4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '8206c301336333a0e913303af107100e99c8d40c9778b828222e6f06f448042e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03daa900-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '4ae4362dc1c65bc5e6cbf0d4466fb2a669ed24cd1ca219acb101305f28abf47f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03dab9cc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': 'a4a69f3c3f890a8e11d950bac56fe1dd289e97890eb22509ba04c992fd2b34ea'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03dac6c4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '7f77f902c7e0457c695c2beb8207681c24f65c10289aa977ef40e08e4f332391'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03dad33a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': 'ee981055c4a1ce8d760d353d020c7a85ca717e3bb13a9e6cdfd50628ff7229f7'}]}, 'timestamp': '2025-12-05 12:00:38.649737', '_unique_id': 'a6cd876f295a4276a86a4b2dc383594c'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.653 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.653 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.requests volume: 164 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.654 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.654 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.654 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.655 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.655 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.655 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.656 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.656 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.requests volume: 236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.656 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.657 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.requests volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.657 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.657 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.requests volume: 310 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.657 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11fb5b8b-0cc5-48ee-92e4-320df53d249d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 164, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03db83ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '5e3cfaa9a99373051231006d4aa7e2aad20953fb40832afcf8cf0c6189b8ce62'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 
'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03db8f64-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '9e22e6ca6db77202345c845170325c75a0ca5713781393bc86833fbf5ee3bdd0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03db9d74-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': 'eefb000370ed6ad9df658939d10f1c2713373c7fb82bfc308a1e3b653a3d5e2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dba5b2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '4f3b71491e9b86f98fce33b9acadd4f2b624e5512d7b464727369ea3984a6621'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 
'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dbb16a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '993fda24a135676ba52a5ad6d99d5459be8d313bd1ba3e7d5d73ef98b95739b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 12
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 8, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dbbff2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '1b242ffb0772dfba0011703bbfe1aa4131360858f8ec92728ffb170a95e7be8e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dbcdc6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '664214605ad941b569fa9fdf74bf07556b75c6586495e90d946b9a0062a73110'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 
'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dbd9a6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '9ef6edb19befbd85455c1e82ba372d5baf3fc584c9f27544683519623be38529'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 236, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dbe50e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '5d6cd4eeda23095daffad6e7194fa65e77f9c3ab78836444170559727a81c5b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dbf2c4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '03d7e6acf27b76460601eb4c65f224d6d06b4ebdacd0ba8b8574a9755b285c5e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 320, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 
'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dbffda-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '06983a2f446ade7a558f4ac37e10e04502b64fbb4b9bc4df5419dd6ca894734d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 
0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'dis
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: k_name': 'sda'}, 'message_id': '03dc0ae8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': 'e03df298c1b0fc99ca65dfcf5ec6387c9bbde12298da85c08ea3ac156f5c5bf5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 310, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dc170e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '665e46704c9303bc4fd13585eb9904421cb7fe74d39f8eb569ee562acec24c87'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': 
'2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dc226c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '12c5ffb385319db7003a157a9372a0a2437014b31c5eaa55b153f5b25e7e0db3'}]}, 'timestamp': '2025-12-05 12:00:38.658213', '_unique_id': 'ad42a5d74f7147c29526e6cb76648a41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.662 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.662 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.662 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.663 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.663 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.663 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '066eff7e-3e42-4b55-9e3e-35e79b32b6f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03dcc654-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '855888e016c891b885bbdaa8225ca75505a8287012fc33b77e1f76c5659b7ce4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03dcd2d4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '91e222cdaa14e7c6be3ebe72c4636f340438d3ca6f82c5e628da4dc9642c3496'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03dcde14-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': 'd42690b36cb0e87b6f70f433be91b6b163f5deac1979e51c6a6637727e640654'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03dce9b8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '073ed2fdc319dc7877174cc4e388d60d0bb91f5715df544881fb5cecb4dee2ce'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03dcf598-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': 'fa50f9b7b673d459500ab77d9c6770a7804a8f724a1a115717610b6f1c4431f7'}]}, 'timestamp': '2025-12-05 12:00:38.663627', '_unique_id': '3d40a08daef94cbcb0b7eba87b4f6490'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.665 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.667 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.667 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.667 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.667 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.668 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.668 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: ot_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c5ee2a-d1d2-11f0-8572-fa163e006 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7425b213-21dc-4cd8-9c81-4d20ed9d0635', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03dd9462-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': 'ed38a294fa2ecf482ffb55cdd91abdf6a5c2b805efe2147b0530ee395ab146ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03dda006-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '9c06f7080de8e586e7ff72480d40f5fea27a2ecddd73489891ac6505f195069f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03ddada8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': 'f9462ccfd706b36814fce5312d486bf273e6414b3b1d3bdf5531349940a97eb6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03ddbb54-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '44f1a790a5531aef47eb3a26be38433e19a47afc3884585ed22e753443ad7429'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03ddcba8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '3e40ba938911e2455da7deb64804cce038541058261a4d29f51b421777a8d0c5'}]}, 'timestamp': '2025-12-05 12:00:38.669123', '_unique_id': '1b31b6bf0f384bbea1ba7cc05d9e7747'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.672 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.672 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.latency volume: 405864442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.674 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.674 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.latency volume: 5228938005 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.674 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.675 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.latency volume: 3763362476 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.675 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.675 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.675 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.676 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.latency volume: 3976242325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.676 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.676 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.latency volume: 4168675680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.677 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.677 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.latency volume: 7049264369 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.677 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 28, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a22e0851-ed3b-4971-9ed0-80d64596d815', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 405864442, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03de9268-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': 'cf93d9dafba0d12568c25b382801c3beb9dc9793972ca3f46c9f462ef191cb8b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 
'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03de9e0c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '569f4c64aaa24077611924002337e07de22618cd853b9128c36e894f601155d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5228938005, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dea6d6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '2a21ef38a14168e9b97ccd84993776a93732273d745df3b7eb666a2e599295c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03deb482-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '887fec3fd2100501ec9dca7d3cbc59333f309ea0309c0c137542d87202d072d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3763362476, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': 
None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03debfea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '13dccc815b258d6da46fb656a814dc5f3f3956eb0013dc5e8d70f85a3fdc5924'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]:  'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03decbde-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '705a5622f64131dee71740090b2ab01ada9cf71dc6ecbc394c3f307fad213a5a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03ded6c4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '127aef846059fe1d67c1e3305123b25cab2db8320a9d3631dc75f1b83034781f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': 
'2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dedf70-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '443b7bcd4d5ee9a7e7d0eb82e67846710092942a90ddf3d906cbb1f2ee09bba6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3976242325, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03deec7c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'fe387280febd96b910083821159873ae17a06e8d9ad0e34aa5e5047bf2b01809'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03def884-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'c17c07d8c6748b77568e75ec2a777bf4f610d7b8c0d01aff13b14c3f27ba8327'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4168675680, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 
'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03df0388-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '35abb078167b3dc3817cf722cb6d0d73e567579088a5f4518134dcdd68388b59'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03df0f
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 04-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '0092daecb36638a5f94b16428884e409003d1f1aa54757d4bd54612c837032f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7049264369, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03df1aa8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'e02c3a4d876773d4ee497e3e72a6dad0c9e5cf394205ed72894345767f250361'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': 
{'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03df255c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'f51f383109c22d7927cde92e8188cb2972999d2f6f4093ef5d3a81791e1f390c'}]}, 'timestamp': '2025-12-05 12:00:38.677947', '_unique_id': '63e7a5fea63645b4ba32c36d1fa72fe8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.679 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.681 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.bytes volume: 25349632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.bytes volume: 55474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.bytes volume: 30292480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.bytes volume: 29227520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.bytes volume: 28912640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.bytes volume: 30648832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.684 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.bytes volume: 31009280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.684 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8faecabd-469a-4002-b1bf-a5e22b6b7db9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25349632, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dfc48a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '10cdbcf7af69375f2c0d36c54ce476a2633853434c11fdb794006602d65c1a8e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55474, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 
'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dfccc8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': 'a9593e789a3bcc8ea24e7c6fecb4ded3d30327d2022722df4bf68bc28e8b93d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30292480, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dfd3e4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': 'dd50577d79bb60f20b19f16b3ff796e22606921ff2c09648452eee4f1b74d35b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dfdad8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '095c7965eb7fd843b43391509d7a8273f7b1dccad9ba82ec1bb7019dea2e721a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29227520, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 
'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dfe1b8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': 'a837b6206bce83179f5eaf0264ba51854be614ff8bea7129007f18ede9cfca8e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb'
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: : 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dfe8c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': 'a0e627199d7ee04e124b0842e6fc1295cc6994d263a9487ef39cf057bfff0c74'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dff234-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '3df30e10a15f17241d2f645960aac856011af39b3e896bee3e7519f8d1137103'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': 
'2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dff96e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '08f5d7bb0df8f1d55d8aaf004417ad5f1776fe4ce9f899cfc35f547b6deece85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28912640, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03e0006c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '1aeea85dc3e924fc0e0de6c3c4325531a1ff2036a75bac6073316c253499a126'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03e00774-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'f7a7a5876fb72853eae51393d2fdb6470e4febc35d7da47bcbefdb47d718d068'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30648832, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 
'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03e00e68-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '5b5e3a49a0644128fc455dca86370510caea294787738c39ae600bcc56e7d4b7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03e0155c-d1d2-11f0-8572-fa163
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '051b2dc001de3319bc07a999b58e268dabc932c4eda681940d97d739e2f6e4c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31009280, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03e01d36-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '9aa3ed12b3898032019fe8778fa9eda7c8fddf796cc43ae61f7d4f95e71cbc99'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 
'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03e0243e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'edb1fea5c110d8f5be9fea2cf130ebed1c0bb8811c1def2ef5f520ea6e4b8aaa'}]}, 'timestamp': '2025-12-05 12:00:38.684460', '_unique_id': 'b69e850149ca47908d4b687bd5c650d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: me': 'sda'}, 'message_id': '03ced8dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_ti [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.701 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/memory.usage volume: 40.45703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.713 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/memory.usage volume: 42.76953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 8, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_ [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.734 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/memory.usage volume: 42.62890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is:  'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03decbde-d [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: : 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dfe8c0-d1d2-11f0-8572-f [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.752 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.761 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.762 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d: ceilometer.compute.pollsters.NoVolumeException
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.775 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/memory.usage volume: 40.4921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.797 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/memory.usage volume: 40.9296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.813 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/memory.usage volume: 42.30859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80adce3e-9262-4bea-8a6f-f4a4db671cc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.45703125, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03e2b622-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.321760055, 'message_signature': 'b97c1f3ce6d45a0063dfcaf601e7ade9dcc66fe1221c9694856318691480e25d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.76953125, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 
'3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03e49a50-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.334417796, 'message_signature': 'a0ccb9170d8c6e50475ce98e77cd7e5d5dfa6beb8fb9e95e837d5774655369e9'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.62890625, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03e7d3f0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.355400074, 'message_signature': 'fbdcd0fd41c55c05cac32c96c376fc461c56be7b313a02d31ac1f338af69651e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4921875, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03ee15d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.39633785, 'message_signature': '43e051b01ec9c14ba4ad07a31c05e99aa36b119b6ce5a90e54a271601dd0819f'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.9296875, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 
'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03f17248-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.41843059, 'message_signature': '48a3142921cc54301eedf699056fadc0b082ea42385d1438cad8fb66f98131ec'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.30859375, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03f4099a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.43491245, 'message_signature': '0f2bf354e506f8b4ade1a4565dd5cd6f6f05
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: b16a715759d6de377e939058571b'}]}, 'timestamp': '2025-12-05 12:00:38.815228', '_unique_id': 'd32e1f0d556e4a5b9b3bfa566905cf40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/cpu volume: 9740000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/cpu volume: 12110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/cpu volume: 12160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/cpu volume: 3270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.819 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/cpu volume: 11060000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.819 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/cpu volume: 11660000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.819 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/cpu volume: 11610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5189173-640c-4222-abed-0bae8ddf318d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9740000000, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f499c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.321760055, 'message_signature': 'e5ba9bf2fb66344133a16d77e20b6e2fae1bdc95bc5e9d725db4114bf3d619ee'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12110000000, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 
'3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4a1fc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.334417796, 'message_signature': 'dbd13288a344231e0fa94cdd64243459b99950d8971626fcd2c825f8798f6862'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12160000000, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4a972-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.355400074, 'message_signature': 'fc0617f7e2bfbee79ed2f64d09001e9ba9e49c866520529bb75c0adcb84a69f6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3270000000, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4b37c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.382517636, 'message_signature': '489c605bbc07db52d74459c3040cf09203293642e44a73d3184b6ec8eb79bc8f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11060000000, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 
'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4bea8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.39633785, 'message_signature': 'c7a95d2695a8e185765a587f48d23390528d0ce24b552512cd5b26af9d93830b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11660000000, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4c696-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.41843059, '
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: message_signature': '4e01042460fcd06d015477d77999aa06e8826b9aa10ebed3b38e62af538ec120'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11610000000, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4cdc6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.43491245, 'message_signature': '26619e08bc40a79d9f2e0ffb2c422b7424e14e6d656d6bc5245a0662f45d53dc'}]}, 'timestamp': '2025-12-05 12:00:38.819890', '_unique_id': '64eba5bdac7242349cfe0035ebf767fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>]
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.823 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.823 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.823 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.823 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.824 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.824 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.824 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.825 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.825 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.825 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.825 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.826 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e0aba0a-65ee-4248-a338-92ec9e6d9b17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f546ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': '0dae6ce365469c3dfc584ccf1d86c2d315b7e10bada2ddb0e44e61b891b7ddfb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': 
'52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f54f62-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': 'd245fed6a79456f52758eab7c429d9b100633c195662e115ba95d2d5440fbd81'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f55a20-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': '545bf0404006c9086ffc8182a9b895c9d4288cf37d4db844ea69dead334587ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f5633a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': 'fe1d262e2c6e8fe182f71cdbb19872fcea18f0c1e8f9384c3bd736bc44fa8151'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': 
'98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f56ac4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': '1cbd1774b9e6c73029bdfbe15150b612a1b681a17bc3c24719f2ce73c538fad7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: f573e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': '0f2bb59e1fe935d938ae2eefa05cea834f6d2302f017f3cc060d88408159a1d2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f57bf4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '6f606828ccf6ae18da6e84c5d56aa7e06c1bb516c2dc9223ac5f7310df496fa6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 
'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f58b94-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '19b3352a4fd2db16c98de87d15e8632dd3dc8dc971d9e042012dc63b75229b3b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f5933c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': 'a034b440c05d550ea81fc738dbdf5c0a772ff0ce014d7d7beee93693d6873f32'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f5a106-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': 'b8409814c57e8a61e7b389717248f1e4e26cefd42e495af2d0eaa58c80e9dd09'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.822754', 
'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f5ae9e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.217814101, 'message_signature': '9ec46d929c826a4d7ec3983426a5a6f2b9f6572b73ee91da2bebd0e9e19461ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f5bc72-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.217814101, 'message_signature': '0c92b818e1c378f373de1d49ae17d4d93297ee019497ce749
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 9db0353d5f8f8f8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f5cb86-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '9f68808588bed8b5282bcb65a4d704d336d7ef8d481a65d3438cca940500e875'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': 
'597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f5d9aa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': 'ccc3cf9686de5c41432ed04b4f63dc74ede97021d637c6362a0e6f7bb16928f6'}]}, 'timestamp': '2025-12-05 12:00:38.826861', '_unique_id': '187a90cf808e416998e1a627c3cfaac3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.830 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.830 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.bytes volume: 1514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.830 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.bytes volume: 1604 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.830 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.831 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.bytes volume: 784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.831 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de9e2a72-3dfe-4c67-a573-c99df085fdab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1514, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03f66d8e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '1b6bb1cc32150dfaf7e41111cbe48f0bfcf16d079ab93f949244525f961c7e07'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1604, 'user_id': 
'1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03f67950-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '7bfabc409afff4e719baf7abbdbf56ba395cfe99e6eddac8b93d976382d65088'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03f68666-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '14ea34aa0ca729fdab559b880c7bc6957995c4817d32c1c77a6b2255c1ef4bbe'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 784, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03f690e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': 'c12f1678819b064e6e3643546f82f36c442de2210d3d9bbb996ee662121d060e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03f69afc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '0aa793697a01e828b570ced5bdc86750323d244be7c42f750436115fe41506ec'}]}, 'timestamp': '2025-12-05 12:00:38.831849', '_unique_id': 'c92d2112dad4434ea958b9eba2972722'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.833 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.833 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.834 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.834 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.834 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.834 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.835 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.835 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.835 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.835 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.836 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.836 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.837 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.837 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.838 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '674af94b-450d-4499-8e18-6f9c7cea8113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f6f948-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': 'cae1421aee19af202d1133b5e438c0a0ce5214cf213040d291a2f17ad02d0008'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': 
'52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f701fe-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': '8f552493e9a49c14e4385f197e81165008d8031df8a1c330f48eff4bd1c38ea0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f70ad2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': '066550a8a213c902b157952cb97478e40cb0b6cb259cd3eb74621233692da1a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f7132e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': '1e497373d14122752c81a138907c0e73f7cc00532337d7332d45555db012bb95'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 
'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f71f90-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': '221ecb6cce6a1d9096956da2ceb0ca6fc391f119c7fec8945c258b27c7b79a3c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': '
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: sda'}, 'message_id': '03f727a6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': 'dd0e13fc7818ef6704f3419b57f54c71a6cd38f08276e01e52c5c57f209f1359'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f72fa8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '2cc78990e4c22950544bd0b029eee1937a13e09b45059f9d4d0de323be9821af'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': 
{'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f73a0c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': 'a370b20be7851507d914a1902767ed6a7b6ed99e5ee50c688bb21872edb77939'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f74916-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': '01c4ef46afc639541bd630926ececccf36ab1d9d3db0787578ffeaf4220980de'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f76220-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': '80f8698a48dab3434ba12917ea2411e1b14171e8fa2ef4c501e62a254ae57215'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': 
'2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f773be-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.217814101, 'message_signature': '844496276426a4f77c5d3570db59ecd9d0693aee362e3d03f85f128aaae186e1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f78192-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.217814101, 'message_signature': 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: '96e6a986c70a8a7b017fa546759cc38fa3459c35f59838ba95d133e8accd6b61'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f792fe-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '41232417bbc791a8dded9dddb1a385d58f53f1b2085e75991c4411b52b2c723f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 
'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f79e34-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '146682dbcb992f1277ac2cd359da855f98a3243723760806ebd0eef58d5b4537'}]}, 'timestamp': '2025-12-05 12:00:38.838448', '_unique_id': 'e4d5ba672a0643dba92cd3b23f6fb9c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:00:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.960 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.962 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4576MB free_disk=73.19064712524414GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.962 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:38 compute-0 nova_compute[187208]: 2025-12-05 12:00:38.962 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: f573e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_ [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: sda'}, 'message_id': '03f727a6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.074 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance caa6c7c3-7eb3-4636-a7ad-7b605ef393ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.075 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.075 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 982a8e69-5181-4847-bdfe-8d4de12bb2e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.075 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.075 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 4e7aec76-673e-48b5-b183-cc9c7a95fd37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.076 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.076 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 52d63666-4caa-4eaa-9128-6e21189b0932 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.076 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance d95c0324-d1d3-4960-9ab7-3a2a098a9f7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.077 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 8 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.077 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1536MB phys_disk=79GB used_disk=8GB total_vcpus=8 used_vcpus=8 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.083 187212 DEBUG nova.compute.manager [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-changed-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.083 187212 DEBUG nova.compute.manager [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Refreshing instance network info cache due to event network-changed-47612a1a-e470-434b-927c-8fcd6c2fbe4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.083 187212 DEBUG oslo_concurrency.lockutils [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.232 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.249 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.268 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.269 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.597 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.597 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.598 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.599 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.599 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.600 187212 INFO nova.compute.manager [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Terminating instance
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.602 187212 DEBUG nova.compute.manager [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:00:39 compute-0 kernel: tap9275d01b-3e (unregistering): left promiscuous mode
Dec 05 12:00:39 compute-0 NetworkManager[55691]: <info>  [1764936039.6261] device (tap9275d01b-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:00:39 compute-0 ovn_controller[95610]: 2025-12-05T12:00:39Z|00087|binding|INFO|Releasing lport 9275d01b-3eb9-429b-a0ba-0cb60048987a from this chassis (sb_readonly=0)
Dec 05 12:00:39 compute-0 ovn_controller[95610]: 2025-12-05T12:00:39Z|00088|binding|INFO|Setting lport 9275d01b-3eb9-429b-a0ba-0cb60048987a down in Southbound
Dec 05 12:00:39 compute-0 ovn_controller[95610]: 2025-12-05T12:00:39Z|00089|binding|INFO|Removing iface tap9275d01b-3e ovn-installed in OVS
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.634 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.648 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:93:9d 10.100.0.7'], port_security=['fa:16:3e:f5:93:9d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d2f26b00364f84b1702bb7219b8d31', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba7f2e39-8114-45e5-bd44-4ae84ab46fc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd38fa62-d49e-4607-8d3e-179b767c8786, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9275d01b-3eb9-429b-a0ba-0cb60048987a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.649 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9275d01b-3eb9-429b-a0ba-0cb60048987a in datapath e5a9559e-b860-47a2-b44b-45c7f67f2119 unbound from our chassis
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.651 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e5a9559e-b860-47a2-b44b-45c7f67f2119, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.652 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ec695521-33d1-41d0-bc3a-9bd60630bf80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.653 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 namespace which is not needed anymore
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.659 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec 05 12:00:39 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000a.scope: Consumed 14.069s CPU time.
Dec 05 12:00:39 compute-0 systemd-machined[153543]: Machine qemu-11-instance-0000000a terminated.
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.715 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updating instance_info_cache with network_info: [{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.726 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.726 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:00:39 compute-0 ovn_controller[95610]: 2025-12-05T12:00:39Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:46:fb 10.100.0.5
Dec 05 12:00:39 compute-0 ovn_controller[95610]: 2025-12-05T12:00:39Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:46:fb 10.100.0.5
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.746 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Releasing lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.746 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance network_info: |[{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.746 187212 DEBUG oslo_concurrency.lockutils [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.747 187212 DEBUG nova.network.neutron [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Refreshing network info cache for port 47612a1a-e470-434b-927c-8fcd6c2fbe4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.749 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start _get_guest_xml network_info=[{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.754 187212 WARNING nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.762 187212 DEBUG nova.virt.libvirt.host [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.762 187212 DEBUG nova.virt.libvirt.host [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.769 187212 DEBUG nova.virt.libvirt.host [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.769 187212 DEBUG nova.virt.libvirt.host [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.770 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.770 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.771 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.771 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.771 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.771 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.772 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.772 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.772 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.773 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.773 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.773 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:39 compute-0 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [NOTICE]   (214635) : haproxy version is 2.8.14-c23fe91
Dec 05 12:00:39 compute-0 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [NOTICE]   (214635) : path to executable is /usr/sbin/haproxy
Dec 05 12:00:39 compute-0 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [WARNING]  (214635) : Exiting Master process...
Dec 05 12:00:39 compute-0 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [ALERT]    (214635) : Current worker (214637) exited with code 143 (Terminated)
Dec 05 12:00:39 compute-0 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [WARNING]  (214635) : All workers exited. Exiting... (0)
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.777 187212 DEBUG nova.virt.libvirt.vif [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1974624987',display_name='tempest-ServersAdminTestJSON-server-1974624987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1974624987',id=19,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-wb0rav5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947
304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:34Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=d95c0324-d1d3-4960-9ab7-3a2a098a9f7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.777 187212 DEBUG nova.network.os_vif_util [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.778 187212 DEBUG nova.network.os_vif_util [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:39 compute-0 systemd[1]: libpod-24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c.scope: Deactivated successfully.
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.779 187212 DEBUG nova.objects.instance [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid d95c0324-d1d3-4960-9ab7-3a2a098a9f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:39 compute-0 podman[215764]: 2025-12-05 12:00:39.787071459 +0000 UTC m=+0.045012074 container died 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.802 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <uuid>d95c0324-d1d3-4960-9ab7-3a2a098a9f7c</uuid>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <name>instance-00000013</name>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdminTestJSON-server-1974624987</nova:name>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:39</nova:creationTime>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:39 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:39 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:39 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:39 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:39 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:39 compute-0 nova_compute[187208]:         <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec 05 12:00:39 compute-0 nova_compute[187208]:         <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:00:39 compute-0 nova_compute[187208]:         <nova:port uuid="47612a1a-e470-434b-927c-8fcd6c2fbe4e">
Dec 05 12:00:39 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <entry name="serial">d95c0324-d1d3-4960-9ab7-3a2a098a9f7c</entry>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <entry name="uuid">d95c0324-d1d3-4960-9ab7-3a2a098a9f7c</entry>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.config"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:45:e2:12"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <target dev="tap47612a1a-e4"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/console.log" append="off"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:39 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:39 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:39 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:39 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:39 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.804 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Preparing to wait for external event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.804 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.805 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.805 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.806 187212 DEBUG nova.virt.libvirt.vif [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1974624987',display_name='tempest-ServersAdminTestJSON-server-1974624987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1974624987',id=19,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-wb0rav5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJ
SON-715947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:34Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=d95c0324-d1d3-4960-9ab7-3a2a098a9f7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.806 187212 DEBUG nova.network.os_vif_util [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.807 187212 DEBUG nova.network.os_vif_util [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.807 187212 DEBUG os_vif [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.809 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.810 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.813 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.813 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.814 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.814 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.814 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.815 187212 INFO nova.compute.manager [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Terminating instance
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.816 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.816 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquired lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.816 187212 DEBUG nova.network.neutron [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.817 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.818 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47612a1a-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.818 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47612a1a-e4, col_values=(('external_ids', {'iface-id': '47612a1a-e470-434b-927c-8fcd6c2fbe4e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:e2:12', 'vm-uuid': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:39 compute-0 NetworkManager[55691]: <info>  [1764936039.8222] manager: (tap47612a1a-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:00:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c-userdata-shm.mount: Deactivated successfully.
Dec 05 12:00:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-15ac9401aab02c277b66f0b8b3e087367793eb4fcc0a66aa2c56cb8b76ba06f3-merged.mount: Deactivated successfully.
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.835 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.836 187212 INFO os_vif [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4')
Dec 05 12:00:39 compute-0 podman[215764]: 2025-12-05 12:00:39.858394423 +0000 UTC m=+0.116335028 container cleanup 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 12:00:39 compute-0 systemd[1]: libpod-conmon-24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c.scope: Deactivated successfully.
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.882 187212 INFO nova.virt.libvirt.driver [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance destroyed successfully.
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.883 187212 DEBUG nova.objects.instance [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lazy-loading 'resources' on Instance uuid 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.904 187212 DEBUG nova.virt.libvirt.vif [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:59:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-147223876',id=10,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d2f26b00364f84b1702bb7219b8d31',ramdisk_id='',reservation_id='r-5f154sc3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:05Z,user_data=None,user_id='d4754b88440a4ea08a37067ef9234672',uuid=597f2994-fdad-46b1-9ef7-f56d62b4bbd0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.905 187212 DEBUG nova.network.os_vif_util [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converting VIF {"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.906 187212 DEBUG nova.network.os_vif_util [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.906 187212 DEBUG os_vif [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.919 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9275d01b-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.926 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.927 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.927 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:45:e2:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.928 187212 INFO nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Using config drive
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.929 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.934 187212 INFO os_vif [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e')
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.934 187212 INFO nova.virt.libvirt.driver [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Deleting instance files /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0_del
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.935 187212 INFO nova.virt.libvirt.driver [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Deletion of /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0_del complete
Dec 05 12:00:39 compute-0 podman[215812]: 2025-12-05 12:00:39.938356543 +0000 UTC m=+0.048252577 container remove 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.943 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[241c11d4-9fa1-4063-bb99-1dc13fe75f11]: (4, ('Fri Dec  5 12:00:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 (24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c)\n24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c\nFri Dec  5 12:00:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 (24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c)\n24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.945 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0a1c5243-4c6d-4fdf-b382-3b2f39abc51a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.946 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5a9559e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 kernel: tape5a9559e-b0: left promiscuous mode
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.961 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.966 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c0daf182-c846-438a-be46-bbe3a28aa075]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.983 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6aabd3-e749-4848-8888-768c4eaaf536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.986 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43f613d8-2b9b-4aa8-996f-ba9f97a0b590]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.999 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936024.9982498, b2e8212c-084c-4a4f-b930-56560ae4da12 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:39 compute-0 nova_compute[187208]: 2025-12-05 12:00:39.999 187212 INFO nova.compute.manager [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] VM Stopped (Lifecycle Event)
Dec 05 12:00:40 compute-0 nova_compute[187208]: 2025-12-05 12:00:40.000 187212 INFO nova.compute.manager [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 12:00:40 compute-0 nova_compute[187208]: 2025-12-05 12:00:40.000 187212 DEBUG oslo.service.loopingcall [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:00:40 compute-0 nova_compute[187208]: 2025-12-05 12:00:40.001 187212 DEBUG nova.compute.manager [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:00:40 compute-0 nova_compute[187208]: 2025-12-05 12:00:40.001 187212 DEBUG nova.network.neutron [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:00:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:40.001 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4e85cc0a-7917-42f5-8aec-5fdf73e731b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337401, 'reachable_time': 44825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215832, 'error': None, 'target': 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:40 compute-0 systemd[1]: run-netns-ovnmeta\x2de5a9559e\x2db860\x2d47a2\x2db44b\x2d45c7f67f2119.mount: Deactivated successfully.
Dec 05 12:00:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:40.004 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:00:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:40.004 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[407bcdf2-95d2-4890-a313-756e5b14ecb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:40 compute-0 nova_compute[187208]: 2025-12-05 12:00:40.017 187212 DEBUG nova.compute.manager [None req-d112b245-f889-4b6c-81c3-9ef6ff76efdf - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:40 compute-0 nova_compute[187208]: 2025-12-05 12:00:40.379 187212 DEBUG nova.network.neutron [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:40 compute-0 nova_compute[187208]: 2025-12-05 12:00:40.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:40 compute-0 NetworkManager[55691]: <info>  [1764936040.5545] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Dec 05 12:00:40 compute-0 NetworkManager[55691]: <info>  [1764936040.5553] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Dec 05 12:00:40 compute-0 nova_compute[187208]: 2025-12-05 12:00:40.643 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:40 compute-0 ovn_controller[95610]: 2025-12-05T12:00:40Z|00090|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:00:40 compute-0 ovn_controller[95610]: 2025-12-05T12:00:40Z|00091|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec 05 12:00:40 compute-0 nova_compute[187208]: 2025-12-05 12:00:40.661 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:41 compute-0 nova_compute[187208]: 2025-12-05 12:00:41.807 187212 DEBUG nova.network.neutron [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:41 compute-0 nova_compute[187208]: 2025-12-05 12:00:41.835 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Releasing lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:41 compute-0 nova_compute[187208]: 2025-12-05 12:00:41.836 187212 DEBUG nova.compute.manager [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:00:41 compute-0 nova_compute[187208]: 2025-12-05 12:00:41.852 187212 INFO nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Creating config drive at /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.config
Dec 05 12:00:41 compute-0 nova_compute[187208]: 2025-12-05 12:00:41.858 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuwuxa0jl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:41 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec 05 12:00:41 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 16.976s CPU time.
Dec 05 12:00:41 compute-0 systemd-machined[153543]: Machine qemu-1-instance-00000001 terminated.
Dec 05 12:00:41 compute-0 nova_compute[187208]: 2025-12-05 12:00:41.982 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuwuxa0jl" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.036 187212 DEBUG nova.compute.manager [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-unplugged-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.037 187212 DEBUG oslo_concurrency.lockutils [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.037 187212 DEBUG oslo_concurrency.lockutils [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.038 187212 DEBUG oslo_concurrency.lockutils [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.038 187212 DEBUG nova.compute.manager [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] No waiting events found dispatching network-vif-unplugged-9275d01b-3eb9-429b-a0ba-0cb60048987a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.038 187212 DEBUG nova.compute.manager [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-unplugged-9275d01b-3eb9-429b-a0ba-0cb60048987a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:00:42 compute-0 kernel: tap47612a1a-e4: entered promiscuous mode
Dec 05 12:00:42 compute-0 ovn_controller[95610]: 2025-12-05T12:00:42Z|00092|binding|INFO|Claiming lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e for this chassis.
Dec 05 12:00:42 compute-0 ovn_controller[95610]: 2025-12-05T12:00:42Z|00093|binding|INFO|47612a1a-e470-434b-927c-8fcd6c2fbe4e: Claiming fa:16:3e:45:e2:12 10.100.0.10
Dec 05 12:00:42 compute-0 NetworkManager[55691]: <info>  [1764936042.0671] manager: (tap47612a1a-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.066 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:42 compute-0 systemd-udevd[215743]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.072 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e2:12 10.100.0.10'], port_security=['fa:16:3e:45:e2:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=47612a1a-e470-434b-927c-8fcd6c2fbe4e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.073 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 47612a1a-e470-434b-927c-8fcd6c2fbe4e in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.076 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:00:42 compute-0 NetworkManager[55691]: <info>  [1764936042.0802] device (tap47612a1a-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:00:42 compute-0 NetworkManager[55691]: <info>  [1764936042.0813] device (tap47612a1a-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:00:42 compute-0 ovn_controller[95610]: 2025-12-05T12:00:42Z|00094|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e ovn-installed in OVS
Dec 05 12:00:42 compute-0 ovn_controller[95610]: 2025-12-05T12:00:42Z|00095|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e up in Southbound
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.084 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.089 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.093 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3b15b4ce-081d-4935-bfde-b692c02f314f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.107 187212 INFO nova.virt.libvirt.driver [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance destroyed successfully.
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.107 187212 DEBUG nova.objects.instance [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'resources' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.110 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.124 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[61f0f223-d298-4fe2-a64b-48942ebe97b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.127 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a88bfd26-2793-4e09-937e-a391f75de20e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.124 187212 INFO nova.virt.libvirt.driver [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Deleting instance files /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba_del
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.125 187212 INFO nova.virt.libvirt.driver [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Deletion of /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba_del complete
Dec 05 12:00:42 compute-0 systemd-machined[153543]: New machine qemu-19-instance-00000013.
Dec 05 12:00:42 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000013.
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.162 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cffd0cc5-ce20-4c06-959d-98862247feb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.183 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e294fd-6bd2-4b9b-b418-c4f5fc9ceede]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215868, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.184 187212 INFO nova.compute.manager [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.185 187212 DEBUG oslo.service.loopingcall [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.185 187212 DEBUG nova.compute.manager [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.185 187212 DEBUG nova.network.neutron [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.201 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[37eab8fc-5d1b-4caa-84ef-5c2dfce1f058]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215872, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215872, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.204 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.206 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.207 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.207 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.207 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.208 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:00:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.208 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.450 187212 DEBUG nova.network.neutron [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updated VIF entry in instance network info cache for port 47612a1a-e470-434b-927c-8fcd6c2fbe4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.451 187212 DEBUG nova.network.neutron [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updating instance_info_cache with network_info: [{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.471 187212 DEBUG oslo_concurrency.lockutils [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.475 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936042.4756083, d95c0324-d1d3-4960-9ab7-3a2a098a9f7c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.476 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] VM Started (Lifecycle Event)
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.497 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.505 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936042.4757717, d95c0324-d1d3-4960-9ab7-3a2a098a9f7c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.505 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] VM Paused (Lifecycle Event)
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.528 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.533 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.551 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.602 187212 DEBUG nova.network.neutron [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.621 187212 INFO nova.compute.manager [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Took 2.62 seconds to deallocate network for instance.
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.656 187212 DEBUG nova.network.neutron [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.678 187212 DEBUG nova.network.neutron [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.686 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.686 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.691 187212 INFO nova.compute.manager [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Took 0.51 seconds to deallocate network for instance.
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.736 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.863 187212 DEBUG nova.compute.provider_tree [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.880 187212 DEBUG nova.scheduler.client.report [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.900 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.902 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.921 187212 INFO nova.scheduler.client.report [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Deleted allocations for instance 597f2994-fdad-46b1-9ef7-f56d62b4bbd0
Dec 05 12:00:42 compute-0 nova_compute[187208]: 2025-12-05 12:00:42.984 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:43 compute-0 nova_compute[187208]: 2025-12-05 12:00:43.067 187212 DEBUG nova.compute.provider_tree [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:43 compute-0 nova_compute[187208]: 2025-12-05 12:00:43.086 187212 DEBUG nova.scheduler.client.report [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:43 compute-0 nova_compute[187208]: 2025-12-05 12:00:43.108 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:43 compute-0 nova_compute[187208]: 2025-12-05 12:00:43.134 187212 INFO nova.scheduler.client.report [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Deleted allocations for instance caa6c7c3-7eb3-4636-a7ad-7b605ef393ba
Dec 05 12:00:43 compute-0 nova_compute[187208]: 2025-12-05 12:00:43.190 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:43 compute-0 podman[215883]: 2025-12-05 12:00:43.207855597 +0000 UTC m=+0.057630975 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 05 12:00:44 compute-0 nova_compute[187208]: 2025-12-05 12:00:44.716 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:00:44 compute-0 nova_compute[187208]: 2025-12-05 12:00:44.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.078 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.079 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.079 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.079 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.079 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] No waiting events found dispatching network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 WARNING nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received unexpected event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a for instance with vm_state deleted and task_state None.
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-changed-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Refreshing instance network info cache due to event network-changed-e56fa29b-453e-4140-997d-96c0de8ed4bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:45 compute-0 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG nova.network.neutron [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Refreshing network info cache for port e56fa29b-453e-4140-997d-96c0de8ed4bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:00:46 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec 05 12:00:46 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000012.scope: Consumed 11.913s CPU time.
Dec 05 12:00:46 compute-0 systemd-machined[153543]: Machine qemu-17-instance-00000012 terminated.
Dec 05 12:00:47 compute-0 nova_compute[187208]: 2025-12-05 12:00:47.242 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:47 compute-0 ovn_controller[95610]: 2025-12-05T12:00:47Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:37:62:a3 10.100.0.3
Dec 05 12:00:47 compute-0 ovn_controller[95610]: 2025-12-05T12:00:47Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:37:62:a3 10.100.0.3
Dec 05 12:00:47 compute-0 nova_compute[187208]: 2025-12-05 12:00:47.730 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance shutdown successfully after 13 seconds.
Dec 05 12:00:47 compute-0 nova_compute[187208]: 2025-12-05 12:00:47.735 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.
Dec 05 12:00:47 compute-0 nova_compute[187208]: 2025-12-05 12:00:47.740 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.
Dec 05 12:00:47 compute-0 nova_compute[187208]: 2025-12-05 12:00:47.741 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deleting instance files /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del
Dec 05 12:00:47 compute-0 nova_compute[187208]: 2025-12-05 12:00:47.742 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deletion of /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del complete
Dec 05 12:00:48 compute-0 podman[215934]: 2025-12-05 12:00:48.186787833 +0000 UTC m=+0.044862560 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.312 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.312 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating image(s)
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.313 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.313 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.314 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.315 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.315 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.673 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "c5241646-e089-40a3-b197-60aff60ea075" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.673 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.692 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.805 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.805 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.811 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:00:48 compute-0 nova_compute[187208]: 2025-12-05 12:00:48.812 187212 INFO nova.compute.claims [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:00:49 compute-0 podman[215953]: 2025-12-05 12:00:49.230625497 +0000 UTC m=+0.090988375 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350)
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.493 187212 DEBUG nova.network.neutron [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updated VIF entry in instance network info cache for port e56fa29b-453e-4140-997d-96c0de8ed4bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.494 187212 DEBUG nova.network.neutron [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updating instance_info_cache with network_info: [{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.511 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.512 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-deleted-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.512 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.512 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.513 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.513 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.513 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Processing event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.514 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.514 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.515 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.515 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.515 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] No waiting events found dispatching network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.515 187212 WARNING nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received unexpected event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e for instance with vm_state building and task_state spawning.
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.518 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.522 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936049.5219681, d95c0324-d1d3-4960-9ab7-3a2a098a9f7c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.522 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] VM Resumed (Lifecycle Event)
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.525 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.541 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.542 187212 INFO nova.virt.libvirt.driver [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance spawned successfully.
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.543 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.552 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.574 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.577 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.577 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.578 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.578 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.579 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.579 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.650 187212 INFO nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Took 15.34 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.650 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.706 187212 DEBUG nova.compute.provider_tree [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.809 187212 INFO nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Took 15.96 seconds to build instance.
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.816 187212 DEBUG nova.scheduler.client.report [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.828 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.838 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.839 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.887 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.887 187212 DEBUG nova.network.neutron [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.909 187212 INFO nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:49 compute-0 nova_compute[187208]: 2025-12-05 12:00:49.925 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.007 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.008 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.008 187212 INFO nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Creating image(s)
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.009 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.009 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.010 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.023 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 ovn_controller[95610]: 2025-12-05T12:00:50Z|00096|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:00:50 compute-0 ovn_controller[95610]: 2025-12-05T12:00:50Z|00097|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.080 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.081 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.082 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.096 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.151 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.152 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.188 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.189 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.190 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.244 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.246 187212 DEBUG nova.virt.disk.api [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Checking if we can resize image /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.246 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.303 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.305 187212 DEBUG nova.virt.disk.api [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Cannot resize image /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.305 187212 DEBUG nova.objects.instance [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lazy-loading 'migration_context' on Instance uuid c5241646-e089-40a3-b197-60aff60ea075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.318 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.319 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Ensure instance console log exists: /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.320 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.320 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.320 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.415 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.475 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.part --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.477 187212 DEBUG nova.virt.images [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] 6e277715-617f-4e35-89c7-208beae9fd5c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.479 187212 DEBUG nova.privsep.utils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.480 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.part /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.677 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.part /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.converted" returned: 0 in 0.197s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.682 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.754 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.converted --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.756 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.774 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.806 187212 DEBUG nova.network.neutron [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.807 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.809 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.814 187212 WARNING nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.818 187212 DEBUG nova.virt.libvirt.host [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.819 187212 DEBUG nova.virt.libvirt.host [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.823 187212 DEBUG nova.virt.libvirt.host [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.824 187212 DEBUG nova.virt.libvirt.host [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.825 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.825 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.826 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.826 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.826 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.826 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.827 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.827 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.827 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.828 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.828 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.828 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.834 187212 DEBUG nova.objects.instance [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lazy-loading 'pci_devices' on Instance uuid c5241646-e089-40a3-b197-60aff60ea075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.848 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.849 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.850 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.860 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.880 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <uuid>c5241646-e089-40a3-b197-60aff60ea075</uuid>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <name>instance-00000014</name>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerDiagnosticsNegativeTest-server-1685795102</nova:name>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:50</nova:creationTime>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:50 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:50 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:50 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:50 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:50 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:50 compute-0 nova_compute[187208]:         <nova:user uuid="32b963f457f74f00ad4c8ac7fa298e83">tempest-ServerDiagnosticsNegativeTest-546957392-project-member</nova:user>
Dec 05 12:00:50 compute-0 nova_compute[187208]:         <nova:project uuid="38d566f1d23b4fccb2a68f0a7aa78d72">tempest-ServerDiagnosticsNegativeTest-546957392</nova:project>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <entry name="serial">c5241646-e089-40a3-b197-60aff60ea075</entry>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <entry name="uuid">c5241646-e089-40a3-b197-60aff60ea075</entry>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.config"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/console.log" append="off"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:50 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:50 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:50 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:50 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:50 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.920 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.921 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.949 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.950 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.951 187212 INFO nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Using config drive
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.960 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.961 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:50 compute-0 nova_compute[187208]: 2025-12-05 12:00:50.961 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.018 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.019 187212 DEBUG nova.virt.disk.api [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Checking if we can resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.020 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.076 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.078 187212 DEBUG nova.virt.disk.api [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Cannot resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.078 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.079 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Ensure instance console log exists: /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.079 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.080 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.080 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.082 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.087 187212 WARNING nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.091 187212 DEBUG nova.virt.libvirt.host [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.092 187212 DEBUG nova.virt.libvirt.host [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.095 187212 DEBUG nova.virt.libvirt.host [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.095 187212 DEBUG nova.virt.libvirt.host [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.095 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.096 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.096 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.096 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.097 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.097 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.097 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.098 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.098 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.098 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.099 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.099 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.099 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.435 187212 DEBUG oslo_concurrency.lockutils [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] Acquiring lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.436 187212 DEBUG oslo_concurrency.lockutils [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] Acquired lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.436 187212 DEBUG nova.network.neutron [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.455 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <uuid>52d63666-4caa-4eaa-9128-6e21189b0932</uuid>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <name>instance-00000012</name>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdmin275Test-server-1823558123</nova:name>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:51</nova:creationTime>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:51 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:51 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:51 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:51 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:51 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:51 compute-0 nova_compute[187208]:         <nova:user uuid="3a90749503e34bda87974b2c22626de0">tempest-ServersAdmin275Test-1624449796-project-member</nova:user>
Dec 05 12:00:51 compute-0 nova_compute[187208]:         <nova:project uuid="6d28e47b844b47238fb8386dae6c546e">tempest-ServersAdmin275Test-1624449796</nova:project>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <entry name="serial">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <entry name="uuid">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log" append="off"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:51 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:51 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:51 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:51 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:51 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.732 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.733 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.734 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Using config drive
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.741 187212 INFO nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Creating config drive at /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.config
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.747 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfmxcsxbz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.766 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.852 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'keypairs' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:51 compute-0 nova_compute[187208]: 2025-12-05 12:00:51.871 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfmxcsxbz" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:51 compute-0 systemd-machined[153543]: New machine qemu-20-instance-00000014.
Dec 05 12:00:51 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-00000014.
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.244 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.424 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936052.4239159, c5241646-e089-40a3-b197-60aff60ea075 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.425 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] VM Resumed (Lifecycle Event)
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.428 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.428 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.434 187212 INFO nova.virt.libvirt.driver [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance spawned successfully.
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.434 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.464 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.469 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.470 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.471 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.471 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.472 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.472 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.477 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.505 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.506 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936052.4248924, c5241646-e089-40a3-b197-60aff60ea075 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.507 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] VM Started (Lifecycle Event)
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.552 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.556 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.604 187212 INFO nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Took 2.60 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.605 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.780 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating config drive at /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.788 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5putlyy9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.814 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.829 187212 INFO nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Took 4.07 seconds to build instance.
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.919 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5putlyy9" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:52 compute-0 nova_compute[187208]: 2025-12-05 12:00:52.986 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:53 compute-0 systemd-machined[153543]: New machine qemu-21-instance-00000012.
Dec 05 12:00:53 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-00000012.
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.320 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 52d63666-4caa-4eaa-9128-6e21189b0932 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.322 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936053.3198972, 52d63666-4caa-4eaa-9128-6e21189b0932 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.322 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Resumed (Lifecycle Event)
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.324 187212 DEBUG nova.compute.manager [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.325 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.327 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance spawned successfully.
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.328 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.579 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.591 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.594 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.595 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.595 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.595 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.596 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.596 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.663 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.663 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936053.3206246, 52d63666-4caa-4eaa-9128-6e21189b0932 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.663 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Started (Lifecycle Event)
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.685 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.688 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.886 187212 DEBUG nova.compute.manager [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:53 compute-0 nova_compute[187208]: 2025-12-05 12:00:53.891 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.042 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.042 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.043 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.195 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:54 compute-0 podman[216075]: 2025-12-05 12:00:54.275639236 +0000 UTC m=+0.070275035 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:00:54 compute-0 podman[216076]: 2025-12-05 12:00:54.306063933 +0000 UTC m=+0.097971994 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.786 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "897abc63-6217-4009-a547-8799c4621feb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.786 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.812 187212 DEBUG nova.network.neutron [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updating instance_info_cache with network_info: [{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.878 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936039.8766558, 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.879 187212 INFO nova.compute.manager [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] VM Stopped (Lifecycle Event)
Dec 05 12:00:54 compute-0 nova_compute[187208]: 2025-12-05 12:00:54.925 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.097 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.131 187212 DEBUG nova.compute.manager [None req-d47e53a1-6326-4c96-bfdf-45a8447c3ec7 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.136 187212 DEBUG oslo_concurrency.lockutils [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] Releasing lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.136 187212 DEBUG nova.compute.manager [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.136 187212 DEBUG nova.compute.manager [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] network_info to inject: |[{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.154 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "c5241646-e089-40a3-b197-60aff60ea075" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.155 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.155 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "c5241646-e089-40a3-b197-60aff60ea075-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.155 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.156 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.157 187212 INFO nova.compute.manager [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Terminating instance
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.157 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "refresh_cache-c5241646-e089-40a3-b197-60aff60ea075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.158 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquired lock "refresh_cache-c5241646-e089-40a3-b197-60aff60ea075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.158 187212 DEBUG nova.network.neutron [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.189 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.190 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.196 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.196 187212 INFO nova.compute.claims [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.522 187212 DEBUG nova.network.neutron [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.701 187212 DEBUG nova.compute.provider_tree [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.718 187212 DEBUG nova.scheduler.client.report [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.747 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.748 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.796 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.809 187212 INFO nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.824 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.901 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.902 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.903 187212 INFO nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Creating image(s)
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.904 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.904 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.905 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:55 compute-0 nova_compute[187208]: 2025-12-05 12:00:55.921 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.010 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.011 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.012 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.029 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.098 187212 DEBUG nova.network.neutron [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.102 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.103 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.128 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Releasing lock "refresh_cache-c5241646-e089-40a3-b197-60aff60ea075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.129 187212 DEBUG nova.compute.manager [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.157 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.158 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.159 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:56 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000014.scope: Deactivated successfully.
Dec 05 12:00:56 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000014.scope: Consumed 4.176s CPU time.
Dec 05 12:00:56 compute-0 systemd-machined[153543]: Machine qemu-20-instance-00000014 terminated.
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.248 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.250 187212 DEBUG nova.virt.disk.api [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Checking if we can resize image /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.251 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.311 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.313 187212 DEBUG nova.virt.disk.api [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Cannot resize image /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.313 187212 DEBUG nova.objects.instance [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lazy-loading 'migration_context' on Instance uuid 897abc63-6217-4009-a547-8799c4621feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.327 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.327 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Ensure instance console log exists: /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.328 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.328 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.328 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.330 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.336 187212 WARNING nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.356 187212 DEBUG nova.virt.libvirt.host [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.358 187212 DEBUG nova.virt.libvirt.host [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.371 187212 DEBUG nova.virt.libvirt.host [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.373 187212 DEBUG nova.virt.libvirt.host [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.373 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.373 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.374 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.374 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.375 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.375 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.375 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.376 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.376 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.376 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.377 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.377 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.382 187212 DEBUG nova.objects.instance [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lazy-loading 'pci_devices' on Instance uuid 897abc63-6217-4009-a547-8799c4621feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.387 187212 INFO nova.virt.libvirt.driver [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance destroyed successfully.
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.387 187212 DEBUG nova.objects.instance [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lazy-loading 'resources' on Instance uuid c5241646-e089-40a3-b197-60aff60ea075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.414 187212 INFO nova.virt.libvirt.driver [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Deleting instance files /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075_del
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.415 187212 INFO nova.virt.libvirt.driver [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Deletion of /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075_del complete
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.420 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <uuid>897abc63-6217-4009-a547-8799c4621feb</uuid>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <name>instance-00000015</name>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerDiagnosticsV248Test-server-1605716829</nova:name>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:00:56</nova:creationTime>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:00:56 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:00:56 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:00:56 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:00:56 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:00:56 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:00:56 compute-0 nova_compute[187208]:         <nova:user uuid="3777f30c4e2e4644912c2ef76a3ea2c0">tempest-ServerDiagnosticsV248Test-494221988-project-member</nova:user>
Dec 05 12:00:56 compute-0 nova_compute[187208]:         <nova:project uuid="7a8c57ca06ea434e98ac6900d68e5c27">tempest-ServerDiagnosticsV248Test-494221988</nova:project>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <system>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <entry name="serial">897abc63-6217-4009-a547-8799c4621feb</entry>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <entry name="uuid">897abc63-6217-4009-a547-8799c4621feb</entry>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     </system>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <os>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   </os>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <features>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   </features>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.config"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/console.log" append="off"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <video>
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     </video>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:00:56 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:00:56 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:00:56 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:00:56 compute-0 nova_compute[187208]: </domain>
Dec 05 12:00:56 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.475 187212 INFO nova.compute.manager [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.476 187212 DEBUG oslo.service.loopingcall [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.477 187212 DEBUG nova.compute.manager [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.477 187212 DEBUG nova.network.neutron [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.486 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.486 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.487 187212 INFO nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Using config drive
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.890 187212 INFO nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Creating config drive at /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.config
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.899 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52v8narm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.954 187212 DEBUG nova.network.neutron [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.976 187212 DEBUG nova.network.neutron [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:00:56 compute-0 nova_compute[187208]: 2025-12-05 12:00:56.992 187212 INFO nova.compute.manager [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Took 0.51 seconds to deallocate network for instance.
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.023 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52v8narm" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.042 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.043 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:00:57 compute-0 systemd-machined[153543]: New machine qemu-22-instance-00000015.
Dec 05 12:00:57 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-00000015.
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.104 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936042.1026633, caa6c7c3-7eb3-4636-a7ad-7b605ef393ba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.105 187212 INFO nova.compute.manager [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] VM Stopped (Lifecycle Event)
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.106 187212 INFO nova.compute.manager [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Rebuilding instance
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.133 187212 DEBUG nova.compute.manager [None req-034d3c6e-a4fd-4486-9db6-ec82f682dea7 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.199 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.248 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.283 187212 DEBUG nova.compute.provider_tree [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.297 187212 DEBUG nova.scheduler.client.report [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.326 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.346 187212 INFO nova.scheduler.client.report [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Deleted allocations for instance c5241646-e089-40a3-b197-60aff60ea075
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.366 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.393 187212 DEBUG nova.compute.manager [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.430 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.448 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'pci_requests' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.462 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.474 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'resources' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.484 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'migration_context' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.496 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.502 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.531 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936057.5310698, 897abc63-6217-4009-a547-8799c4621feb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.531 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] VM Resumed (Lifecycle Event)
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.534 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.535 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.538 187212 INFO nova.virt.libvirt.driver [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance spawned successfully.
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.539 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.553 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.556 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.563 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.563 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.563 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.564 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.564 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.565 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.573 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.573 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936057.5338352, 897abc63-6217-4009-a547-8799c4621feb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.573 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] VM Started (Lifecycle Event)
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.594 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.600 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.621 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.632 187212 INFO nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Took 1.73 seconds to spawn the instance on the hypervisor.
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.632 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.686 187212 INFO nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Took 2.52 seconds to build instance.
Dec 05 12:00:57 compute-0 nova_compute[187208]: 2025-12-05 12:00:57.701 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.915s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:00:59 compute-0 podman[216172]: 2025-12-05 12:00:59.208095127 +0000 UTC m=+0.062777791 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 05 12:00:59 compute-0 nova_compute[187208]: 2025-12-05 12:00:59.739 187212 DEBUG nova.compute.manager [None req-5db75d0d-2d1b-4d3f-b9ea-3a193a47a305 a19dd465f6924cd8a015d5ec028d2e21 4ee519a917b34232836644d7ed32c09c - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:00:59 compute-0 nova_compute[187208]: 2025-12-05 12:00:59.742 187212 INFO nova.compute.manager [None req-5db75d0d-2d1b-4d3f-b9ea-3a193a47a305 a19dd465f6924cd8a015d5ec028d2e21 4ee519a917b34232836644d7ed32c09c - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Retrieving diagnostics
Dec 05 12:00:59 compute-0 nova_compute[187208]: 2025-12-05 12:00:59.966 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:00 compute-0 nova_compute[187208]: 2025-12-05 12:01:00.458 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.125 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.125 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.125 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.125 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.126 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.127 187212 INFO nova.compute.manager [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Terminating instance
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.127 187212 DEBUG nova.compute.manager [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:01:01 compute-0 kernel: tape56fa29b-45 (unregistering): left promiscuous mode
Dec 05 12:01:01 compute-0 NetworkManager[55691]: <info>  [1764936061.1488] device (tape56fa29b-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:01 compute-0 ovn_controller[95610]: 2025-12-05T12:01:01Z|00098|binding|INFO|Releasing lport e56fa29b-453e-4140-997d-96c0de8ed4bb from this chassis (sb_readonly=0)
Dec 05 12:01:01 compute-0 ovn_controller[95610]: 2025-12-05T12:01:01Z|00099|binding|INFO|Setting lport e56fa29b-453e-4140-997d-96c0de8ed4bb down in Southbound
Dec 05 12:01:01 compute-0 ovn_controller[95610]: 2025-12-05T12:01:01Z|00100|binding|INFO|Removing iface tape56fa29b-45 ovn-installed in OVS
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.173 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:62:a3 10.100.0.3'], port_security=['fa:16:3e:37:62:a3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a211e57445104139baeb5ca8fa933c58', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4dad292-a18a-4c80-b443-fe4ecc60c1b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.217'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eb759a3-016c-413a-81bd-572c3bccb661, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e56fa29b-453e-4140-997d-96c0de8ed4bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.175 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e56fa29b-453e-4140-997d-96c0de8ed4bb in datapath 16e72b69-f48e-48c4-b5b8-b2731e24f397 unbound from our chassis
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.176 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16e72b69-f48e-48c4-b5b8-b2731e24f397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.178 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bead7b02-99c3-4c50-8a7c-8cde22c671ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.179 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 namespace which is not needed anymore
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.190 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:01 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000011.scope: Deactivated successfully.
Dec 05 12:01:01 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000011.scope: Consumed 13.056s CPU time.
Dec 05 12:01:01 compute-0 systemd-machined[153543]: Machine qemu-18-instance-00000011 terminated.
Dec 05 12:01:01 compute-0 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [NOTICE]   (215613) : haproxy version is 2.8.14-c23fe91
Dec 05 12:01:01 compute-0 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [NOTICE]   (215613) : path to executable is /usr/sbin/haproxy
Dec 05 12:01:01 compute-0 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [WARNING]  (215613) : Exiting Master process...
Dec 05 12:01:01 compute-0 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [ALERT]    (215613) : Current worker (215615) exited with code 143 (Terminated)
Dec 05 12:01:01 compute-0 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [WARNING]  (215613) : All workers exited. Exiting... (0)
Dec 05 12:01:01 compute-0 systemd[1]: libpod-7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8.scope: Deactivated successfully.
Dec 05 12:01:01 compute-0 podman[216233]: 2025-12-05 12:01:01.323960147 +0000 UTC m=+0.046388894 container died 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 12:01:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8-userdata-shm.mount: Deactivated successfully.
Dec 05 12:01:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-e5b1d7643321fc89fc61c887a583f48ca73e0a371a4b5fe52022732576250580-merged.mount: Deactivated successfully.
Dec 05 12:01:01 compute-0 podman[216233]: 2025-12-05 12:01:01.420139731 +0000 UTC m=+0.142568478 container cleanup 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec 05 12:01:01 compute-0 systemd[1]: libpod-conmon-7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8.scope: Deactivated successfully.
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.433 187212 INFO nova.virt.libvirt.driver [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance destroyed successfully.
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.434 187212 DEBUG nova.objects.instance [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lazy-loading 'resources' on Instance uuid bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.455 187212 DEBUG nova.virt.libvirt.vif [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:00:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2038537603',display_name='tempest-ServersTestJSON-server-2038537603',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-2038537603',id=17,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMtzxCx5WYGaho1uT1vhlFLxIivbdWss7SksmXXR8og/kbnuLPZgB17Trvp/z6Y5aD5/yAlqaXubyiqNS0bESVauUglSuMwk6CT9qVsDlZeY1DXt7lCJ98WxGxUuXIYIrA==',key_name='tempest-keypair-1060401215',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a211e57445104139baeb5ca8fa933c58',ramdisk_id='',reservation_id='r-059q99bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-2138545093',owner_user_name='tempest-ServersTestJSON-2138545093-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4aa579f9c54f43039ef96c870ed5e049',uuid=bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.455 187212 DEBUG nova.network.os_vif_util [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converting VIF {"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.456 187212 DEBUG nova.network.os_vif_util [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.456 187212 DEBUG os_vif [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.459 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.459 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape56fa29b-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.462 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.465 187212 INFO os_vif [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45')
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.466 187212 INFO nova.virt.libvirt.driver [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Deleting instance files /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d_del
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.466 187212 INFO nova.virt.libvirt.driver [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Deletion of /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d_del complete
Dec 05 12:01:01 compute-0 podman[216276]: 2025-12-05 12:01:01.481962762 +0000 UTC m=+0.042340887 container remove 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.487 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b462d86b-11d9-4ff0-ba68-512c44c360fe]: (4, ('Fri Dec  5 12:01:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 (7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8)\n7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8\nFri Dec  5 12:01:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 (7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8)\n7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.491 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[450000b6-c77a-47e6-ab29-6f71b2bfc3b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.492 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16e72b69-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.494 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:01 compute-0 kernel: tap16e72b69-f0: left promiscuous mode
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.507 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.510 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b968a65e-a087-4c5b-9294-376766b70d87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.522 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[07bac4f8-3eec-414e-aa18-3df6e6f82d30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.525 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[573d11c9-00b5-4614-baaa-8cc7a16be646]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.531 187212 INFO nova.compute.manager [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.532 187212 DEBUG oslo.service.loopingcall [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.532 187212 DEBUG nova.compute.manager [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:01:01 compute-0 nova_compute[187208]: 2025-12-05 12:01:01.532 187212 DEBUG nova.network.neutron [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.549 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2a02adf4-5bdf-4022-a6c8-3cee1c455ce1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341004, 'reachable_time': 27199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216290, 'error': None, 'target': 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.551 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:01:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.551 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7614eaeb-51a2-42ea-a94d-7055619c5978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d16e72b69\x2df48e\x2d48c4\x2db5b8\x2db2731e24f397.mount: Deactivated successfully.
Dec 05 12:01:01 compute-0 CROND[216292]: (root) CMD (run-parts /etc/cron.hourly)
Dec 05 12:01:01 compute-0 run-parts[216299]: (/etc/cron.hourly) starting 0anacron
Dec 05 12:01:01 compute-0 anacron[216307]: Anacron started on 2025-12-05
Dec 05 12:01:01 compute-0 anacron[216307]: Will run job `cron.daily' in 23 min.
Dec 05 12:01:01 compute-0 anacron[216307]: Will run job `cron.weekly' in 43 min.
Dec 05 12:01:01 compute-0 anacron[216307]: Will run job `cron.monthly' in 63 min.
Dec 05 12:01:01 compute-0 anacron[216307]: Jobs will be executed sequentially
Dec 05 12:01:01 compute-0 run-parts[216309]: (/etc/cron.hourly) finished 0anacron
Dec 05 12:01:01 compute-0 CROND[216291]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 05 12:01:02 compute-0 nova_compute[187208]: 2025-12-05 12:01:02.297 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:02 compute-0 ovn_controller[95610]: 2025-12-05T12:01:02Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:e2:12 10.100.0.10
Dec 05 12:01:02 compute-0 ovn_controller[95610]: 2025-12-05T12:01:02Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:e2:12 10.100.0.10
Dec 05 12:01:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:03.008 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:03.008 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:03.009 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:03 compute-0 nova_compute[187208]: 2025-12-05 12:01:03.443 187212 INFO nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Rebuilding instance
Dec 05 12:01:03 compute-0 nova_compute[187208]: 2025-12-05 12:01:03.763 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:03 compute-0 nova_compute[187208]: 2025-12-05 12:01:03.856 187212 DEBUG nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.051 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_requests' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.067 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.081 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.100 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.114 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.120 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.170 187212 DEBUG nova.network.neutron [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.201 187212 INFO nova.compute.manager [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Took 2.67 seconds to deallocate network for instance.
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.281 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.282 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.478 187212 DEBUG nova.compute.provider_tree [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.492 187212 DEBUG nova.scheduler.client.report [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.520 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.545 187212 INFO nova.scheduler.client.report [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Deleted allocations for instance bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.622 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.497s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:04 compute-0 nova_compute[187208]: 2025-12-05 12:01:04.653 187212 DEBUG nova.compute.manager [req-fb136fdd-ec7a-43f9-907e-0b87babf8067 req-b80dbebe-e076-4d08-b51e-6b9db3c7f726 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-vif-deleted-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:06 compute-0 podman[216346]: 2025-12-05 12:01:06.238911398 +0000 UTC m=+0.087314430 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:01:06 compute-0 kernel: tap380c99a7-94 (unregistering): left promiscuous mode
Dec 05 12:01:06 compute-0 NetworkManager[55691]: <info>  [1764936066.3087] device (tap380c99a7-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:01:06 compute-0 ovn_controller[95610]: 2025-12-05T12:01:06Z|00101|binding|INFO|Releasing lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d from this chassis (sb_readonly=0)
Dec 05 12:01:06 compute-0 ovn_controller[95610]: 2025-12-05T12:01:06Z|00102|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d down in Southbound
Dec 05 12:01:06 compute-0 ovn_controller[95610]: 2025-12-05T12:01:06Z|00103|binding|INFO|Removing iface tap380c99a7-94 ovn-installed in OVS
Dec 05 12:01:06 compute-0 nova_compute[187208]: 2025-12-05 12:01:06.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:06 compute-0 nova_compute[187208]: 2025-12-05 12:01:06.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.328 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.330 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.333 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:01:06 compute-0 nova_compute[187208]: 2025-12-05 12:01:06.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.352 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[94e0bff3-72ea-41ff-8b0d-2b6c0fd2e0fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:06 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec 05 12:01:06 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 15.656s CPU time.
Dec 05 12:01:06 compute-0 systemd-machined[153543]: Machine qemu-12-instance-0000000c terminated.
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.383 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[de5058d8-f1d0-49b5-ad1d-4ac753379db3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.387 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[54aaef84-d6f5-4187-add2-e267028817e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.409 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a14ea123-b513-44e1-9b35-d8fd009fe450]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.427 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[25d2f3b1-c4c0-45ab-b31c-76b0fa00e5e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216380, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.444 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7165c9-acbf-433f-a4dd-1e1ded287666]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216381, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216381, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.447 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:06 compute-0 nova_compute[187208]: 2025-12-05 12:01:06.450 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:06 compute-0 nova_compute[187208]: 2025-12-05 12:01:06.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.455 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.456 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.457 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.458 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:06 compute-0 nova_compute[187208]: 2025-12-05 12:01:06.461 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.145 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance shutdown successfully after 3 seconds.
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.155 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.162 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.164 187212 DEBUG nova.virt.libvirt.vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:01Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.165 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.166 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.168 187212 DEBUG os_vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.172 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap380c99a7-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.174 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.183 187212 INFO os_vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.184 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deleting instance files /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.186 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deletion of /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del complete
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.223 187212 DEBUG nova.compute.manager [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.224 187212 DEBUG oslo_concurrency.lockutils [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.224 187212 DEBUG oslo_concurrency.lockutils [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.225 187212 DEBUG oslo_concurrency.lockutils [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.225 187212 DEBUG nova.compute.manager [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.225 187212 WARNING nova.compute.manager [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state error and task_state rebuilding.
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.301 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.412 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.413 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating image(s)
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.414 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.415 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.416 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.439 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.494 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.496 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.497 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.513 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.547 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.567 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.568 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.601 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.603 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.603 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.669 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.671 187212 DEBUG nova.virt.disk.api [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.671 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.715 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "ab127619-9b81-4800-a347-5747dd062e5e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.716 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.737 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.743 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.744 187212 DEBUG nova.virt.disk.api [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.745 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.745 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Ensure instance console log exists: /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.746 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.746 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.746 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.749 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start _get_guest_xml network_info=[{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.753 187212 WARNING nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.757 187212 DEBUG nova.virt.libvirt.host [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.758 187212 DEBUG nova.virt.libvirt.host [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.761 187212 DEBUG nova.virt.libvirt.host [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.761 187212 DEBUG nova.virt.libvirt.host [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.762 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.762 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.762 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.763 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.763 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.763 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.764 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.764 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.764 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.764 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.765 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.765 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.765 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.795 187212 DEBUG nova.virt.libvirt.vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:07Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.796 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.796 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.798 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <uuid>982a8e69-5181-4847-bdfe-8d4de12bb2e4</uuid>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <name>instance-0000000c</name>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdminTestJSON-server-1785289561</nova:name>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:01:07</nova:creationTime>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:01:07 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:01:07 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:01:07 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:01:07 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:01:07 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:01:07 compute-0 nova_compute[187208]:         <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec 05 12:01:07 compute-0 nova_compute[187208]:         <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:01:07 compute-0 nova_compute[187208]:         <nova:port uuid="380c99a7-9480-45f8-b2f4-adfcdfa8576d">
Dec 05 12:01:07 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <system>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <entry name="serial">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <entry name="uuid">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     </system>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <os>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   </os>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <features>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   </features>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:24:4f:38"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <target dev="tap380c99a7-94"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log" append="off"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <video>
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     </video>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:01:07 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:01:07 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:01:07 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:01:07 compute-0 nova_compute[187208]: </domain>
Dec 05 12:01:07 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.803 187212 DEBUG nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Preparing to wait for external event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.803 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.803 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.803 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.804 187212 DEBUG nova.virt.libvirt.vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:07Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.805 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.805 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.806 187212 DEBUG os_vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.807 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.808 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.814 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.814 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap380c99a7-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.815 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap380c99a7-94, col_values=(('external_ids', {'iface-id': '380c99a7-9480-45f8-b2f4-adfcdfa8576d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:4f:38', 'vm-uuid': '982a8e69-5181-4847-bdfe-8d4de12bb2e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:07 compute-0 NetworkManager[55691]: <info>  [1764936067.8199] manager: (tap380c99a7-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.825 187212 INFO os_vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.842 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.842 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.848 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.848 187212 INFO nova.compute.claims [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.916 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.916 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.917 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:24:4f:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.917 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Using config drive
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.938 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:07 compute-0 nova_compute[187208]: 2025-12-05 12:01:07.964 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'keypairs' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.097 187212 DEBUG nova.compute.provider_tree [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.112 187212 DEBUG nova.scheduler.client.report [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.133 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.133 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.194 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.195 187212 DEBUG nova.network.neutron [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.214 187212 INFO nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.233 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.334 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.337 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.337 187212 INFO nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Creating image(s)
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.338 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.338 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.339 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.351 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.425 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.426 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.427 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.444 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.502 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.503 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.623 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk 1073741824" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.625 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.625 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.681 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.682 187212 DEBUG nova.virt.disk.api [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Checking if we can resize image /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.683 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.738 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.739 187212 DEBUG nova.virt.disk.api [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Cannot resize image /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.739 187212 DEBUG nova.objects.instance [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lazy-loading 'migration_context' on Instance uuid ab127619-9b81-4800-a347-5747dd062e5e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.756 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.757 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Ensure instance console log exists: /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.757 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.758 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.758 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.784 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating config drive at /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.790 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc0_8e_8p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.874 187212 DEBUG nova.network.neutron [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.875 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.876 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.880 187212 WARNING nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.884 187212 DEBUG nova.virt.libvirt.host [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.884 187212 DEBUG nova.virt.libvirt.host [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.887 187212 DEBUG nova.virt.libvirt.host [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.887 187212 DEBUG nova.virt.libvirt.host [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.888 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.888 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.888 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.890 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.890 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.890 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.891 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.895 187212 DEBUG nova.objects.instance [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lazy-loading 'pci_devices' on Instance uuid ab127619-9b81-4800-a347-5747dd062e5e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.913 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc0_8e_8p" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.918 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <uuid>ab127619-9b81-4800-a347-5747dd062e5e</uuid>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <name>instance-00000016</name>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <nova:name>tempest-TenantUsagesTestJSON-server-1464755080</nova:name>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:01:08</nova:creationTime>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:01:08 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:01:08 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:01:08 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:01:08 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:01:08 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:01:08 compute-0 nova_compute[187208]:         <nova:user uuid="4f0702685eba47f39b88602e4d1f00cc">tempest-TenantUsagesTestJSON-1778758271-project-member</nova:user>
Dec 05 12:01:08 compute-0 nova_compute[187208]:         <nova:project uuid="02988f772510450db9a7b8c5bd4b0dc7">tempest-TenantUsagesTestJSON-1778758271</nova:project>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <system>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <entry name="serial">ab127619-9b81-4800-a347-5747dd062e5e</entry>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <entry name="uuid">ab127619-9b81-4800-a347-5747dd062e5e</entry>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     </system>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <os>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   </os>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <features>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   </features>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.config"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/console.log" append="off"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <video>
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     </video>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:01:08 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:01:08 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:01:08 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:01:08 compute-0 nova_compute[187208]: </domain>
Dec 05 12:01:08 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.991 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.992 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:08 compute-0 nova_compute[187208]: 2025-12-05 12:01:08.992 187212 INFO nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Using config drive
Dec 05 12:01:08 compute-0 kernel: tap380c99a7-94: entered promiscuous mode
Dec 05 12:01:08 compute-0 systemd-udevd[216373]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:01:08 compute-0 NetworkManager[55691]: <info>  [1764936068.9994] manager: (tap380c99a7-94): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Dec 05 12:01:09 compute-0 NetworkManager[55691]: <info>  [1764936069.0457] device (tap380c99a7-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:01:09 compute-0 NetworkManager[55691]: <info>  [1764936069.0467] device (tap380c99a7-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:01:09 compute-0 ovn_controller[95610]: 2025-12-05T12:01:09Z|00104|binding|INFO|Claiming lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d for this chassis.
Dec 05 12:01:09 compute-0 ovn_controller[95610]: 2025-12-05T12:01:09Z|00105|binding|INFO|380c99a7-9480-45f8-b2f4-adfcdfa8576d: Claiming fa:16:3e:24:4f:38 10.100.0.13
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.048 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.057 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.057 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.059 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:01:09 compute-0 ovn_controller[95610]: 2025-12-05T12:01:09Z|00106|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d ovn-installed in OVS
Dec 05 12:01:09 compute-0 ovn_controller[95610]: 2025-12-05T12:01:09Z|00107|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d up in Southbound
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.065 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.076 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0d6adbce-de82-4639-a534-48f96f8a7174]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:09 compute-0 systemd-machined[153543]: New machine qemu-23-instance-0000000c.
Dec 05 12:01:09 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000000c.
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.108 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a499ca-d8fe-449f-be28-617e9ac36ae9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.111 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6491c5c4-8708-4ef2-b5a6-615c26953b04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.134 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b8737174-48f8-42a7-8280-8792e05caf29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.162 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[40b3a960-9c87-4d74-8828-28ce10ca33da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 13, 'rx_bytes': 868, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 13, 'rx_bytes': 868, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216480, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.176 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72426923-11e8-4384-a2c8-8b5f31c6f734]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216481, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216481, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.177 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.180 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.180 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.180 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.180 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.352 187212 DEBUG nova.compute.manager [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.352 187212 DEBUG oslo_concurrency.lockutils [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.352 187212 DEBUG oslo_concurrency.lockutils [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.352 187212 DEBUG oslo_concurrency.lockutils [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.353 187212 DEBUG nova.compute.manager [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Processing event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:01:09 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec 05 12:01:09 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000012.scope: Consumed 12.409s CPU time.
Dec 05 12:01:09 compute-0 systemd-machined[153543]: Machine qemu-21-instance-00000012 terminated.
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.782 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 982a8e69-5181-4847-bdfe-8d4de12bb2e4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.782 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936069.7814522, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.782 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Started (Lifecycle Event)
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.784 187212 DEBUG nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.787 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.792 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance spawned successfully.
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.792 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.800 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.803 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.809 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.809 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.810 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.810 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.810 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.811 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.830 187212 INFO nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Creating config drive at /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.config
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.835 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpme1xpuld execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.853 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.854 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936069.7824657, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.854 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Paused (Lifecycle Event)
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.876 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.880 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936069.7875993, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.880 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Resumed (Lifecycle Event)
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.889 187212 DEBUG nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.898 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.900 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.926 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.929 187212 DEBUG nova.compute.manager [None req-34cd94ef-e927-48aa-b044-25943cc30bf6 a19dd465f6924cd8a015d5ec028d2e21 4ee519a917b34232836644d7ed32c09c - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.942 187212 INFO nova.compute.manager [None req-34cd94ef-e927-48aa-b044-25943cc30bf6 a19dd465f6924cd8a015d5ec028d2e21 4ee519a917b34232836644d7ed32c09c - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Retrieving diagnostics
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.952 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.953 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.953 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:01:09 compute-0 nova_compute[187208]: 2025-12-05 12:01:09.960 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpme1xpuld" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:10 compute-0 systemd-machined[153543]: New machine qemu-24-instance-00000016.
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.028 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:10 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000016.
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.173 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "897abc63-6217-4009-a547-8799c4621feb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.174 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.174 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "897abc63-6217-4009-a547-8799c4621feb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.174 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.175 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.176 187212 INFO nova.compute.manager [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Terminating instance
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.177 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "refresh_cache-897abc63-6217-4009-a547-8799c4621feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.177 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquired lock "refresh_cache-897abc63-6217-4009-a547-8799c4621feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.177 187212 DEBUG nova.network.neutron [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.368 187212 DEBUG nova.network.neutron [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.443 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.563 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance shutdown successfully after 13 seconds.
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.570 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.576 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.577 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deleting instance files /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.577 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deletion of /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del complete
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.735 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.736 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating image(s)
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.736 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Acquiring lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.737 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.737 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.753 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.827 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.830 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.831 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.854 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.877 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936070.8656435, ab127619-9b81-4800-a347-5747dd062e5e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.878 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] VM Resumed (Lifecycle Event)
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.884 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.885 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.888 187212 DEBUG nova.network.neutron [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.895 187212 INFO nova.virt.libvirt.driver [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance spawned successfully.
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.896 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.911 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.914 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Releasing lock "refresh_cache-897abc63-6217-4009-a547-8799c4621feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.914 187212 DEBUG nova.compute.manager [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.919 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.921 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.943 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.954 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.955 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.956 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.957 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.958 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.959 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.964 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.966 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.967 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:10 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000015.scope: Deactivated successfully.
Dec 05 12:01:10 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000015.scope: Consumed 12.136s CPU time.
Dec 05 12:01:10 compute-0 systemd-machined[153543]: Machine qemu-22-instance-00000015 terminated.
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.996 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.997 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936070.8686862, ab127619-9b81-4800-a347-5747dd062e5e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:10 compute-0 nova_compute[187208]: 2025-12-05 12:01:10.997 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] VM Started (Lifecycle Event)
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.021 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.023 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.023 187212 DEBUG nova.virt.disk.api [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Checking if we can resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.023 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.048 187212 INFO nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Took 2.71 seconds to spawn the instance on the hypervisor.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.049 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.055 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.083 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.085 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.085 187212 DEBUG nova.virt.disk.api [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Cannot resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.086 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.086 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Ensure instance console log exists: /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.087 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.087 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.088 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.089 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.094 187212 WARNING nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.099 187212 DEBUG nova.virt.libvirt.host [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.099 187212 DEBUG nova.virt.libvirt.host [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.102 187212 DEBUG nova.virt.libvirt.host [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.103 187212 DEBUG nova.virt.libvirt.host [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.103 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.103 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.104 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.104 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.104 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.105 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.105 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.105 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.105 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.106 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.106 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.106 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.106 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.118 187212 INFO nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Took 3.31 seconds to build instance.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.123 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <uuid>52d63666-4caa-4eaa-9128-6e21189b0932</uuid>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <name>instance-00000012</name>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdmin275Test-server-1823558123</nova:name>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:01:11</nova:creationTime>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:01:11 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:01:11 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:01:11 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:01:11 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:01:11 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:01:11 compute-0 nova_compute[187208]:         <nova:user uuid="3a90749503e34bda87974b2c22626de0">tempest-ServersAdmin275Test-1624449796-project-member</nova:user>
Dec 05 12:01:11 compute-0 nova_compute[187208]:         <nova:project uuid="6d28e47b844b47238fb8386dae6c546e">tempest-ServersAdmin275Test-1624449796</nova:project>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <system>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <entry name="serial">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <entry name="uuid">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     </system>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <os>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   </os>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <features>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   </features>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log" append="off"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <video>
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     </video>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:01:11 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:01:11 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:01:11 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:01:11 compute-0 nova_compute[187208]: </domain>
Dec 05 12:01:11 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.139 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.198 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.198 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.199 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Using config drive
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.204 187212 INFO nova.virt.libvirt.driver [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance destroyed successfully.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.204 187212 DEBUG nova.objects.instance [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lazy-loading 'resources' on Instance uuid 897abc63-6217-4009-a547-8799c4621feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.232 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.236 187212 INFO nova.virt.libvirt.driver [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Deleting instance files /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb_del
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.237 187212 INFO nova.virt.libvirt.driver [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Deletion of /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb_del complete
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.263 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'keypairs' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.315 187212 INFO nova.compute.manager [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.315 187212 DEBUG oslo.service.loopingcall [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.316 187212 DEBUG nova.compute.manager [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.316 187212 DEBUG nova.network.neutron [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.384 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936056.380033, c5241646-e089-40a3-b197-60aff60ea075 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.385 187212 INFO nova.compute.manager [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] VM Stopped (Lifecycle Event)
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.437 187212 DEBUG nova.compute.manager [None req-eb546a3e-1030-4128-a0a9-8ba433345284 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.453 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating config drive at /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.458 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx7abbxh_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.596 187212 DEBUG nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.597 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.599 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.599 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.600 187212 DEBUG nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.600 187212 WARNING nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state None.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.601 187212 DEBUG nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.601 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.602 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.602 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.602 187212 DEBUG nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.603 187212 WARNING nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state None.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.604 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx7abbxh_" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:11 compute-0 systemd-machined[153543]: New machine qemu-25-instance-00000012.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.689 187212 DEBUG nova.network.neutron [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.701 187212 DEBUG nova.network.neutron [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:11 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000012.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.714 187212 INFO nova.compute.manager [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Took 0.40 seconds to deallocate network for instance.
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.766 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.766 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.953 187212 DEBUG nova.compute.provider_tree [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.967 187212 DEBUG nova.scheduler.client.report [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:11 compute-0 nova_compute[187208]: 2025-12-05 12:01:11.986 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.029 187212 INFO nova.scheduler.client.report [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Deleted allocations for instance 897abc63-6217-4009-a547-8799c4621feb
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.103 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:12 compute-0 ovn_controller[95610]: 2025-12-05T12:01:12Z|00108|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.183 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.362 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:12 compute-0 ovn_controller[95610]: 2025-12-05T12:01:12Z|00109|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.388 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 52d63666-4caa-4eaa-9128-6e21189b0932 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.389 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936072.3881116, 52d63666-4caa-4eaa-9128-6e21189b0932 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.389 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Resumed (Lifecycle Event)
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.392 187212 DEBUG nova.compute.manager [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.392 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.393 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.397 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance spawned successfully.
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.397 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.608 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.612 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.612 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.613 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.613 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.613 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.614 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.620 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.699 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.699 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936072.3907473, 52d63666-4caa-4eaa-9128-6e21189b0932 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.699 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Started (Lifecycle Event)
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.723 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.726 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.732 187212 DEBUG nova.compute.manager [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.754 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.783 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.784 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.784 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:12 compute-0 nova_compute[187208]: 2025-12-05 12:01:12.834 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.009 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "ab127619-9b81-4800-a347-5747dd062e5e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.010 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.010 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "ab127619-9b81-4800-a347-5747dd062e5e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.010 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.011 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.012 187212 INFO nova.compute.manager [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Terminating instance
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.013 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "refresh_cache-ab127619-9b81-4800-a347-5747dd062e5e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.013 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquired lock "refresh_cache-ab127619-9b81-4800-a347-5747dd062e5e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.013 187212 DEBUG nova.network.neutron [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:01:13 compute-0 nova_compute[187208]: 2025-12-05 12:01:13.213 187212 DEBUG nova.network.neutron [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.020 187212 DEBUG nova.network.neutron [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.049 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Releasing lock "refresh_cache-ab127619-9b81-4800-a347-5747dd062e5e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.049 187212 DEBUG nova.compute.manager [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:01:14 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec 05 12:01:14 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000016.scope: Consumed 3.982s CPU time.
Dec 05 12:01:14 compute-0 systemd-machined[153543]: Machine qemu-24-instance-00000016 terminated.
Dec 05 12:01:14 compute-0 podman[216578]: 2025-12-05 12:01:14.161642457 +0000 UTC m=+0.072263006 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.304 187212 INFO nova.virt.libvirt.driver [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance destroyed successfully.
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.305 187212 DEBUG nova.objects.instance [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lazy-loading 'resources' on Instance uuid ab127619-9b81-4800-a347-5747dd062e5e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.324 187212 INFO nova.virt.libvirt.driver [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Deleting instance files /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e_del
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.324 187212 INFO nova.virt.libvirt.driver [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Deletion of /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e_del complete
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.622 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "52d63666-4caa-4eaa-9128-6e21189b0932" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "52d63666-4caa-4eaa-9128-6e21189b0932-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.624 187212 INFO nova.compute.manager [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Terminating instance
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.625 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "refresh_cache-52d63666-4caa-4eaa-9128-6e21189b0932" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.626 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquired lock "refresh_cache-52d63666-4caa-4eaa-9128-6e21189b0932" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.627 187212 DEBUG nova.network.neutron [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.633 187212 INFO nova.compute.manager [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Took 0.58 seconds to destroy the instance on the hypervisor.
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.635 187212 DEBUG oslo.service.loopingcall [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.635 187212 DEBUG nova.compute.manager [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.636 187212 DEBUG nova.network.neutron [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.684 187212 INFO nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Rebuilding instance
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.847 187212 DEBUG nova.network.neutron [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.863 187212 DEBUG nova.network.neutron [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.871 187212 DEBUG nova.network.neutron [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.881 187212 INFO nova.compute.manager [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Took 0.25 seconds to deallocate network for instance.
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.920 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.920 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.929 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:14 compute-0 nova_compute[187208]: 2025-12-05 12:01:14.956 187212 DEBUG nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.002 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_requests' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.019 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.040 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.061 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.098 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.102 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.127 187212 DEBUG nova.compute.provider_tree [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.155 187212 DEBUG nova.scheduler.client.report [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.180 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.207 187212 INFO nova.scheduler.client.report [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Deleted allocations for instance ab127619-9b81-4800-a347-5747dd062e5e
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.311 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.544 187212 DEBUG nova.network.neutron [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.565 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Releasing lock "refresh_cache-52d63666-4caa-4eaa-9128-6e21189b0932" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.565 187212 DEBUG nova.compute.manager [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:01:15 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec 05 12:01:15 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000012.scope: Consumed 3.795s CPU time.
Dec 05 12:01:15 compute-0 systemd-machined[153543]: Machine qemu-25-instance-00000012 terminated.
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.816 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.817 187212 DEBUG nova.objects.instance [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'resources' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.833 187212 INFO nova.virt.libvirt.driver [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deleting instance files /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.834 187212 INFO nova.virt.libvirt.driver [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deletion of /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del complete
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.894 187212 INFO nova.compute.manager [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Took 0.33 seconds to destroy the instance on the hypervisor.
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.895 187212 DEBUG oslo.service.loopingcall [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.895 187212 DEBUG nova.compute.manager [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:01:15 compute-0 nova_compute[187208]: 2025-12-05 12:01:15.895 187212 DEBUG nova.network.neutron [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.182 187212 DEBUG nova.network.neutron [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.197 187212 DEBUG nova.network.neutron [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.219 187212 INFO nova.compute.manager [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Took 0.32 seconds to deallocate network for instance.
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.265 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.266 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.423 187212 DEBUG nova.compute.provider_tree [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.430 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936061.429443, bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.430 187212 INFO nova.compute.manager [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] VM Stopped (Lifecycle Event)
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.442 187212 DEBUG nova.scheduler.client.report [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.452 187212 DEBUG nova.compute.manager [None req-a9e6b9b7-cc2a-4a0c-a21b-2907dda973f3 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.468 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.498 187212 INFO nova.scheduler.client.report [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Deleted allocations for instance 52d63666-4caa-4eaa-9128-6e21189b0932
Dec 05 12:01:16 compute-0 nova_compute[187208]: 2025-12-05 12:01:16.566 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:17 compute-0 nova_compute[187208]: 2025-12-05 12:01:17.365 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:17 compute-0 nova_compute[187208]: 2025-12-05 12:01:17.820 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:19 compute-0 podman[216615]: 2025-12-05 12:01:19.205063033 +0000 UTC m=+0.054366355 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:01:20 compute-0 podman[216634]: 2025-12-05 12:01:20.214402537 +0000 UTC m=+0.059991325 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, name=ubi9-minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350)
Dec 05 12:01:22 compute-0 nova_compute[187208]: 2025-12-05 12:01:22.367 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:22 compute-0 ovn_controller[95610]: 2025-12-05T12:01:22Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:4f:38 10.100.0.13
Dec 05 12:01:22 compute-0 ovn_controller[95610]: 2025-12-05T12:01:22Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:4f:38 10.100.0.13
Dec 05 12:01:22 compute-0 nova_compute[187208]: 2025-12-05 12:01:22.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:24 compute-0 nova_compute[187208]: 2025-12-05 12:01:24.772 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:24 compute-0 nova_compute[187208]: 2025-12-05 12:01:24.772 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:24 compute-0 nova_compute[187208]: 2025-12-05 12:01:24.802 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:01:24 compute-0 nova_compute[187208]: 2025-12-05 12:01:24.879 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:24 compute-0 nova_compute[187208]: 2025-12-05 12:01:24.880 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:24 compute-0 nova_compute[187208]: 2025-12-05 12:01:24.885 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:01:24 compute-0 nova_compute[187208]: 2025-12-05 12:01:24.886 187212 INFO nova.compute.claims [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.044 187212 DEBUG nova.compute.provider_tree [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.065 187212 DEBUG nova.scheduler.client.report [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.097 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.098 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.146 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.163 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.164 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.180 187212 INFO nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.197 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:01:25 compute-0 podman[216666]: 2025-12-05 12:01:25.209875534 +0000 UTC m=+0.055912099 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 12:01:25 compute-0 podman[216667]: 2025-12-05 12:01:25.242387722 +0000 UTC m=+0.086865162 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.293 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.294 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.295 187212 INFO nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Creating image(s)
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.295 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.296 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.296 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.313 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.368 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.370 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.370 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.386 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.442 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.443 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.479 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.480 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.480 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.543 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.544 187212 DEBUG nova.virt.disk.api [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Checking if we can resize image /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.545 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.601 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.602 187212 DEBUG nova.virt.disk.api [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Cannot resize image /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.602 187212 DEBUG nova.objects.instance [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lazy-loading 'migration_context' on Instance uuid 1282e776-5758-493b-8f52-59839ebcd31b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.619 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.620 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Ensure instance console log exists: /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.620 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.620 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:25 compute-0 nova_compute[187208]: 2025-12-05 12:01:25.621 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:26 compute-0 nova_compute[187208]: 2025-12-05 12:01:26.202 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936071.200889, 897abc63-6217-4009-a547-8799c4621feb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:26 compute-0 nova_compute[187208]: 2025-12-05 12:01:26.202 187212 INFO nova.compute.manager [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] VM Stopped (Lifecycle Event)
Dec 05 12:01:26 compute-0 nova_compute[187208]: 2025-12-05 12:01:26.226 187212 DEBUG nova.compute.manager [None req-2efd0d07-15d0-491a-ae36-de792470b97c - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:26 compute-0 nova_compute[187208]: 2025-12-05 12:01:26.567 187212 DEBUG nova.policy [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '496da6872d53413ea1c201178cf5b05c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8400e354e93c4b33b8d683012dfe5c94', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.022 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.023 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.042 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.120 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.120 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.125 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.126 187212 INFO nova.compute.claims [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.318 187212 DEBUG nova.compute.provider_tree [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:27 compute-0 kernel: tap380c99a7-94 (unregistering): left promiscuous mode
Dec 05 12:01:27 compute-0 NetworkManager[55691]: <info>  [1764936087.3256] device (tap380c99a7-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.332 187212 DEBUG nova.scheduler.client.report [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:27 compute-0 ovn_controller[95610]: 2025-12-05T12:01:27Z|00110|binding|INFO|Releasing lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d from this chassis (sb_readonly=0)
Dec 05 12:01:27 compute-0 ovn_controller[95610]: 2025-12-05T12:01:27Z|00111|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d down in Southbound
Dec 05 12:01:27 compute-0 ovn_controller[95610]: 2025-12-05T12:01:27Z|00112|binding|INFO|Removing iface tap380c99a7-94 ovn-installed in OVS
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.342 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.343 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.345 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.353 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.358 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.358 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.359 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[06c5745d-da48-419b-ab0a-3f434b3961c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.369 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.387 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[dd709adb-d972-4835-9ee9-de601c64aeb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.390 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7910f3eb-70e1-4442-b1e0-91816445abff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:27 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec 05 12:01:27 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000000c.scope: Consumed 13.278s CPU time.
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.403 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.403 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:01:27 compute-0 systemd-machined[153543]: Machine qemu-23-instance-0000000c terminated.
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.418 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0eeac5-8a07-4e67-b517-b15d415b471d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.425 187212 INFO nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.435 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dae07921-0ac0-42c4-9adb-ada934a19cb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 15, 'rx_bytes': 868, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 15, 'rx_bytes': 868, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216744, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.442 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[477d79e5-0f0e-4f95-94a4-713f49e7be03]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216745, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216745, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.450 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.451 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.456 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.456 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.457 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.457 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.458 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.540 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.542 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.542 187212 INFO nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Creating image(s)
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.543 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.543 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.544 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.561 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:27 compute-0 kernel: tap380c99a7-94: entered promiscuous mode
Dec 05 12:01:27 compute-0 kernel: tap380c99a7-94 (unregistering): left promiscuous mode
Dec 05 12:01:27 compute-0 NetworkManager[55691]: <info>  [1764936087.5783] manager: (tap380c99a7-94): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.581 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.637 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.638 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.639 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.658 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.679 187212 DEBUG nova.policy [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a5b1ecad65045afbe3c154494417765', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2c184f0f2b71412fb560981314d0574d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.712 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.712 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.744 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.745 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.746 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.809 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.810 187212 DEBUG nova.virt.disk.api [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Checking if we can resize image /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.811 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.864 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.882 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.883 187212 DEBUG nova.virt.disk.api [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Cannot resize image /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:01:27 compute-0 nova_compute[187208]: 2025-12-05 12:01:27.883 187212 DEBUG nova.objects.instance [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'migration_context' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.044 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.044 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Ensure instance console log exists: /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.045 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.045 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.045 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.159 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance shutdown successfully after 13 seconds.
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.165 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.173 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.174 187212 DEBUG nova.virt.libvirt.vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-membe
r'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:13Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.175 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.176 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.176 187212 DEBUG os_vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.180 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap380c99a7-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.185 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.188 187212 INFO os_vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.188 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deleting instance files /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del
Dec 05 12:01:28 compute-0 nova_compute[187208]: 2025-12-05 12:01:28.189 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deletion of /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del complete
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.303 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936074.3023188, ab127619-9b81-4800-a347-5747dd062e5e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.303 187212 INFO nova.compute.manager [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] VM Stopped (Lifecycle Event)
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.402 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.403 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating image(s)
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.404 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.404 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.404 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.418 187212 DEBUG nova.compute.manager [None req-e990e7c0-5100-47e3-bc3d-2faddc29fc81 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.420 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.484 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.485 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.486 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.502 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.559 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.560 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.595 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.596 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.596 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.651 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.652 187212 DEBUG nova.virt.disk.api [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.653 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.707 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.708 187212 DEBUG nova.virt.disk.api [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.709 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.709 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Ensure instance console log exists: /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.709 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.710 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.710 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.712 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start _get_guest_xml network_info=[{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.717 187212 WARNING nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.729 187212 DEBUG nova.virt.libvirt.host [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.730 187212 DEBUG nova.virt.libvirt.host [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.732 187212 DEBUG nova.virt.libvirt.host [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.733 187212 DEBUG nova.virt.libvirt.host [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.733 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.734 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.734 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.735 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.735 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.735 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.735 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.737 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.756 187212 DEBUG nova.virt.libvirt.vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:28Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.756 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.757 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.759 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <uuid>982a8e69-5181-4847-bdfe-8d4de12bb2e4</uuid>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <name>instance-0000000c</name>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAdminTestJSON-server-1785289561</nova:name>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:01:29</nova:creationTime>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:01:29 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:01:29 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:01:29 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:01:29 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:01:29 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:01:29 compute-0 nova_compute[187208]:         <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec 05 12:01:29 compute-0 nova_compute[187208]:         <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:01:29 compute-0 nova_compute[187208]:         <nova:port uuid="380c99a7-9480-45f8-b2f4-adfcdfa8576d">
Dec 05 12:01:29 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <system>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <entry name="serial">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <entry name="uuid">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     </system>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <os>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   </os>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <features>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   </features>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:24:4f:38"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <target dev="tap380c99a7-94"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log" append="off"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <video>
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     </video>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:01:29 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:01:29 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:01:29 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:01:29 compute-0 nova_compute[187208]: </domain>
Dec 05 12:01:29 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.760 187212 DEBUG nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Preparing to wait for external event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.760 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.761 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.761 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.762 187212 DEBUG nova.virt.libvirt.vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTest
JSON-715947304-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:28Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.762 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.762 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.763 187212 DEBUG os_vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.763 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.766 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.766 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap380c99a7-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.767 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap380c99a7-94, col_values=(('external_ids', {'iface-id': '380c99a7-9480-45f8-b2f4-adfcdfa8576d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:4f:38', 'vm-uuid': '982a8e69-5181-4847-bdfe-8d4de12bb2e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:29 compute-0 NetworkManager[55691]: <info>  [1764936089.7695] manager: (tap380c99a7-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.774 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:29 compute-0 nova_compute[187208]: 2025-12-05 12:01:29.774 187212 INFO os_vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')
Dec 05 12:01:29 compute-0 podman[216791]: 2025-12-05 12:01:29.863814973 +0000 UTC m=+0.057130853 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:01:30 compute-0 nova_compute[187208]: 2025-12-05 12:01:30.658 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:30 compute-0 nova_compute[187208]: 2025-12-05 12:01:30.659 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:30 compute-0 nova_compute[187208]: 2025-12-05 12:01:30.659 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:24:4f:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:01:30 compute-0 nova_compute[187208]: 2025-12-05 12:01:30.659 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Using config drive
Dec 05 12:01:30 compute-0 nova_compute[187208]: 2025-12-05 12:01:30.815 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936075.814507, 52d63666-4caa-4eaa-9128-6e21189b0932 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:30 compute-0 nova_compute[187208]: 2025-12-05 12:01:30.816 187212 INFO nova.compute.manager [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Stopped (Lifecycle Event)
Dec 05 12:01:31 compute-0 nova_compute[187208]: 2025-12-05 12:01:31.284 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:31 compute-0 nova_compute[187208]: 2025-12-05 12:01:31.287 187212 DEBUG nova.compute.manager [None req-6752171b-cb46-489d-9e0f-4c38d3f8bd91 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:31 compute-0 nova_compute[187208]: 2025-12-05 12:01:31.314 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'keypairs' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:31 compute-0 nova_compute[187208]: 2025-12-05 12:01:31.415 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Successfully created port: 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:01:31 compute-0 nova_compute[187208]: 2025-12-05 12:01:31.566 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Successfully created port: 02d6eab5-4561-4d9f-ad9a-169b57667224 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.372 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.491 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating config drive at /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.496 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcd15zg5t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.617 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcd15zg5t" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:32 compute-0 kernel: tap380c99a7-94: entered promiscuous mode
Dec 05 12:01:32 compute-0 ovn_controller[95610]: 2025-12-05T12:01:32Z|00113|binding|INFO|Claiming lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d for this chassis.
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:32 compute-0 ovn_controller[95610]: 2025-12-05T12:01:32Z|00114|binding|INFO|380c99a7-9480-45f8-b2f4-adfcdfa8576d: Claiming fa:16:3e:24:4f:38 10.100.0.13
Dec 05 12:01:32 compute-0 NetworkManager[55691]: <info>  [1764936092.6731] manager: (tap380c99a7-94): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.680 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.681 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.683 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:01:32 compute-0 ovn_controller[95610]: 2025-12-05T12:01:32Z|00115|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d ovn-installed in OVS
Dec 05 12:01:32 compute-0 ovn_controller[95610]: 2025-12-05T12:01:32Z|00116|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d up in Southbound
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.689 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.698 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aface6b5-80ed-4c2b-8758-35fbf3f85d73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:32 compute-0 systemd-udevd[216827]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:01:32 compute-0 systemd-machined[153543]: New machine qemu-26-instance-0000000c.
Dec 05 12:01:32 compute-0 NetworkManager[55691]: <info>  [1764936092.7200] device (tap380c99a7-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:01:32 compute-0 NetworkManager[55691]: <info>  [1764936092.7207] device (tap380c99a7-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:01:32 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-0000000c.
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.727 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfb9359-d34b-48c9-bf40-223eba1cd30f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.731 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0e21bbb6-421c-4b46-afe6-07218da108e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.758 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[58c02809-d387-42bf-a269-5aff061144a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.775 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[253b4214-ec0b-4faa-a6b3-3adf8cdb1753]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 17, 'rx_bytes': 868, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 17, 'rx_bytes': 868, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216840, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.795 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d9f81c-b0bb-47d8-85d7-7414802dc22e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216842, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216842, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.797 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.800 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.804 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.804 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.970 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Successfully updated port: 02d6eab5-4561-4d9f-ad9a-169b57667224 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.989 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.989 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquired lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:32 compute-0 nova_compute[187208]: 2025-12-05 12:01:32.990 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.179 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.264 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 982a8e69-5181-4847-bdfe-8d4de12bb2e4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.264 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936093.2632902, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.265 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Started (Lifecycle Event)
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.288 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.291 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936093.2644527, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.291 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Paused (Lifecycle Event)
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.441 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.447 187212 DEBUG nova.compute.manager [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-changed-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.448 187212 DEBUG nova.compute.manager [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Refreshing instance network info cache due to event network-changed-02d6eab5-4561-4d9f-ad9a-169b57667224. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.448 187212 DEBUG oslo_concurrency.lockutils [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.453 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:33 compute-0 nova_compute[187208]: 2025-12-05 12:01:33.482 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.196 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.196 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.218 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.354 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.355 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.371 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.372 187212 INFO nova.compute.claims [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.573 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.605 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Releasing lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.605 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance network_info: |[{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.606 187212 DEBUG oslo_concurrency.lockutils [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.606 187212 DEBUG nova.network.neutron [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Refreshing network info cache for port 02d6eab5-4561-4d9f-ad9a-169b57667224 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.609 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start _get_guest_xml network_info=[{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.612 187212 WARNING nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.615 187212 DEBUG nova.virt.libvirt.host [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.616 187212 DEBUG nova.virt.libvirt.host [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.619 187212 DEBUG nova.virt.libvirt.host [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.619 187212 DEBUG nova.virt.libvirt.host [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.622 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.624 187212 DEBUG nova.virt.libvirt.vif [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:27Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.625 187212 DEBUG nova.network.os_vif_util [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.625 187212 DEBUG nova.network.os_vif_util [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.626 187212 DEBUG nova.objects.instance [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.638 187212 DEBUG nova.compute.provider_tree [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.641 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <uuid>7b8cf31f-430b-4c7f-9c33-7d0cadd44d31</uuid>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <name>instance-00000018</name>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <nova:name>tempest-AttachInterfacesV270Test-server-1046212835</nova:name>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:01:34</nova:creationTime>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:01:34 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:01:34 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:01:34 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:01:34 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:01:34 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:01:34 compute-0 nova_compute[187208]:         <nova:user uuid="9a5b1ecad65045afbe3c154494417765">tempest-AttachInterfacesV270Test-1975383464-project-member</nova:user>
Dec 05 12:01:34 compute-0 nova_compute[187208]:         <nova:project uuid="2c184f0f2b71412fb560981314d0574d">tempest-AttachInterfacesV270Test-1975383464</nova:project>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:01:34 compute-0 nova_compute[187208]:         <nova:port uuid="02d6eab5-4561-4d9f-ad9a-169b57667224">
Dec 05 12:01:34 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <system>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <entry name="serial">7b8cf31f-430b-4c7f-9c33-7d0cadd44d31</entry>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <entry name="uuid">7b8cf31f-430b-4c7f-9c33-7d0cadd44d31</entry>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     </system>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <os>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   </os>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <features>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   </features>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.config"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:d4:b7:ec"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <target dev="tap02d6eab5-45"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/console.log" append="off"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <video>
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     </video>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:01:34 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:01:34 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:01:34 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:01:34 compute-0 nova_compute[187208]: </domain>
Dec 05 12:01:34 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.642 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Preparing to wait for external event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.642 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.642 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.642 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.643 187212 DEBUG nova.virt.libvirt.vif [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:27Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.643 187212 DEBUG nova.network.os_vif_util [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.644 187212 DEBUG nova.network.os_vif_util [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.644 187212 DEBUG os_vif [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.645 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.645 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.645 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.648 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.648 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02d6eab5-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.648 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02d6eab5-45, col_values=(('external_ids', {'iface-id': '02d6eab5-4561-4d9f-ad9a-169b57667224', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:b7:ec', 'vm-uuid': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.652 187212 DEBUG nova.scheduler.client.report [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.664 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:34 compute-0 NetworkManager[55691]: <info>  [1764936094.6657] manager: (tap02d6eab5-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.668 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.672 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.673 187212 INFO os_vif [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45')
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.676 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.677 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.728 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.729 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.746 187212 INFO nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.752 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.753 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.753 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No VIF found with MAC fa:16:3e:d4:b7:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.753 187212 INFO nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Using config drive
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.769 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.885 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.888 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.888 187212 INFO nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Creating image(s)
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.889 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.889 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.890 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.906 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.968 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.969 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.970 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:34 compute-0 nova_compute[187208]: 2025-12-05 12:01:34.988 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.010 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Successfully updated port: 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.030 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.030 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquired lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.030 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.050 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.051 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.071 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.072 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.089 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.090 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.090 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.156 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.157 187212 DEBUG nova.virt.disk.api [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Checking if we can resize image /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.158 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.180 187212 DEBUG nova.compute.manager [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-changed-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.180 187212 DEBUG nova.compute.manager [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Refreshing instance network info cache due to event network-changed-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.180 187212 DEBUG oslo_concurrency.lockutils [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.203 187212 INFO nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Creating config drive at /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.config
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.208 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvm__mfxf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.233 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.234 187212 DEBUG nova.virt.disk.api [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Cannot resize image /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.235 187212 DEBUG nova.objects.instance [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'migration_context' on Instance uuid adc15883-b705-42dd-ac95-04f4b8964012 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.247 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.247 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Ensure instance console log exists: /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.248 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.248 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.248 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.284 187212 DEBUG nova.policy [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '79758a6c7516459bb1907270241d266a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '342e6d694cf6482c9f1b7557a17bce60', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.298 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.298 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.300 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.336 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvm__mfxf" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:35 compute-0 kernel: tap02d6eab5-45: entered promiscuous mode
Dec 05 12:01:35 compute-0 NetworkManager[55691]: <info>  [1764936095.4123] manager: (tap02d6eab5-45): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Dec 05 12:01:35 compute-0 systemd-udevd[216832]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:01:35 compute-0 ovn_controller[95610]: 2025-12-05T12:01:35Z|00117|binding|INFO|Claiming lport 02d6eab5-4561-4d9f-ad9a-169b57667224 for this chassis.
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.416 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:35 compute-0 ovn_controller[95610]: 2025-12-05T12:01:35Z|00118|binding|INFO|02d6eab5-4561-4d9f-ad9a-169b57667224: Claiming fa:16:3e:d4:b7:ec 10.100.0.5
Dec 05 12:01:35 compute-0 NetworkManager[55691]: <info>  [1764936095.4290] device (tap02d6eab5-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:01:35 compute-0 NetworkManager[55691]: <info>  [1764936095.4299] device (tap02d6eab5-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.430 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:b7:ec 10.100.0.5'], port_security=['fa:16:3e:d4:b7:ec 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-423f0bba-22e2-4219-9338-a671dbe69e42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c184f0f2b71412fb560981314d0574d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8510d8eb-f367-43d1-be5f-8be0c3ab7e61', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eacee27-dbb3-4c60-a47d-c1f874faea06, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=02d6eab5-4561-4d9f-ad9a-169b57667224) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.431 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 02d6eab5-4561-4d9f-ad9a-169b57667224 in datapath 423f0bba-22e2-4219-9338-a671dbe69e42 bound to our chassis
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.434 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 423f0bba-22e2-4219-9338-a671dbe69e42
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.440 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.446 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9a05ebc3-8231-4088-b051-c825999dea6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.447 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap423f0bba-21 in ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.449 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap423f0bba-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1c8ca0-46e7-4f5d-9f71-6f7c06cefff6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.450 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[07b72dc4-8c3f-49e3-95ef-2017bce822e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 systemd-machined[153543]: New machine qemu-27-instance-00000018.
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.465 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[fa251933-9fd8-4beb-82cb-a6e9c1c66e57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:35 compute-0 ovn_controller[95610]: 2025-12-05T12:01:35Z|00119|binding|INFO|Setting lport 02d6eab5-4561-4d9f-ad9a-169b57667224 ovn-installed in OVS
Dec 05 12:01:35 compute-0 ovn_controller[95610]: 2025-12-05T12:01:35Z|00120|binding|INFO|Setting lport 02d6eab5-4561-4d9f-ad9a-169b57667224 up in Southbound
Dec 05 12:01:35 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-00000018.
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.481 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.493 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c72c708e-df43-4f02-9116-0f672643e3c2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.525 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[93bcea32-749c-4592-95f8-2d96a7dba886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.530 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e095c130-f519-4879-b70e-119f1ec738ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 NetworkManager[55691]: <info>  [1764936095.5320] manager: (tap423f0bba-20): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.564 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[eaee02f9-6d08-433b-8c7f-9598a4331851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.568 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[78d5f2ce-c38a-4dd1-b605-488d0762e756]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 NetworkManager[55691]: <info>  [1764936095.5906] device (tap423f0bba-20): carrier: link connected
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.595 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[92a3e489-a209-454b-9a57-461a84c8a8c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.615 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3c3c49-cd4a-4a62-94f5-2047203b5d8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap423f0bba-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:51:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347615, 'reachable_time': 21997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216916, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.631 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d0e867-c2a1-4b7d-9f75-b281a6efd3c8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe57:51bc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347615, 'tstamp': 347615}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216919, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.647 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[30c545e8-1e42-4e9b-8ee2-abbfd0093cfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap423f0bba-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:51:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347615, 'reachable_time': 21997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216924, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.679 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db76bc1b-b97a-4da3-a401-fd6423a001e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.722 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936095.7215908, 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.722 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] VM Started (Lifecycle Event)
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.731 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f422dd6-b1e2-45c1-b410-cdcae528dcfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.732 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap423f0bba-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.732 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.733 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap423f0bba-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.744 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.747 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936095.7217267, 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.747 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] VM Paused (Lifecycle Event)
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.770 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.773 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.778 187212 DEBUG nova.compute.manager [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.778 187212 DEBUG oslo_concurrency.lockutils [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.779 187212 DEBUG oslo_concurrency.lockutils [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.779 187212 DEBUG oslo_concurrency.lockutils [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.779 187212 DEBUG nova.compute.manager [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Processing event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.780 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.782 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:35 compute-0 kernel: tap423f0bba-20: entered promiscuous mode
Dec 05 12:01:35 compute-0 NetworkManager[55691]: <info>  [1764936095.7836] manager: (tap423f0bba-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.785 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap423f0bba-20, col_values=(('external_ids', {'iface-id': '8801ec73-6ce8-4039-ab6c-4693dcbc877e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.786 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.787 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:35 compute-0 ovn_controller[95610]: 2025-12-05T12:01:35Z|00121|binding|INFO|Releasing lport 8801ec73-6ce8-4039-ab6c-4693dcbc877e from this chassis (sb_readonly=0)
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.790 187212 INFO nova.virt.libvirt.driver [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance spawned successfully.
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.790 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.792 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.792 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936095.782405, 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.792 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] VM Resumed (Lifecycle Event)
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.801 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/423f0bba-22e2-4219-9338-a671dbe69e42.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/423f0bba-22e2-4219-9338-a671dbe69e42.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.802 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a703711d-a87d-44d1-9de3-625eac5e2ad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.803 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-423f0bba-22e2-4219-9338-a671dbe69e42
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/423f0bba-22e2-4219-9338-a671dbe69e42.pid.haproxy
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 423f0bba-22e2-4219-9338-a671dbe69e42
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:01:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.804 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'env', 'PROCESS_TAG=haproxy-423f0bba-22e2-4219-9338-a671dbe69e42', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/423f0bba-22e2-4219-9338-a671dbe69e42.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.810 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.814 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.814 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.814 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.815 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.815 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.815 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.819 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.869 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.889 187212 INFO nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Took 8.35 seconds to spawn the instance on the hypervisor.
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.889 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.944 187212 INFO nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Took 8.85 seconds to build instance.
Dec 05 12:01:35 compute-0 nova_compute[187208]: 2025-12-05 12:01:35.970 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.083 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.084 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.177 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.178 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.178 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.180 187212 DEBUG nova.network.neutron [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updated VIF entry in instance network info cache for port 02d6eab5-4561-4d9f-ad9a-169b57667224. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.181 187212 DEBUG nova.network.neutron [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.204 187212 DEBUG oslo_concurrency.lockutils [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:36 compute-0 podman[216958]: 2025-12-05 12:01:36.162036717 +0000 UTC m=+0.029387420 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:01:36 compute-0 podman[216958]: 2025-12-05 12:01:36.49398583 +0000 UTC m=+0.361336483 container create 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:01:36 compute-0 systemd[1]: Started libpod-conmon-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8.scope.
Dec 05 12:01:36 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:01:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c6b011a5a4020501a80db3fcb573c57c4bdcfb8d5de9a077c6de3d75c9302b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:01:36 compute-0 podman[216972]: 2025-12-05 12:01:36.576414905 +0000 UTC m=+0.058054700 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:01:36 compute-0 podman[216958]: 2025-12-05 12:01:36.581790659 +0000 UTC m=+0.449141342 container init 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:01:36 compute-0 podman[216958]: 2025-12-05 12:01:36.587253425 +0000 UTC m=+0.454604078 container start 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:01:36 compute-0 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [NOTICE]   (217003) : New worker (217006) forked
Dec 05 12:01:36 compute-0 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [NOTICE]   (217003) : Loading success.
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.977 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "interface-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.977 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "interface-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:36 compute-0 nova_compute[187208]: 2025-12-05 12:01:36.978 187212 DEBUG nova.objects.instance [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'flavor' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.019 187212 DEBUG nova.objects.instance [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'pci_requests' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.036 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.374 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.412 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Updating instance_info_cache with network_info: [{"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.438 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Releasing lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.438 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance network_info: |[{"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.438 187212 DEBUG oslo_concurrency.lockutils [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.439 187212 DEBUG nova.network.neutron [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Refreshing network info cache for port 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.441 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start _get_guest_xml network_info=[{"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.446 187212 WARNING nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.451 187212 DEBUG nova.virt.libvirt.host [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.452 187212 DEBUG nova.virt.libvirt.host [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.455 187212 DEBUG nova.virt.libvirt.host [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.456 187212 DEBUG nova.virt.libvirt.host [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.456 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.457 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.457 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.458 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.458 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.458 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.459 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.459 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.459 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.459 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.460 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.460 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.465 187212 DEBUG nova.virt.libvirt.vif [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1341604448',display_name='tempest-ImagesNegativeTestJSON-server-1341604448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1341604448',id=23,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8400e354e93c4b33b8d683012dfe5c94',ramdisk_id='',reservation_id='r-7yvjshh1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-965315619',owner_user_name='tempest-ImagesNegativeTestJSON-965315619-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:25Z,user_data=None,user_id='496da6872d53413ea1c201178cf5b05c',uuid=1282e776-5758-493b-8f52-59839ebcd31b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.466 187212 DEBUG nova.network.os_vif_util [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converting VIF {"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.466 187212 DEBUG nova.network.os_vif_util [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.468 187212 DEBUG nova.objects.instance [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1282e776-5758-493b-8f52-59839ebcd31b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.490 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <uuid>1282e776-5758-493b-8f52-59839ebcd31b</uuid>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <name>instance-00000017</name>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesNegativeTestJSON-server-1341604448</nova:name>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:01:37</nova:creationTime>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:01:37 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:01:37 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:01:37 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:01:37 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:01:37 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:01:37 compute-0 nova_compute[187208]:         <nova:user uuid="496da6872d53413ea1c201178cf5b05c">tempest-ImagesNegativeTestJSON-965315619-project-member</nova:user>
Dec 05 12:01:37 compute-0 nova_compute[187208]:         <nova:project uuid="8400e354e93c4b33b8d683012dfe5c94">tempest-ImagesNegativeTestJSON-965315619</nova:project>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:01:37 compute-0 nova_compute[187208]:         <nova:port uuid="9bb4b8ce-5722-4698-aa3d-6d891ab14b0d">
Dec 05 12:01:37 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <system>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <entry name="serial">1282e776-5758-493b-8f52-59839ebcd31b</entry>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <entry name="uuid">1282e776-5758-493b-8f52-59839ebcd31b</entry>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     </system>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <os>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   </os>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <features>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   </features>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:06:f0:53"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <target dev="tap9bb4b8ce-57"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/console.log" append="off"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <video>
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     </video>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:01:37 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:01:37 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:01:37 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:01:37 compute-0 nova_compute[187208]: </domain>
Dec 05 12:01:37 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.491 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Preparing to wait for external event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.491 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.492 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.492 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.493 187212 DEBUG nova.virt.libvirt.vif [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1341604448',display_name='tempest-ImagesNegativeTestJSON-server-1341604448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1341604448',id=23,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8400e354e93c4b33b8d683012dfe5c94',ramdisk_id='',reservation_id='r-7yvjshh1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-965315619',owner_user_name='tempest-ImagesNegativeTestJSON-965315619-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:25Z,user_data=None,user_id='496da6872d53413ea1c201178cf5b05c',uuid=1282e776-5758-493b-8f52-59839ebcd31b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.493 187212 DEBUG nova.network.os_vif_util [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converting VIF {"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.494 187212 DEBUG nova.network.os_vif_util [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.494 187212 DEBUG os_vif [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.495 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.496 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.500 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.500 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bb4b8ce-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.501 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9bb4b8ce-57, col_values=(('external_ids', {'iface-id': '9bb4b8ce-5722-4698-aa3d-6d891ab14b0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:f0:53', 'vm-uuid': '1282e776-5758-493b-8f52-59839ebcd31b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.502 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:37 compute-0 NetworkManager[55691]: <info>  [1764936097.5039] manager: (tap9bb4b8ce-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.510 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.512 187212 INFO os_vif [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57')
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.588 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updating instance_info_cache with network_info: [{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.640 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.641 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.649 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.653 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.724 187212 DEBUG nova.policy [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a5b1ecad65045afbe3c154494417765', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2c184f0f2b71412fb560981314d0574d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.756 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.757 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.757 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.757 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.932 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.932 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.933 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] No VIF found with MAC fa:16:3e:06:f0:53, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.933 187212 INFO nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Using config drive
Dec 05 12:01:37 compute-0 nova_compute[187208]: 2025-12-05 12:01:37.962 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.021 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.023 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.084 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.086 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000017, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config'
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.091 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.147 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.149 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.204 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.211 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.267 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.268 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.322 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.328 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.385 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.386 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.438 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.444 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.500 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.502 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.523 187212 INFO nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Creating config drive at /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.529 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7c7gn5vm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.560 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.566 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.621 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.622 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.653 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7c7gn5vm" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.682 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.714 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Successfully created port: 78310fa8-21e8-49e5-8b60-867d1089ad71 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:01:38 compute-0 kernel: tap9bb4b8ce-57: entered promiscuous mode
Dec 05 12:01:38 compute-0 NetworkManager[55691]: <info>  [1764936098.7184] manager: (tap9bb4b8ce-57): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Dec 05 12:01:38 compute-0 ovn_controller[95610]: 2025-12-05T12:01:38Z|00122|binding|INFO|Claiming lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d for this chassis.
Dec 05 12:01:38 compute-0 ovn_controller[95610]: 2025-12-05T12:01:38Z|00123|binding|INFO|9bb4b8ce-5722-4698-aa3d-6d891ab14b0d: Claiming fa:16:3e:06:f0:53 10.100.0.14
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.727 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.732 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:f0:53 10.100.0.14'], port_security=['fa:16:3e:06:f0:53 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1282e776-5758-493b-8f52-59839ebcd31b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455bb7e1-6680-472e-861f-da50aef09a7f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8400e354e93c4b33b8d683012dfe5c94', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9474b356-5c55-44a1-af48-0eeaf9a9ad0b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07a9feeb-8467-4a6f-b0e2-fda2f133d3ac, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.734 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d in datapath 455bb7e1-6680-472e-861f-da50aef09a7f bound to our chassis
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.736 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455bb7e1-6680-472e-861f-da50aef09a7f
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.748 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e6d41cae-5357-4a90-9b9e-ac92e16d491c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.749 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap455bb7e1-61 in ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.752 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap455bb7e1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.752 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d93dd69c-33ed-4fa2-ba43-5fc87d249d68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.753 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d81b7850-c9fa-4cfa-b91c-d71b43830c92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 systemd-udevd[217073]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:01:38 compute-0 systemd-machined[153543]: New machine qemu-28-instance-00000017.
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.770 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3a5fae-5726-4441-ab26-07e4c6d01a5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 NetworkManager[55691]: <info>  [1764936098.7771] device (tap9bb4b8ce-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:01:38 compute-0 NetworkManager[55691]: <info>  [1764936098.7781] device (tap9bb4b8ce-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:01:38 compute-0 systemd[1]: Started Virtual Machine qemu-28-instance-00000017.
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.785 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:38 compute-0 ovn_controller[95610]: 2025-12-05T12:01:38Z|00124|binding|INFO|Setting lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d ovn-installed in OVS
Dec 05 12:01:38 compute-0 ovn_controller[95610]: 2025-12-05T12:01:38Z|00125|binding|INFO|Setting lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d up in Southbound
Dec 05 12:01:38 compute-0 nova_compute[187208]: 2025-12-05 12:01:38.789 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.796 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a5666218-a345-46d7-b3e2-d75a8a00e48b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.822 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[01f78893-9813-469b-bc8d-020e096de40a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.827 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[be98a400-6ec5-49e8-ba4c-ab315f19d56d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 NetworkManager[55691]: <info>  [1764936098.8293] manager: (tap455bb7e1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.854 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[119c2620-de64-4b96-99e8-31b3a39c3441]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.857 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf52328-b3bb-461b-b218-ed64d076ee32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 NetworkManager[55691]: <info>  [1764936098.8802] device (tap455bb7e1-60): carrier: link connected
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.886 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c477cbef-1335-4058-b652-01b238068274]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.904 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a614fc-4f4f-47ed-9128-7386d5d629b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455bb7e1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:a2:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347944, 'reachable_time': 33412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217105, 'error': None, 'target': 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.920 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e85749b7-51b5-4ba1-a99b-58deed3bd1fe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:a24a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347944, 'tstamp': 347944}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217106, 'error': None, 'target': 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.939 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05fefbbc-4bd1-4712-8553-c5d66ba6bde4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455bb7e1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:a2:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347944, 'reachable_time': 33412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217107, 'error': None, 'target': 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.972 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e321311a-9284-4e8f-94e3-6257b30c7697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.005 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.007 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5053MB free_disk=73.21169662475586GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.007 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.008 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.028 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[52279ea9-3785-4f07-baf2-ff853ea3254e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.030 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455bb7e1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.030 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.030 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455bb7e1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:39 compute-0 NetworkManager[55691]: <info>  [1764936099.0661] manager: (tap455bb7e1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.066 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:39 compute-0 kernel: tap455bb7e1-60: entered promiscuous mode
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.068 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455bb7e1-60, col_values=(('external_ids', {'iface-id': '261e0bd9-3b3f-4cf7-b0f8-84547701ff1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:39 compute-0 ovn_controller[95610]: 2025-12-05T12:01:39Z|00126|binding|INFO|Releasing lport 261e0bd9-3b3f-4cf7-b0f8-84547701ff1a from this chassis (sb_readonly=0)
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.069 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.081 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.082 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/455bb7e1-6680-472e-861f-da50aef09a7f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/455bb7e1-6680-472e-861f-da50aef09a7f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.082 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a8458e30-4961-416a-b0b4-52481bf6fb67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.083 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-455bb7e1-6680-472e-861f-da50aef09a7f
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/455bb7e1-6680-472e-861f-da50aef09a7f.pid.haproxy
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 455bb7e1-6680-472e-861f-da50aef09a7f
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:01:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.084 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'env', 'PROCESS_TAG=haproxy-455bb7e1-6680-472e-861f-da50aef09a7f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/455bb7e1-6680-472e-861f-da50aef09a7f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.112 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 982a8e69-5181-4847-bdfe-8d4de12bb2e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 4e7aec76-673e-48b5-b183-cc9c7a95fd37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance d95c0324-d1d3-4960-9ab7-3a2a098a9f7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 1282e776-5758-493b-8f52-59839ebcd31b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.114 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance adc15883-b705-42dd-ac95-04f4b8964012 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.114 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.114 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1408MB phys_disk=79GB used_disk=7GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.277 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.297 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.320 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.321 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.420 187212 DEBUG nova.compute.manager [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.421 187212 DEBUG oslo_concurrency.lockutils [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.421 187212 DEBUG oslo_concurrency.lockutils [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.422 187212 DEBUG oslo_concurrency.lockutils [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.422 187212 DEBUG nova.compute.manager [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.422 187212 WARNING nova.compute.manager [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 for instance with vm_state active and task_state None.
Dec 05 12:01:39 compute-0 podman[217139]: 2025-12-05 12:01:39.458583952 +0000 UTC m=+0.051331167 container create 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:01:39 compute-0 systemd[1]: Started libpod-conmon-62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b.scope.
Dec 05 12:01:39 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:01:39 compute-0 podman[217139]: 2025-12-05 12:01:39.432190108 +0000 UTC m=+0.024937343 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:01:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4149b47c4e6793ce3b6fc5ffdd499aa39b6bd1d4bb7bbc0659f951080559deea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:01:39 compute-0 podman[217139]: 2025-12-05 12:01:39.549981543 +0000 UTC m=+0.142728758 container init 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:01:39 compute-0 podman[217139]: 2025-12-05 12:01:39.558827266 +0000 UTC m=+0.151574481 container start 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:01:39 compute-0 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [NOTICE]   (217165) : New worker (217168) forked
Dec 05 12:01:39 compute-0 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [NOTICE]   (217165) : Loading success.
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.626 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936099.626391, 1282e776-5758-493b-8f52-59839ebcd31b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.627 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] VM Started (Lifecycle Event)
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.647 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.651 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936099.6264863, 1282e776-5758-493b-8f52-59839ebcd31b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.651 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] VM Paused (Lifecycle Event)
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.673 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.676 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.695 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.728 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.730 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.731 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:39 compute-0 nova_compute[187208]: 2025-12-05 12:01:39.731 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.208 187212 DEBUG nova.network.neutron [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Updated VIF entry in instance network info cache for port 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.209 187212 DEBUG nova.network.neutron [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Updating instance_info_cache with network_info: [{"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.215 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.215 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.239 187212 DEBUG oslo_concurrency.lockutils [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.240 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.309 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.309 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.316 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.316 187212 INFO nova.compute.claims [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.361 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Successfully created port: 0d74b914-0dbd-4356-8304-a42943811e2e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.543 187212 DEBUG nova.compute.provider_tree [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.601 187212 DEBUG nova.scheduler.client.report [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.867 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.868 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.946 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:01:40 compute-0 nova_compute[187208]: 2025-12-05 12:01:40.947 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.050 187212 INFO nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.152 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.457 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.459 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.459 187212 INFO nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Creating image(s)
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.460 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.461 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.462 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.477 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.543 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.544 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.545 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.556 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.611 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.612 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.655 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.657 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.657 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.717 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.718 187212 DEBUG nova.virt.disk.api [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Checking if we can resize image /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.719 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.783 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.784 187212 DEBUG nova.virt.disk.api [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Cannot resize image /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.784 187212 DEBUG nova.objects.instance [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lazy-loading 'migration_context' on Instance uuid 1606eea3-5389-4437-b0f9-cfe6084d7871 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.789 187212 DEBUG nova.policy [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff53b25ec85543eeb2bdea04a6eeaac4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3cd52d70d1a4be8ae891298ff7e1018', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.806 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.806 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Ensure instance console log exists: /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.807 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.807 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:41 compute-0 nova_compute[187208]: 2025-12-05 12:01:41.807 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:42 compute-0 nova_compute[187208]: 2025-12-05 12:01:42.376 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:42 compute-0 nova_compute[187208]: 2025-12-05 12:01:42.501 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Successfully updated port: 78310fa8-21e8-49e5-8b60-867d1089ad71 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:01:42 compute-0 nova_compute[187208]: 2025-12-05 12:01:42.503 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:42 compute-0 nova_compute[187208]: 2025-12-05 12:01:42.548 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:42 compute-0 nova_compute[187208]: 2025-12-05 12:01:42.548 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquired lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:42 compute-0 nova_compute[187208]: 2025-12-05 12:01:42.549 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:01:42 compute-0 nova_compute[187208]: 2025-12-05 12:01:42.927 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:43.302 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:43 compute-0 nova_compute[187208]: 2025-12-05 12:01:43.535 187212 DEBUG nova.compute.manager [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:43 compute-0 nova_compute[187208]: 2025-12-05 12:01:43.535 187212 DEBUG oslo_concurrency.lockutils [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:43 compute-0 nova_compute[187208]: 2025-12-05 12:01:43.536 187212 DEBUG oslo_concurrency.lockutils [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:43 compute-0 nova_compute[187208]: 2025-12-05 12:01:43.536 187212 DEBUG oslo_concurrency.lockutils [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:43 compute-0 nova_compute[187208]: 2025-12-05 12:01:43.536 187212 DEBUG nova.compute.manager [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No event matching network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d in dict_keys([('network-vif-plugged', '380c99a7-9480-45f8-b2f4-adfcdfa8576d')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Dec 05 12:01:43 compute-0 nova_compute[187208]: 2025-12-05 12:01:43.536 187212 WARNING nova.compute.manager [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state rebuild_spawning.
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.025 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Successfully updated port: 0d74b914-0dbd-4356-8304-a42943811e2e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.051 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.052 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquired lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.052 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.421 187212 WARNING nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] 423f0bba-22e2-4219-9338-a671dbe69e42 already exists in list: networks containing: ['423f0bba-22e2-4219-9338-a671dbe69e42']. ignoring it
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.731 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updating instance_info_cache with network_info: [{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.748 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Releasing lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.748 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance network_info: |[{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.750 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start _get_guest_xml network_info=[{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.756 187212 WARNING nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.763 187212 DEBUG nova.virt.libvirt.host [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.764 187212 DEBUG nova.virt.libvirt.host [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.769 187212 DEBUG nova.virt.libvirt.host [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.769 187212 DEBUG nova.virt.libvirt.host [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.770 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.770 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1144768517',id=26,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-837660852',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.770 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.770 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.772 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.772 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.772 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.776 187212 DEBUG nova.virt.libvirt.vif [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-555517467',id=25,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-hjkfnf9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=adc15883-b705-42dd-ac95-04f4b8964012,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.776 187212 DEBUG nova.network.os_vif_util [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.776 187212 DEBUG nova.network.os_vif_util [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.777 187212 DEBUG nova.objects.instance [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'pci_devices' on Instance uuid adc15883-b705-42dd-ac95-04f4b8964012 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.790 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <uuid>adc15883-b705-42dd-ac95-04f4b8964012</uuid>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <name>instance-00000019</name>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-555517467</nova:name>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:01:44</nova:creationTime>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <nova:flavor name="tempest-flavor_with_ephemeral_0-837660852">
Dec 05 12:01:44 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:01:44 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:01:44 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:01:44 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:01:44 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:01:44 compute-0 nova_compute[187208]:         <nova:user uuid="79758a6c7516459bb1907270241d266a">tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member</nova:user>
Dec 05 12:01:44 compute-0 nova_compute[187208]:         <nova:project uuid="342e6d694cf6482c9f1b7557a17bce60">tempest-ServersWithSpecificFlavorTestJSON-1976479976</nova:project>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:01:44 compute-0 nova_compute[187208]:         <nova:port uuid="78310fa8-21e8-49e5-8b60-867d1089ad71">
Dec 05 12:01:44 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <system>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <entry name="serial">adc15883-b705-42dd-ac95-04f4b8964012</entry>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <entry name="uuid">adc15883-b705-42dd-ac95-04f4b8964012</entry>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     </system>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <os>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   </os>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <features>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   </features>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.config"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:c8:42:5d"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <target dev="tap78310fa8-21"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/console.log" append="off"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <video>
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     </video>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:01:44 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:01:44 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:01:44 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:01:44 compute-0 nova_compute[187208]: </domain>
Dec 05 12:01:44 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.791 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Preparing to wait for external event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.791 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.791 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.792 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.792 187212 DEBUG nova.virt.libvirt.vif [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-555517467',id=25,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-hjkfnf9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=adc15883-b705-42dd-ac95-04f4b8964012,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.793 187212 DEBUG nova.network.os_vif_util [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.794 187212 DEBUG nova.network.os_vif_util [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.794 187212 DEBUG os_vif [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.795 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.796 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.800 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap78310fa8-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.800 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap78310fa8-21, col_values=(('external_ids', {'iface-id': '78310fa8-21e8-49e5-8b60-867d1089ad71', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:42:5d', 'vm-uuid': 'adc15883-b705-42dd-ac95-04f4b8964012'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:44 compute-0 NetworkManager[55691]: <info>  [1764936104.8405] manager: (tap78310fa8-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.842 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.844 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.848 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.849 187212 INFO os_vif [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21')
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.871 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Successfully created port: c72089e0-4937-40b6-86b5-f9d6d0982058 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.917 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.918 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.918 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No VIF found with MAC fa:16:3e:c8:42:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:01:44 compute-0 nova_compute[187208]: 2025-12-05 12:01:44.919 187212 INFO nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Using config drive
Dec 05 12:01:44 compute-0 podman[217208]: 2025-12-05 12:01:44.969991398 +0000 UTC m=+0.073866381 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:01:45 compute-0 nova_compute[187208]: 2025-12-05 12:01:45.739 187212 INFO nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Creating config drive at /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.config
Dec 05 12:01:45 compute-0 nova_compute[187208]: 2025-12-05 12:01:45.745 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi8bht2hb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:45 compute-0 nova_compute[187208]: 2025-12-05 12:01:45.879 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi8bht2hb" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:45 compute-0 NetworkManager[55691]: <info>  [1764936105.9446] manager: (tap78310fa8-21): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Dec 05 12:01:45 compute-0 kernel: tap78310fa8-21: entered promiscuous mode
Dec 05 12:01:45 compute-0 systemd-udevd[217244]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:01:45 compute-0 ovn_controller[95610]: 2025-12-05T12:01:45Z|00127|binding|INFO|Claiming lport 78310fa8-21e8-49e5-8b60-867d1089ad71 for this chassis.
Dec 05 12:01:45 compute-0 ovn_controller[95610]: 2025-12-05T12:01:45Z|00128|binding|INFO|78310fa8-21e8-49e5-8b60-867d1089ad71: Claiming fa:16:3e:c8:42:5d 10.100.0.11
Dec 05 12:01:45 compute-0 nova_compute[187208]: 2025-12-05 12:01:45.995 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.000 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:42:5d 10.100.0.11'], port_security=['fa:16:3e:c8:42:5d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'adc15883-b705-42dd-ac95-04f4b8964012', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '342e6d694cf6482c9f1b7557a17bce60', 'neutron:revision_number': '2', 'neutron:security_group_ids': '710ea28e-d1ba-4c63-a751-16b460b2129b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a85cd729-c72e-4d3c-b444-ff0b42d436ff, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=78310fa8-21e8-49e5-8b60-867d1089ad71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.001 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 78310fa8-21e8-49e5-8b60-867d1089ad71 in datapath 393d33f9-2dde-4fb5-b5db-3f0fb98d4637 bound to our chassis
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.006 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec 05 12:01:46 compute-0 NetworkManager[55691]: <info>  [1764936106.0088] device (tap78310fa8-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:01:46 compute-0 NetworkManager[55691]: <info>  [1764936106.0099] device (tap78310fa8-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.019 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91f8a959-e5d2-4f59-98c7-7c9acbb2b526]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.020 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap393d33f9-21 in ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.023 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap393d33f9-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.024 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4c91fa4f-2f91-4f92-83d3-878bf932100e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.025 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[88ca9964-7479-46ef-bac6-b381fb062808]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 systemd-machined[153543]: New machine qemu-29-instance-00000019.
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.037 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[01844aeb-ad28-4b0e-b2e9-074a8b5b5acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.047 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:46 compute-0 ovn_controller[95610]: 2025-12-05T12:01:46Z|00129|binding|INFO|Setting lport 78310fa8-21e8-49e5-8b60-867d1089ad71 ovn-installed in OVS
Dec 05 12:01:46 compute-0 ovn_controller[95610]: 2025-12-05T12:01:46Z|00130|binding|INFO|Setting lport 78310fa8-21e8-49e5-8b60-867d1089ad71 up in Southbound
Dec 05 12:01:46 compute-0 systemd[1]: Started Virtual Machine qemu-29-instance-00000019.
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.050 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.055 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d37485-bde1-4be2-adf7-2840488d714d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.097 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[805a4542-5121-4306-bf49-7cdd9263a442]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 NetworkManager[55691]: <info>  [1764936106.1065] manager: (tap393d33f9-20): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Dec 05 12:01:46 compute-0 systemd-udevd[217250]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.106 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[594dc9b6-7d54-418a-b562-50a331aa3a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.142 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0c57c635-e590-4017-bb81-c7308f5c6119]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.146 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[49e51972-16b1-4567-b0a9-f0ee9d020dc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 NetworkManager[55691]: <info>  [1764936106.1837] device (tap393d33f9-20): carrier: link connected
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.191 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b490da-3e07-43d9-a9bb-44d3bb951305]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.211 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5125daff-7dcb-4db8-a91f-e5d99ea2c6cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap393d33f9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:b1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348674, 'reachable_time': 19169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217280, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.232 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[96525c79-9fbe-498c-82a6-7b03db581dfb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe23:b198'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348674, 'tstamp': 348674}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217281, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.257 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[deded7e2-f591-4c62-bd60-332b2dc707b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap393d33f9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:b1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348674, 'reachable_time': 19169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217282, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[41fc2a99-7fb6-46f1-9275-d03b658c75d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.360 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[45350962-a7fd-4500-b547-25d154555e6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.361 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap393d33f9-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.361 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.361 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap393d33f9-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.363 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:46 compute-0 NetworkManager[55691]: <info>  [1764936106.3642] manager: (tap393d33f9-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Dec 05 12:01:46 compute-0 kernel: tap393d33f9-20: entered promiscuous mode
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.370 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap393d33f9-20, col_values=(('external_ids', {'iface-id': '4f5e3c8a-5273-4414-820c-16ae051153f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.371 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:46 compute-0 ovn_controller[95610]: 2025-12-05T12:01:46Z|00131|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.374 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.375 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8652d6b4-78c3-4225-9295-6b11bfe81602]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.377 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:01:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.379 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'env', 'PROCESS_TAG=haproxy-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.384 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.438 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-changed-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.439 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Refreshing instance network info cache due to event network-changed-78310fa8-21e8-49e5-8b60-867d1089ad71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.440 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.440 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.440 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Refreshing network info cache for port 78310fa8-21e8-49e5-8b60-867d1089ad71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.459 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936106.4568458, adc15883-b705-42dd-ac95-04f4b8964012 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.460 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] VM Started (Lifecycle Event)
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.480 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.487 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936106.4571378, adc15883-b705-42dd-ac95-04f4b8964012 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.487 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] VM Paused (Lifecycle Event)
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.508 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.514 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:46 compute-0 nova_compute[187208]: 2025-12-05 12:01:46.535 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:01:46 compute-0 podman[217321]: 2025-12-05 12:01:46.719206509 +0000 UTC m=+0.032636843 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.496 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 podman[217321]: 2025-12-05 12:01:47.523136675 +0000 UTC m=+0.836566989 container create f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 12:01:47 compute-0 systemd[1]: Started libpod-conmon-f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751.scope.
Dec 05 12:01:47 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:01:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26091b88884c53d07a76d03c6c9c66adb5d232a7c306c3b78dafe02bf1e95c96/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.668 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.686 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Releasing lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.690 187212 DEBUG nova.virt.libvirt.vif [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:35Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.691 187212 DEBUG nova.network.os_vif_util [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.691 187212 DEBUG nova.network.os_vif_util [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.692 187212 DEBUG os_vif [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.692 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.692 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.693 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.695 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.695 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d74b914-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.696 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0d74b914-0d, col_values=(('external_ids', {'iface-id': '0d74b914-0dbd-4356-8304-a42943811e2e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:e2:69', 'vm-uuid': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.698 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 NetworkManager[55691]: <info>  [1764936107.6989] manager: (tap0d74b914-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.699 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.705 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.706 187212 INFO os_vif [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d')
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.706 187212 DEBUG nova.virt.libvirt.vif [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:35Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.707 187212 DEBUG nova.network.os_vif_util [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.707 187212 DEBUG nova.network.os_vif_util [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.710 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Successfully updated port: c72089e0-4937-40b6-86b5-f9d6d0982058 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.711 187212 DEBUG nova.virt.libvirt.guest [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] attach device xml: <interface type="ethernet">
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:a5:e2:69"/>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <target dev="tap0d74b914-0d"/>
Dec 05 12:01:47 compute-0 nova_compute[187208]: </interface>
Dec 05 12:01:47 compute-0 nova_compute[187208]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 05 12:01:47 compute-0 kernel: tap0d74b914-0d: entered promiscuous mode
Dec 05 12:01:47 compute-0 NetworkManager[55691]: <info>  [1764936107.7257] manager: (tap0d74b914-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Dec 05 12:01:47 compute-0 systemd-udevd[217262]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:01:47 compute-0 ovn_controller[95610]: 2025-12-05T12:01:47Z|00132|binding|INFO|Claiming lport 0d74b914-0dbd-4356-8304-a42943811e2e for this chassis.
Dec 05 12:01:47 compute-0 podman[217321]: 2025-12-05 12:01:47.727117202 +0000 UTC m=+1.040547536 container init f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.727 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 ovn_controller[95610]: 2025-12-05T12:01:47Z|00133|binding|INFO|0d74b914-0dbd-4356-8304-a42943811e2e: Claiming fa:16:3e:a5:e2:69 10.100.0.10
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.731 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.731 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquired lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.731 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:01:47 compute-0 podman[217321]: 2025-12-05 12:01:47.735167752 +0000 UTC m=+1.048598066 container start f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.737 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:e2:69 10.100.0.10'], port_security=['fa:16:3e:a5:e2:69 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-423f0bba-22e2-4219-9338-a671dbe69e42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c184f0f2b71412fb560981314d0574d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8510d8eb-f367-43d1-be5f-8be0c3ab7e61', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eacee27-dbb3-4c60-a47d-c1f874faea06, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0d74b914-0dbd-4356-8304-a42943811e2e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:47 compute-0 NetworkManager[55691]: <info>  [1764936107.7392] device (tap0d74b914-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:01:47 compute-0 NetworkManager[55691]: <info>  [1764936107.7401] device (tap0d74b914-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:01:47 compute-0 ovn_controller[95610]: 2025-12-05T12:01:47Z|00134|binding|INFO|Setting lport 0d74b914-0dbd-4356-8304-a42943811e2e ovn-installed in OVS
Dec 05 12:01:47 compute-0 ovn_controller[95610]: 2025-12-05T12:01:47Z|00135|binding|INFO|Setting lport 0d74b914-0dbd-4356-8304-a42943811e2e up in Southbound
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.743 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [NOTICE]   (217354) : New worker (217357) forked
Dec 05 12:01:47 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [NOTICE]   (217354) : Loading success.
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.821 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0d74b914-0dbd-4356-8304-a42943811e2e in datapath 423f0bba-22e2-4219-9338-a671dbe69e42 unbound from our chassis
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.824 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 423f0bba-22e2-4219-9338-a671dbe69e42
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.827 187212 DEBUG nova.virt.libvirt.driver [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.827 187212 DEBUG nova.virt.libvirt.driver [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.827 187212 DEBUG nova.virt.libvirt.driver [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No VIF found with MAC fa:16:3e:d4:b7:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.827 187212 DEBUG nova.virt.libvirt.driver [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No VIF found with MAC fa:16:3e:a5:e2:69, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.838 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ead51a60-412c-44c6-9080-b5d44619ffe0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.853 187212 DEBUG nova.virt.libvirt.guest [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesV270Test-server-1046212835</nova:name>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:01:47</nova:creationTime>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:01:47 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     <nova:user uuid="9a5b1ecad65045afbe3c154494417765">tempest-AttachInterfacesV270Test-1975383464-project-member</nova:user>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     <nova:project uuid="2c184f0f2b71412fb560981314d0574d">tempest-AttachInterfacesV270Test-1975383464</nova:project>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     <nova:port uuid="02d6eab5-4561-4d9f-ad9a-169b57667224">
Dec 05 12:01:47 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     <nova:port uuid="0d74b914-0dbd-4356-8304-a42943811e2e">
Dec 05 12:01:47 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:01:47 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:01:47 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:01:47 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:01:47 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.876 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cccf12c8-68bd-4408-ab47-1d9ad70a8e4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.879 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "interface-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 10.902s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.880 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[98a64066-02ce-415b-9969-9beed3f64113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.913 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4bcc21-3945-49b0-a92d-8a98a72dadf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.936 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ecbc5671-bc03-42ef-bd47-1b417e14a599]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap423f0bba-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:51:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347615, 'reachable_time': 21997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217383, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.952 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[235eb5bd-0bb2-4bb0-a3d2-8ef482be95fb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap423f0bba-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347626, 'tstamp': 347626}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217384, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap423f0bba-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347629, 'tstamp': 347629}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217384, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.954 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap423f0bba-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.956 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 nova_compute[187208]: 2025-12-05 12:01:47.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.958 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap423f0bba-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.958 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.959 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap423f0bba-20, col_values=(('external_ids', {'iface-id': '8801ec73-6ce8-4039-ab6c-4693dcbc877e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.959 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:48 compute-0 ovn_controller[95610]: 2025-12-05T12:01:48Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a5:e2:69 10.100.0.10
Dec 05 12:01:48 compute-0 ovn_controller[95610]: 2025-12-05T12:01:48Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a5:e2:69 10.100.0.10
Dec 05 12:01:48 compute-0 nova_compute[187208]: 2025-12-05 12:01:48.882 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:01:49 compute-0 ovn_controller[95610]: 2025-12-05T12:01:49Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:b7:ec 10.100.0.5
Dec 05 12:01:49 compute-0 ovn_controller[95610]: 2025-12-05T12:01:49Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:b7:ec 10.100.0.5
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.374 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updated VIF entry in instance network info cache for port 78310fa8-21e8-49e5-8b60-867d1089ad71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.375 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updating instance_info_cache with network_info: [{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.395 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.396 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.396 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Processing event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-changed-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Refreshing instance network info cache due to event network-changed-0d74b914-0dbd-4356-8304-a42943811e2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.398 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.398 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Refreshing network info cache for port 0d74b914-0dbd-4356-8304-a42943811e2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.399 187212 DEBUG nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance event wait completed in 16 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.404 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936109.4036753, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.404 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Resumed (Lifecycle Event)
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.415 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.423 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance spawned successfully.
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.423 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.430 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.433 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.445 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.446 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.446 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.446 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.447 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.447 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.453 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.507 187212 DEBUG nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.568 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.568 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.568 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:01:49 compute-0 nova_compute[187208]: 2025-12-05 12:01:49.630 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:50 compute-0 podman[217387]: 2025-12-05 12:01:50.227136162 +0000 UTC m=+0.069302451 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 05 12:01:50 compute-0 podman[217407]: 2025-12-05 12:01:50.330562855 +0000 UTC m=+0.060225870 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7)
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.793 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updating instance_info_cache with network_info: [{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.818 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Releasing lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.818 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance network_info: |[{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.820 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start _get_guest_xml network_info=[{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.824 187212 WARNING nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.832 187212 DEBUG nova.virt.libvirt.host [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.834 187212 DEBUG nova.virt.libvirt.host [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.838 187212 DEBUG nova.virt.libvirt.host [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.838 187212 DEBUG nova.virt.libvirt.host [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.839 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.839 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.839 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.841 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.841 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.841 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.845 187212 DEBUG nova.virt.libvirt.vif [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-306695219',display_name='tempest-ServersTestManualDisk-server-306695219',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-306695219',id=26,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBUGvAJW8rYe/hjaW/hFZe4neO1wzdrge/WiC/SnDk7t8/AXKetmZ8zo2NHECOEnhI/cR+zSyaxyLqYdEo4m6l7dGZQlwDucN9SIoLiq2LpSC0tXmPTDFsuOTXYjC2rzw==',key_name='tempest-keypair-2064130855',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3cd52d70d1a4be8ae891298ff7e1018',ramdisk_id='',reservation_id='r-w3qpedx4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1916815153',owner_user_name='tempest-ServersTestManualDisk-1916815153-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ff53b25ec85543eeb2bdea04a6eeaac4',uuid=1606eea3-5389-4437-b0f9-cfe6084d7871,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.845 187212 DEBUG nova.network.os_vif_util [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converting VIF {"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.846 187212 DEBUG nova.network.os_vif_util [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.848 187212 DEBUG nova.objects.instance [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1606eea3-5389-4437-b0f9-cfe6084d7871 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.852 187212 DEBUG nova.compute.manager [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-changed-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.852 187212 DEBUG nova.compute.manager [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Refreshing instance network info cache due to event network-changed-c72089e0-4937-40b6-86b5-f9d6d0982058. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.852 187212 DEBUG oslo_concurrency.lockutils [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.853 187212 DEBUG oslo_concurrency.lockutils [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.853 187212 DEBUG nova.network.neutron [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Refreshing network info cache for port c72089e0-4937-40b6-86b5-f9d6d0982058 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.868 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <uuid>1606eea3-5389-4437-b0f9-cfe6084d7871</uuid>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <name>instance-0000001a</name>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestManualDisk-server-306695219</nova:name>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:01:50</nova:creationTime>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:01:50 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:01:50 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:01:50 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:01:50 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:01:50 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:01:50 compute-0 nova_compute[187208]:         <nova:user uuid="ff53b25ec85543eeb2bdea04a6eeaac4">tempest-ServersTestManualDisk-1916815153-project-member</nova:user>
Dec 05 12:01:50 compute-0 nova_compute[187208]:         <nova:project uuid="e3cd52d70d1a4be8ae891298ff7e1018">tempest-ServersTestManualDisk-1916815153</nova:project>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:01:50 compute-0 nova_compute[187208]:         <nova:port uuid="c72089e0-4937-40b6-86b5-f9d6d0982058">
Dec 05 12:01:50 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <system>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <entry name="serial">1606eea3-5389-4437-b0f9-cfe6084d7871</entry>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <entry name="uuid">1606eea3-5389-4437-b0f9-cfe6084d7871</entry>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     </system>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <os>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   </os>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <features>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   </features>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.config"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:ea:73:d9"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <target dev="tapc72089e0-49"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/console.log" append="off"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <video>
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     </video>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:01:50 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:01:50 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:01:50 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:01:50 compute-0 nova_compute[187208]: </domain>
Dec 05 12:01:50 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.870 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Preparing to wait for external event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.870 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.870 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.870 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.871 187212 DEBUG nova.virt.libvirt.vif [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-306695219',display_name='tempest-ServersTestManualDisk-server-306695219',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-306695219',id=26,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBUGvAJW8rYe/hjaW/hFZe4neO1wzdrge/WiC/SnDk7t8/AXKetmZ8zo2NHECOEnhI/cR+zSyaxyLqYdEo4m6l7dGZQlwDucN9SIoLiq2LpSC0tXmPTDFsuOTXYjC2rzw==',key_name='tempest-keypair-2064130855',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3cd52d70d1a4be8ae891298ff7e1018',ramdisk_id='',reservation_id='r-w3qpedx4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1916815153',owner_user_name='tempest-ServersTestManualDisk-1916815153-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ff53b25ec85543eeb2bdea04a6eeaac4',uuid=1606eea3-5389-4437-b0f9-cfe6084d7871,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.872 187212 DEBUG nova.network.os_vif_util [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converting VIF {"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.872 187212 DEBUG nova.network.os_vif_util [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.873 187212 DEBUG os_vif [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.877 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.877 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.878 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.886 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.886 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc72089e0-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.887 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc72089e0-49, col_values=(('external_ids', {'iface-id': 'c72089e0-4937-40b6-86b5-f9d6d0982058', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:73:d9', 'vm-uuid': '1606eea3-5389-4437-b0f9-cfe6084d7871'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:50 compute-0 NetworkManager[55691]: <info>  [1764936110.8906] manager: (tapc72089e0-49): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.892 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.898 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.899 187212 INFO os_vif [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49')
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.904 187212 DEBUG nova.compute.manager [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.904 187212 DEBUG oslo_concurrency.lockutils [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.904 187212 DEBUG oslo_concurrency.lockutils [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.905 187212 DEBUG oslo_concurrency.lockutils [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.905 187212 DEBUG nova.compute.manager [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.905 187212 WARNING nova.compute.manager [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e for instance with vm_state active and task_state None.
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.967 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.968 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.969 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] No VIF found with MAC fa:16:3e:ea:73:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:01:50 compute-0 nova_compute[187208]: 2025-12-05 12:01:50.970 187212 INFO nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Using config drive
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.665 187212 INFO nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Creating config drive at /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.config
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.673 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppq3rnbg_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.705 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updated VIF entry in instance network info cache for port 0d74b914-0dbd-4356-8304-a42943811e2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.706 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.741 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.742 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.742 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.742 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 WARNING nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state rebuild_spawning.
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.744 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.744 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.744 187212 WARNING nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state rebuild_spawning.
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.806 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppq3rnbg_" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:01:51 compute-0 kernel: tapc72089e0-49: entered promiscuous mode
Dec 05 12:01:51 compute-0 NetworkManager[55691]: <info>  [1764936111.8715] manager: (tapc72089e0-49): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Dec 05 12:01:51 compute-0 ovn_controller[95610]: 2025-12-05T12:01:51Z|00136|binding|INFO|Claiming lport c72089e0-4937-40b6-86b5-f9d6d0982058 for this chassis.
Dec 05 12:01:51 compute-0 ovn_controller[95610]: 2025-12-05T12:01:51Z|00137|binding|INFO|c72089e0-4937-40b6-86b5-f9d6d0982058: Claiming fa:16:3e:ea:73:d9 10.100.0.11
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.875 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.894 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:73:d9 10.100.0.11'], port_security=['fa:16:3e:ea:73:d9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1606eea3-5389-4437-b0f9-cfe6084d7871', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3cd52d70d1a4be8ae891298ff7e1018', 'neutron:revision_number': '2', 'neutron:security_group_ids': '753f16cd-17e0-4f5a-8936-b01e8b5b8119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1ba8a60-bda5-4c97-91b2-1ae7ea8aa092, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c72089e0-4937-40b6-86b5-f9d6d0982058) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.895 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c72089e0-4937-40b6-86b5-f9d6d0982058 in datapath 904b3233-fdc6-4df0-b02a-f30a1e47627b bound to our chassis
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.900 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 904b3233-fdc6-4df0-b02a-f30a1e47627b
Dec 05 12:01:51 compute-0 systemd-udevd[217446]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.916 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b7921a3f-1195-4f40-9c02-49cc80573b16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.918 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap904b3233-f1 in ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.920 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap904b3233-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.920 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a60dfebf-19ee-418d-8cd9-856628b37e43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.921 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b0dede-c6a8-42c7-bd4a-4fc85e8eb800]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:51 compute-0 NetworkManager[55691]: <info>  [1764936111.9363] device (tapc72089e0-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:01:51 compute-0 NetworkManager[55691]: <info>  [1764936111.9380] device (tapc72089e0-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.938 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[43641ee1-cca4-40b6-9cea-4a956a5b60cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.946 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:51 compute-0 ovn_controller[95610]: 2025-12-05T12:01:51Z|00138|binding|INFO|Setting lport c72089e0-4937-40b6-86b5-f9d6d0982058 ovn-installed in OVS
Dec 05 12:01:51 compute-0 ovn_controller[95610]: 2025-12-05T12:01:51Z|00139|binding|INFO|Setting lport c72089e0-4937-40b6-86b5-f9d6d0982058 up in Southbound
Dec 05 12:01:51 compute-0 nova_compute[187208]: 2025-12-05 12:01:51.953 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:51 compute-0 systemd-machined[153543]: New machine qemu-30-instance-0000001a.
Dec 05 12:01:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.972 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43219dbe-4703-42e4-a897-bb84b5e90c17]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:51 compute-0 systemd[1]: Started Virtual Machine qemu-30-instance-0000001a.
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.005 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b8c83f-c0a1-4d0f-b849-e64b9885dedf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.012 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3718dcb-2bd1-4f25-b43d-af7e45569297]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 NetworkManager[55691]: <info>  [1764936112.0141] manager: (tap904b3233-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Dec 05 12:01:52 compute-0 systemd-udevd[217449]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.046 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[16adb1cb-1293-48d4-aa59-8058223810f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.049 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e319f1-bb39-4e7f-bbc5-070a4cc18f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 NetworkManager[55691]: <info>  [1764936112.0747] device (tap904b3233-f0): carrier: link connected
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.080 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4386b4fc-9866-4581-8f47-7ed0ba72c327]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.098 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bd98a545-ef0f-4648-b0f2-396ff7f797d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap904b3233-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:be:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349263, 'reachable_time': 42986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217481, 'error': None, 'target': 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.113 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62649561-7638-44f8-83e3-1e17e4f026fc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefa:be1f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349263, 'tstamp': 349263}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217482, 'error': None, 'target': 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.131 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c284a64-221e-441a-87eb-b1345d8cfe7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap904b3233-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:be:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349263, 'reachable_time': 42986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217483, 'error': None, 'target': 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.158 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[985c7a05-8e8f-4d92-ab44-e6286ce11c9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.226 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af935d5f-bf34-409d-9d4e-18bea0df1644]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.227 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap904b3233-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.228 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.228 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap904b3233-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.231 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:52 compute-0 kernel: tap904b3233-f0: entered promiscuous mode
Dec 05 12:01:52 compute-0 NetworkManager[55691]: <info>  [1764936112.2318] manager: (tap904b3233-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.234 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap904b3233-f0, col_values=(('external_ids', {'iface-id': '8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:01:52 compute-0 ovn_controller[95610]: 2025-12-05T12:01:52Z|00140|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.235 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.248 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.249 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/904b3233-fdc6-4df0-b02a-f30a1e47627b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/904b3233-fdc6-4df0-b02a-f30a1e47627b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.250 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e487b816-8c22-412f-8310-594e67ef8011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.251 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-904b3233-fdc6-4df0-b02a-f30a1e47627b
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/904b3233-fdc6-4df0-b02a-f30a1e47627b.pid.haproxy
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 904b3233-fdc6-4df0-b02a-f30a1e47627b
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:01:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.251 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'env', 'PROCESS_TAG=haproxy-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/904b3233-fdc6-4df0-b02a-f30a1e47627b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.388 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936112.387751, 1606eea3-5389-4437-b0f9-cfe6084d7871 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.388 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] VM Started (Lifecycle Event)
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.407 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.414 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936112.388031, 1606eea3-5389-4437-b0f9-cfe6084d7871 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.415 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] VM Paused (Lifecycle Event)
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.448 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.454 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.481 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:01:52 compute-0 nova_compute[187208]: 2025-12-05 12:01:52.500 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:52 compute-0 podman[217520]: 2025-12-05 12:01:52.662053751 +0000 UTC m=+0.059216072 container create 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:01:52 compute-0 systemd[1]: Started libpod-conmon-98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1.scope.
Dec 05 12:01:52 compute-0 podman[217520]: 2025-12-05 12:01:52.62559086 +0000 UTC m=+0.022753201 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:01:52 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:01:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f782e29042a4d6587770f14d2c44c7361a37482c8593077cf3a706cbf5d68ff/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:01:52 compute-0 podman[217520]: 2025-12-05 12:01:52.758042253 +0000 UTC m=+0.155204604 container init 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:01:52 compute-0 podman[217520]: 2025-12-05 12:01:52.763936402 +0000 UTC m=+0.161098723 container start 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:01:52 compute-0 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [NOTICE]   (217539) : New worker (217541) forked
Dec 05 12:01:52 compute-0 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [NOTICE]   (217539) : Loading success.
Dec 05 12:01:55 compute-0 nova_compute[187208]: 2025-12-05 12:01:55.003 187212 DEBUG nova.network.neutron [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updated VIF entry in instance network info cache for port c72089e0-4937-40b6-86b5-f9d6d0982058. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:01:55 compute-0 nova_compute[187208]: 2025-12-05 12:01:55.003 187212 DEBUG nova.network.neutron [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updating instance_info_cache with network_info: [{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:01:55 compute-0 nova_compute[187208]: 2025-12-05 12:01:55.037 187212 DEBUG oslo_concurrency.lockutils [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:01:55 compute-0 nova_compute[187208]: 2025-12-05 12:01:55.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:56 compute-0 podman[217550]: 2025-12-05 12:01:56.200939669 +0000 UTC m=+0.051109092 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:01:56 compute-0 podman[217551]: 2025-12-05 12:01:56.281305844 +0000 UTC m=+0.122315685 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 05 12:01:57 compute-0 nova_compute[187208]: 2025-12-05 12:01:57.504 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.198 187212 DEBUG nova.compute.manager [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.198 187212 DEBUG oslo_concurrency.lockutils [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.199 187212 DEBUG oslo_concurrency.lockutils [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.199 187212 DEBUG oslo_concurrency.lockutils [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.199 187212 DEBUG nova.compute.manager [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.199 187212 WARNING nova.compute.manager [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e for instance with vm_state active and task_state None.
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.302 187212 DEBUG nova.compute.manager [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.303 187212 DEBUG oslo_concurrency.lockutils [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.303 187212 DEBUG oslo_concurrency.lockutils [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.304 187212 DEBUG oslo_concurrency.lockutils [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.304 187212 DEBUG nova.compute.manager [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Processing event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.305 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance event wait completed in 18 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.316 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936118.3163302, 1282e776-5758-493b-8f52-59839ebcd31b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.317 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] VM Resumed (Lifecycle Event)
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.323 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.329 187212 INFO nova.virt.libvirt.driver [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance spawned successfully.
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.329 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.345 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.352 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.358 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.358 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.359 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.359 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.360 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.361 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.384 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.420 187212 INFO nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Took 33.13 seconds to spawn the instance on the hypervisor.
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.420 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.482 187212 INFO nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Took 33.63 seconds to build instance.
Dec 05 12:01:58 compute-0 nova_compute[187208]: 2025-12-05 12:01:58.503 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 33.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:01:59 compute-0 nova_compute[187208]: 2025-12-05 12:01:59.587 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:59 compute-0 nova_compute[187208]: 2025-12-05 12:01:59.588 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:59 compute-0 nova_compute[187208]: 2025-12-05 12:01:59.609 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:01:59 compute-0 nova_compute[187208]: 2025-12-05 12:01:59.686 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:01:59 compute-0 nova_compute[187208]: 2025-12-05 12:01:59.687 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:01:59 compute-0 nova_compute[187208]: 2025-12-05 12:01:59.694 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:01:59 compute-0 nova_compute[187208]: 2025-12-05 12:01:59.694 187212 INFO nova.compute.claims [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:01:59 compute-0 nova_compute[187208]: 2025-12-05 12:01:59.993 187212 DEBUG nova.compute.provider_tree [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.010 187212 DEBUG nova.scheduler.client.report [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.031 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.031 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.124 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.125 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.153 187212 INFO nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.174 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:02:00 compute-0 podman[217598]: 2025-12-05 12:02:00.211981214 +0000 UTC m=+0.057873815 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.706 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.707 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.707 187212 INFO nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Creating image(s)
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.707 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.708 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.708 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.729 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.798 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.801 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.801 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.818 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.865 187212 DEBUG nova.policy [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.884 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.886 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:00 compute-0 nova_compute[187208]: 2025-12-05 12:02:00.906 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.127 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.128 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.128 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.128 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.128 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.130 187212 INFO nova.compute.manager [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Terminating instance
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.131 187212 DEBUG nova.compute.manager [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.135 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk 1073741824" returned: 0 in 0.249s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.135 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.136 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:01 compute-0 kernel: tap02d6eab5-45 (unregistering): left promiscuous mode
Dec 05 12:02:01 compute-0 NetworkManager[55691]: <info>  [1764936121.1804] device (tap02d6eab5-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.190 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 ovn_controller[95610]: 2025-12-05T12:02:01Z|00141|memory|INFO|peak resident set size grew 50% in last 985.0 seconds, from 16000 kB to 24064 kB
Dec 05 12:02:01 compute-0 ovn_controller[95610]: 2025-12-05T12:02:01Z|00142|memory|INFO|idl-cells-OVN_Southbound:13154 idl-cells-Open_vSwitch:1269 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:2 lflow-cache-entries-cache-expr:359 lflow-cache-entries-cache-matches:275 lflow-cache-size-KB:1528 local_datapath_usage-KB:2 ofctrl_desired_flow_usage-KB:563 ofctrl_installed_flow_usage-KB:410 ofctrl_sb_flow_ref_usage-KB:211
Dec 05 12:02:01 compute-0 ovn_controller[95610]: 2025-12-05T12:02:01Z|00143|binding|INFO|Releasing lport 02d6eab5-4561-4d9f-ad9a-169b57667224 from this chassis (sb_readonly=0)
Dec 05 12:02:01 compute-0 ovn_controller[95610]: 2025-12-05T12:02:01Z|00144|binding|INFO|Setting lport 02d6eab5-4561-4d9f-ad9a-169b57667224 down in Southbound
Dec 05 12:02:01 compute-0 ovn_controller[95610]: 2025-12-05T12:02:01Z|00145|binding|INFO|Removing iface tap02d6eab5-45 ovn-installed in OVS
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.194 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.202 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:b7:ec 10.100.0.5'], port_security=['fa:16:3e:d4:b7:ec 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-423f0bba-22e2-4219-9338-a671dbe69e42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c184f0f2b71412fb560981314d0574d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8510d8eb-f367-43d1-be5f-8be0c3ab7e61', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eacee27-dbb3-4c60-a47d-c1f874faea06, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=02d6eab5-4561-4d9f-ad9a-169b57667224) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.204 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 02d6eab5-4561-4d9f-ad9a-169b57667224 in datapath 423f0bba-22e2-4219-9338-a671dbe69e42 unbound from our chassis
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.207 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 423f0bba-22e2-4219-9338-a671dbe69e42
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.210 187212 DEBUG nova.compute.manager [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.211 187212 DEBUG oslo_concurrency.lockutils [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.211 187212 DEBUG oslo_concurrency.lockutils [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.211 187212 DEBUG oslo_concurrency.lockutils [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.211 187212 DEBUG nova.compute.manager [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] No waiting events found dispatching network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.212 187212 WARNING nova.compute.manager [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received unexpected event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d for instance with vm_state active and task_state None.
Dec 05 12:02:01 compute-0 kernel: tap0d74b914-0d (unregistering): left promiscuous mode
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.212 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 NetworkManager[55691]: <info>  [1764936121.2165] device (tap0d74b914-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:01 compute-0 ovn_controller[95610]: 2025-12-05T12:02:01Z|00146|binding|INFO|Releasing lport 0d74b914-0dbd-4356-8304-a42943811e2e from this chassis (sb_readonly=0)
Dec 05 12:02:01 compute-0 ovn_controller[95610]: 2025-12-05T12:02:01Z|00147|binding|INFO|Setting lport 0d74b914-0dbd-4356-8304-a42943811e2e down in Southbound
Dec 05 12:02:01 compute-0 ovn_controller[95610]: 2025-12-05T12:02:01Z|00148|binding|INFO|Removing iface tap0d74b914-0d ovn-installed in OVS
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.225 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.226 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.227 187212 DEBUG nova.virt.disk.api [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.227 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.234 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:e2:69 10.100.0.10'], port_security=['fa:16:3e:a5:e2:69 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-423f0bba-22e2-4219-9338-a671dbe69e42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c184f0f2b71412fb560981314d0574d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8510d8eb-f367-43d1-be5f-8be0c3ab7e61', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eacee27-dbb3-4c60-a47d-c1f874faea06, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0d74b914-0dbd-4356-8304-a42943811e2e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.246 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54f70b11-5508-4fee-b54f-0b575c943b75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.248 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000018.scope: Deactivated successfully.
Dec 05 12:02:01 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000018.scope: Consumed 13.107s CPU time.
Dec 05 12:02:01 compute-0 systemd-machined[153543]: Machine qemu-27-instance-00000018 terminated.
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.288 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5ff3e4-3c1e-4dee-bf6f-7cab6b527aba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.292 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.292 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f55e60f2-c537-4c6e-a1b6-733ac8bc2bee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.292 187212 DEBUG nova.virt.disk.api [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.293 187212 DEBUG nova.objects.instance [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid 795a269a-5af9-4e6a-bf1f-e2bb83634855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.308 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.309 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Ensure instance console log exists: /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.309 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.310 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.310 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.321 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[50a59568-b576-4592-bd21-7ae8ed94b570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.340 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[50f1699e-5b96-49d6-963d-533a87cde544]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap423f0bba-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:51:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347615, 'reachable_time': 21997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217670, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.357 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 NetworkManager[55691]: <info>  [1764936121.3657] manager: (tap0d74b914-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.365 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dbffeae6-9b11-4b9a-88b5-ff812fe6f18f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap423f0bba-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347626, 'tstamp': 347626}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217673, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap423f0bba-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347629, 'tstamp': 347629}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217673, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.368 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap423f0bba-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.370 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.380 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.395 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.396 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap423f0bba-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.396 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.396 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap423f0bba-20, col_values=(('external_ids', {'iface-id': '8801ec73-6ce8-4039-ab6c-4693dcbc877e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.397 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.399 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0d74b914-0dbd-4356-8304-a42943811e2e in datapath 423f0bba-22e2-4219-9338-a671dbe69e42 unbound from our chassis
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.401 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 423f0bba-22e2-4219-9338-a671dbe69e42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.405 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[279203da-9479-4f09-82eb-24fecb833e2b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.407 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 namespace which is not needed anymore
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.423 187212 INFO nova.virt.libvirt.driver [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance destroyed successfully.
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.423 187212 DEBUG nova.objects.instance [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'resources' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.436 187212 DEBUG nova.virt.libvirt.vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:35Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.437 187212 DEBUG nova.network.os_vif_util [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.438 187212 DEBUG nova.network.os_vif_util [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.438 187212 DEBUG os_vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.441 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.441 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02d6eab5-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.443 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.448 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.451 187212 INFO os_vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45')
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.454 187212 DEBUG nova.virt.libvirt.vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:35Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.455 187212 DEBUG nova.network.os_vif_util [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.455 187212 DEBUG nova.network.os_vif_util [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.456 187212 DEBUG os_vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.457 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.457 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d74b914-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.459 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.464 187212 INFO os_vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d')
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.464 187212 INFO nova.virt.libvirt.driver [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Deleting instance files /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31_del
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.465 187212 INFO nova.virt.libvirt.driver [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Deletion of /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31_del complete
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.516 187212 INFO nova.compute.manager [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.516 187212 DEBUG oslo.service.loopingcall [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.517 187212 DEBUG nova.compute.manager [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.517 187212 DEBUG nova.network.neutron [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:01 compute-0 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [NOTICE]   (217003) : haproxy version is 2.8.14-c23fe91
Dec 05 12:02:01 compute-0 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [NOTICE]   (217003) : path to executable is /usr/sbin/haproxy
Dec 05 12:02:01 compute-0 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [WARNING]  (217003) : Exiting Master process...
Dec 05 12:02:01 compute-0 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [WARNING]  (217003) : Exiting Master process...
Dec 05 12:02:01 compute-0 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [ALERT]    (217003) : Current worker (217006) exited with code 143 (Terminated)
Dec 05 12:02:01 compute-0 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [WARNING]  (217003) : All workers exited. Exiting... (0)
Dec 05 12:02:01 compute-0 systemd[1]: libpod-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8.scope: Deactivated successfully.
Dec 05 12:02:01 compute-0 conmon[216985]: conmon 719877fc51b88939aab7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8.scope/container/memory.events
Dec 05 12:02:01 compute-0 podman[217712]: 2025-12-05 12:02:01.548802863 +0000 UTC m=+0.044955846 container died 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:02:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8-userdata-shm.mount: Deactivated successfully.
Dec 05 12:02:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-5c6b011a5a4020501a80db3fcb573c57c4bdcfb8d5de9a077c6de3d75c9302b5-merged.mount: Deactivated successfully.
Dec 05 12:02:01 compute-0 podman[217712]: 2025-12-05 12:02:01.5878937 +0000 UTC m=+0.084046703 container cleanup 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:02:01 compute-0 systemd[1]: libpod-conmon-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8.scope: Deactivated successfully.
Dec 05 12:02:01 compute-0 podman[217737]: 2025-12-05 12:02:01.660487623 +0000 UTC m=+0.050717150 container remove 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.672 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[575cb5f6-d59b-4035-80c4-c07c529d9f59]: (4, ('Fri Dec  5 12:02:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 (719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8)\n719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8\nFri Dec  5 12:02:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 (719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8)\n719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.674 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[87402e80-0338-487c-bea3-4dc740afd971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.675 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap423f0bba-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:01 compute-0 kernel: tap423f0bba-20: left promiscuous mode
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.719 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.729 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.732 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9846cd4e-693b-4049-b163-70bb0351ba6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.744 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a3116a-284b-445c-8d91-d270642ebf71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.745 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5bfb812d-1899-48f8-967c-4c0d49c80017]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.759 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b07cd0ba-f06e-4d2f-b4f1-b816a193b7b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347608, 'reachable_time': 27761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217750, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.761 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:02:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.761 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[33d6e22a-cc43-4dee-86ac-c0f8cf251e34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d423f0bba\x2d22e2\x2d4219\x2d9338\x2da671dbe69e42.mount: Deactivated successfully.
Dec 05 12:02:01 compute-0 nova_compute[187208]: 2025-12-05 12:02:01.972 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Successfully created port: f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:02:02 compute-0 nova_compute[187208]: 2025-12-05 12:02:02.508 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:02 compute-0 ovn_controller[95610]: 2025-12-05T12:02:02Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:4f:38 10.100.0.13
Dec 05 12:02:02 compute-0 ovn_controller[95610]: 2025-12-05T12:02:02Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:4f:38 10.100.0.13
Dec 05 12:02:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.009 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.009 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.010 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.143 187212 DEBUG nova.network.neutron [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.161 187212 INFO nova.compute.manager [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Took 1.64 seconds to deallocate network for instance.
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.203 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Successfully updated port: f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.207 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.207 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.217 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.218 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.218 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.367 187212 DEBUG nova.compute.provider_tree [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.383 187212 DEBUG nova.scheduler.client.report [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.407 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.440 187212 INFO nova.scheduler.client.report [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Deleted allocations for instance 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.487 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.493 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.547 187212 DEBUG nova.compute.manager [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-unplugged-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.548 187212 DEBUG oslo_concurrency.lockutils [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.548 187212 DEBUG oslo_concurrency.lockutils [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.548 187212 DEBUG oslo_concurrency.lockutils [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.548 187212 DEBUG nova.compute.manager [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-unplugged-02d6eab5-4561-4d9f-ad9a-169b57667224 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.549 187212 WARNING nova.compute.manager [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-unplugged-02d6eab5-4561-4d9f-ad9a-169b57667224 for instance with vm_state deleted and task_state None.
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.727 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.727 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.728 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.728 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.728 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.729 187212 INFO nova.compute.manager [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Terminating instance
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.730 187212 DEBUG nova.compute.manager [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:03 compute-0 kernel: tap9bb4b8ce-57 (unregistering): left promiscuous mode
Dec 05 12:02:03 compute-0 NetworkManager[55691]: <info>  [1764936123.7490] device (tap9bb4b8ce-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:03 compute-0 ovn_controller[95610]: 2025-12-05T12:02:03Z|00149|binding|INFO|Releasing lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d from this chassis (sb_readonly=0)
Dec 05 12:02:03 compute-0 ovn_controller[95610]: 2025-12-05T12:02:03Z|00150|binding|INFO|Setting lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d down in Southbound
Dec 05 12:02:03 compute-0 ovn_controller[95610]: 2025-12-05T12:02:03Z|00151|binding|INFO|Removing iface tap9bb4b8ce-57 ovn-installed in OVS
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.753 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.758 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:f0:53 10.100.0.14'], port_security=['fa:16:3e:06:f0:53 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1282e776-5758-493b-8f52-59839ebcd31b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455bb7e1-6680-472e-861f-da50aef09a7f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8400e354e93c4b33b8d683012dfe5c94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9474b356-5c55-44a1-af48-0eeaf9a9ad0b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07a9feeb-8467-4a6f-b0e2-fda2f133d3ac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.759 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d in datapath 455bb7e1-6680-472e-861f-da50aef09a7f unbound from our chassis
Dec 05 12:02:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.762 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 455bb7e1-6680-472e-861f-da50aef09a7f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:02:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.762 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a69778-8747-484a-ae23-e8552908e5dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.763 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f namespace which is not needed anymore
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:03 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000017.scope: Deactivated successfully.
Dec 05 12:02:03 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000017.scope: Consumed 6.174s CPU time.
Dec 05 12:02:03 compute-0 systemd-machined[153543]: Machine qemu-28-instance-00000017 terminated.
Dec 05 12:02:03 compute-0 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [NOTICE]   (217165) : haproxy version is 2.8.14-c23fe91
Dec 05 12:02:03 compute-0 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [NOTICE]   (217165) : path to executable is /usr/sbin/haproxy
Dec 05 12:02:03 compute-0 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [WARNING]  (217165) : Exiting Master process...
Dec 05 12:02:03 compute-0 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [ALERT]    (217165) : Current worker (217168) exited with code 143 (Terminated)
Dec 05 12:02:03 compute-0 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [WARNING]  (217165) : All workers exited. Exiting... (0)
Dec 05 12:02:03 compute-0 systemd[1]: libpod-62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b.scope: Deactivated successfully.
Dec 05 12:02:03 compute-0 podman[217773]: 2025-12-05 12:02:03.887582796 +0000 UTC m=+0.042516395 container died 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:02:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b-userdata-shm.mount: Deactivated successfully.
Dec 05 12:02:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-4149b47c4e6793ce3b6fc5ffdd499aa39b6bd1d4bb7bbc0659f951080559deea-merged.mount: Deactivated successfully.
Dec 05 12:02:03 compute-0 podman[217773]: 2025-12-05 12:02:03.92061062 +0000 UTC m=+0.075544219 container cleanup 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:02:03 compute-0 systemd[1]: libpod-conmon-62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b.scope: Deactivated successfully.
Dec 05 12:02:03 compute-0 podman[217801]: 2025-12-05 12:02:03.991296429 +0000 UTC m=+0.050219125 container remove 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.995 187212 INFO nova.virt.libvirt.driver [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance destroyed successfully.
Dec 05 12:02:03 compute-0 nova_compute[187208]: 2025-12-05 12:02:03.996 187212 DEBUG nova.objects.instance [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lazy-loading 'resources' on Instance uuid 1282e776-5758-493b-8f52-59839ebcd31b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.998 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3b37aedc-58bc-45c5-9a79-60348a8be8f1]: (4, ('Fri Dec  5 12:02:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f (62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b)\n62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b\nFri Dec  5 12:02:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f (62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b)\n62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.000 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0b93f904-a312-4f29-b7e8-c3590970774f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.000 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455bb7e1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.013 187212 DEBUG nova.virt.libvirt.vif [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1341604448',display_name='tempest-ImagesNegativeTestJSON-server-1341604448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1341604448',id=23,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8400e354e93c4b33b8d683012dfe5c94',ramdisk_id='',reservation_id='r-7yvjshh1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-965315619',owner_user_name='tempest-ImagesNegativeTestJSON-965315619-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:58Z,user_data=None,user_id='496da6872d53413ea1c201178cf5b05c',uuid=1282e776-5758-493b-8f52-59839ebcd31b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.013 187212 DEBUG nova.network.os_vif_util [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converting VIF {"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.014 187212 DEBUG nova.network.os_vif_util [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.014 187212 DEBUG os_vif [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.016 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bb4b8ce-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:04 compute-0 kernel: tap455bb7e1-60: left promiscuous mode
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.017 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.020 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.023 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.027 187212 INFO os_vif [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57')
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.028 187212 INFO nova.virt.libvirt.driver [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Deleting instance files /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b_del
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.028 187212 INFO nova.virt.libvirt.driver [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Deletion of /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b_del complete
Dec 05 12:02:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.029 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b201cd-b475-46c5-9d94-b6f4bfb56e8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.046 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[14eaf412-4fb8-446f-b53b-4a06c770bf7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.048 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a9961fc1-39cf-496b-b269-21249f88224a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.064 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c74d7cc5-1b1b-417d-80f4-8c6360141032]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347938, 'reachable_time': 40453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217832, 'error': None, 'target': 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.066 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:02:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.066 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[e87673e4-29f6-4ba7-8413-3335c80c3937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:04 compute-0 systemd[1]: run-netns-ovnmeta\x2d455bb7e1\x2d6680\x2d472e\x2d861f\x2dda50aef09a7f.mount: Deactivated successfully.
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.092 187212 INFO nova.compute.manager [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.092 187212 DEBUG oslo.service.loopingcall [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.093 187212 DEBUG nova.compute.manager [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.093 187212 DEBUG nova.network.neutron [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.760 187212 DEBUG nova.compute.manager [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-changed-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.761 187212 DEBUG nova.compute.manager [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Refreshing instance network info cache due to event network-changed-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.761 187212 DEBUG oslo_concurrency.lockutils [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.918 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Updating instance_info_cache with network_info: [{"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.951 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.951 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance network_info: |[{"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.951 187212 DEBUG oslo_concurrency.lockutils [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.952 187212 DEBUG nova.network.neutron [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Refreshing network info cache for port f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.955 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start _get_guest_xml network_info=[{"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.959 187212 WARNING nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.963 187212 DEBUG nova.virt.libvirt.host [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.963 187212 DEBUG nova.virt.libvirt.host [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.966 187212 DEBUG nova.virt.libvirt.host [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.966 187212 DEBUG nova.virt.libvirt.host [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.967 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.967 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.968 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.968 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.970 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.970 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.970 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.975 187212 DEBUG nova.virt.libvirt.vif [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1602902542',display_name='tempest-ImagesTestJSON-server-1602902542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1602902542',id=27,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-5xs2qi13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:00Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=795a269a-5af9-4e6a-bf1f-e2bb83634855,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.975 187212 DEBUG nova.network.os_vif_util [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.976 187212 DEBUG nova.network.os_vif_util [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.976 187212 DEBUG nova.objects.instance [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid 795a269a-5af9-4e6a-bf1f-e2bb83634855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.996 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <uuid>795a269a-5af9-4e6a-bf1f-e2bb83634855</uuid>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <name>instance-0000001b</name>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesTestJSON-server-1602902542</nova:name>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:02:04</nova:creationTime>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:02:04 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:02:04 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:02:04 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:02:04 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:02:04 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:02:04 compute-0 nova_compute[187208]:         <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec 05 12:02:04 compute-0 nova_compute[187208]:         <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:02:04 compute-0 nova_compute[187208]:         <nova:port uuid="f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa">
Dec 05 12:02:04 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <system>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <entry name="serial">795a269a-5af9-4e6a-bf1f-e2bb83634855</entry>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <entry name="uuid">795a269a-5af9-4e6a-bf1f-e2bb83634855</entry>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     </system>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <os>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   </os>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <features>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   </features>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.config"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:5f:9d:39"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <target dev="tapf6fc1ec5-ea"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/console.log" append="off"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <video>
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     </video>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:02:04 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:02:04 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:02:04 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:02:04 compute-0 nova_compute[187208]: </domain>
Dec 05 12:02:04 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.997 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Preparing to wait for external event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.997 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.997 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.997 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.998 187212 DEBUG nova.virt.libvirt.vif [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1602902542',display_name='tempest-ImagesTestJSON-server-1602902542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1602902542',id=27,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-5xs2qi13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:00Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=795a269a-5af9-4e6a-bf1f-e2bb83634855,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.998 187212 DEBUG nova.network.os_vif_util [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.999 187212 DEBUG nova.network.os_vif_util [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:04 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.999 187212 DEBUG os_vif [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:04.999 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.000 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.000 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.003 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6fc1ec5-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.003 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf6fc1ec5-ea, col_values=(('external_ids', {'iface-id': 'f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:9d:39', 'vm-uuid': '795a269a-5af9-4e6a-bf1f-e2bb83634855'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.005 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:05 compute-0 NetworkManager[55691]: <info>  [1764936125.0069] manager: (tapf6fc1ec5-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.007 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.009 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.010 187212 INFO os_vif [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea')
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.064 187212 DEBUG nova.network.neutron [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.067 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.068 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.068 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:5f:9d:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.069 187212 INFO nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Using config drive
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.080 187212 INFO nova.compute.manager [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Took 0.99 seconds to deallocate network for instance.
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.128 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.128 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.326 187212 DEBUG nova.compute.provider_tree [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.350 187212 DEBUG nova.scheduler.client.report [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.393 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.428 187212 INFO nova.scheduler.client.report [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Deleted allocations for instance 1282e776-5758-493b-8f52-59839ebcd31b
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.485 187212 INFO nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Creating config drive at /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.config
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.490 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz5h6rcr6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.523 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.619 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz5h6rcr6" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:05 compute-0 kernel: tapf6fc1ec5-ea: entered promiscuous mode
Dec 05 12:02:05 compute-0 ovn_controller[95610]: 2025-12-05T12:02:05Z|00152|binding|INFO|Claiming lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa for this chassis.
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.669 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:05 compute-0 ovn_controller[95610]: 2025-12-05T12:02:05Z|00153|binding|INFO|f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa: Claiming fa:16:3e:5f:9d:39 10.100.0.12
Dec 05 12:02:05 compute-0 NetworkManager[55691]: <info>  [1764936125.6703] manager: (tapf6fc1ec5-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.681 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:9d:39 10.100.0.12'], port_security=['fa:16:3e:5f:9d:39 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '795a269a-5af9-4e6a-bf1f-e2bb83634855', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.682 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.686 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.696 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5db2eb61-a2a5-4ff2-858b-fa1e9cdb653f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.697 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.699 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.699 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[befe14be-fc4e-4392-bb99-ec1e067fbbd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 systemd-udevd[217850]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.700 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9cdbd98a-e977-4591-b4d7-dabbe74f3a6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.711 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[bbce06b2-50dd-46cc-90b8-3e3823f14823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 NetworkManager[55691]: <info>  [1764936125.7139] device (tapf6fc1ec5-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:02:05 compute-0 NetworkManager[55691]: <info>  [1764936125.7150] device (tapf6fc1ec5-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:02:05 compute-0 systemd-machined[153543]: New machine qemu-31-instance-0000001b.
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.745 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f496c5b-03df-4ccd-a0b3-435b84b93e6f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 systemd[1]: Started Virtual Machine qemu-31-instance-0000001b.
Dec 05 12:02:05 compute-0 ovn_controller[95610]: 2025-12-05T12:02:05Z|00154|binding|INFO|Setting lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa ovn-installed in OVS
Dec 05 12:02:05 compute-0 ovn_controller[95610]: 2025-12-05T12:02:05Z|00155|binding|INFO|Setting lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa up in Southbound
Dec 05 12:02:05 compute-0 nova_compute[187208]: 2025-12-05 12:02:05.753 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.779 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[940c4343-60bc-46dc-b82d-fd06fb546902]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 NetworkManager[55691]: <info>  [1764936125.7848] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.784 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6f5249-65ab-4c9c-9bd0-689454f2b320]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.820 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[832bc666-fd1a-4eed-ad6a-ee46ae4bd158]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.824 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9997b021-a840-4646-ad02-8a4db6cfc6b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 NetworkManager[55691]: <info>  [1764936125.8538] device (tap41b3b495-c0): carrier: link connected
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.860 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b8fe456d-9b33-4f3b-b2fc-ca588d1a2839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.879 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3be7b72-a1bd-4f36-b0be-601e815d6836]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350641, 'reachable_time': 41239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217885, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.894 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6de8f94f-fd46-4511-957a-a2be46051bdf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 350641, 'tstamp': 350641}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217886, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.914 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4e82f9-3aa1-4570-b1d9-79c0def44f20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350641, 'reachable_time': 41239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217887, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.957 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[955c4df0-1ef8-477d-8dd9-cc910936dd29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.017 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9addf6a9-7ff2-4a6d-8dfe-f051b9db6452]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.019 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.019 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.019 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:06 compute-0 kernel: tap41b3b495-c0: entered promiscuous mode
Dec 05 12:02:06 compute-0 NetworkManager[55691]: <info>  [1764936126.0238] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.025 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:06 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00156|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.028 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.029 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84d23969-5818-4ee9-9f39-ed02e2e99fd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.030 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.032 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.039 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.100 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936126.0987484, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.100 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Started (Lifecycle Event)
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.121 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.125 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936126.100037, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.126 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Paused (Lifecycle Event)
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.143 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.151 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.174 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:06 compute-0 podman[217926]: 2025-12-05 12:02:06.399425683 +0000 UTC m=+0.045103970 container create 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:02:06 compute-0 systemd[1]: Started libpod-conmon-946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3.scope.
Dec 05 12:02:06 compute-0 podman[217926]: 2025-12-05 12:02:06.375558101 +0000 UTC m=+0.021236408 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:02:06 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:02:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55714bde8320c7ed878ac9913f0bc25c165a8c6293c219aeaa49515054bdec12/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:02:06 compute-0 podman[217926]: 2025-12-05 12:02:06.504090873 +0000 UTC m=+0.149769180 container init 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:02:06 compute-0 podman[217926]: 2025-12-05 12:02:06.511080833 +0000 UTC m=+0.156759120 container start 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:02:06 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [NOTICE]   (217946) : New worker (217948) forked
Dec 05 12:02:06 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [NOTICE]   (217946) : Loading success.
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.736 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.737 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.738 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.738 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.739 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.740 187212 INFO nova.compute.manager [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Terminating instance
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.741 187212 DEBUG nova.compute.manager [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:06 compute-0 kernel: tap47612a1a-e4 (unregistering): left promiscuous mode
Dec 05 12:02:06 compute-0 NetworkManager[55691]: <info>  [1764936126.7773] device (tap47612a1a-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:06 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00157|binding|INFO|Releasing lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e from this chassis (sb_readonly=0)
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.787 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:06 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00158|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e down in Southbound
Dec 05 12:02:06 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00159|binding|INFO|Removing iface tap47612a1a-e4 ovn-installed in OVS
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.790 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.798 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e2:12 10.100.0.10'], port_security=['fa:16:3e:45:e2:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=47612a1a-e470-434b-927c-8fcd6c2fbe4e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.800 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.800 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 47612a1a-e470-434b-927c-8fcd6c2fbe4e in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.804 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.817 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e359f209-53bb-4263-99d5-4084125dc106]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:06 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000013.scope: Deactivated successfully.
Dec 05 12:02:06 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000013.scope: Consumed 14.316s CPU time.
Dec 05 12:02:06 compute-0 systemd-machined[153543]: Machine qemu-19-instance-00000013 terminated.
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.843 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3adc6c6e-062e-451f-97e1-1701dfdb1912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.845 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7900d64e-32e1-461e-869a-b2769e8cff9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:06 compute-0 podman[217958]: 2025-12-05 12:02:06.865894479 +0000 UTC m=+0.057417101 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.880 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8967182b-6b1a-464b-b7e7-4a596e691a63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.898 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[932b0107-b235-4d9d-809b-f721e0b79ee7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 19, 'rx_bytes': 868, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 19, 'rx_bytes': 868, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217987, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.914 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1819c85a-6957-4d19-8c07-fde69cc01d04]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217988, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217988, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.917 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.925 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.926 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.926 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.927 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.927 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:06 compute-0 kernel: tap47612a1a-e4: entered promiscuous mode
Dec 05 12:02:06 compute-0 kernel: tap47612a1a-e4 (unregistering): left promiscuous mode
Dec 05 12:02:06 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00160|binding|INFO|Claiming lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e for this chassis.
Dec 05 12:02:06 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.974 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:06 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00161|binding|INFO|47612a1a-e470-434b-927c-8fcd6c2fbe4e: Claiming fa:16:3e:45:e2:12 10.100.0.10
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.992 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e2:12 10.100.0.10'], port_security=['fa:16:3e:45:e2:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=47612a1a-e470-434b-927c-8fcd6c2fbe4e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.993 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 47612a1a-e470-434b-927c-8fcd6c2fbe4e in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis
Dec 05 12:02:06 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00162|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e ovn-installed in OVS
Dec 05 12:02:06 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00163|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e up in Southbound
Dec 05 12:02:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.996 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:06.997 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00164|binding|INFO|Releasing lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e from this chassis (sb_readonly=1)
Dec 05 12:02:07 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00165|binding|INFO|Removing iface tap47612a1a-e4 ovn-installed in OVS
Dec 05 12:02:07 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00166|if_status|INFO|Not setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e down as sb is readonly
Dec 05 12:02:07 compute-0 ovn_controller[95610]: 2025-12-05T12:02:06Z|00167|binding|INFO|Releasing lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e from this chassis (sb_readonly=0)
Dec 05 12:02:07 compute-0 ovn_controller[95610]: 2025-12-05T12:02:07Z|00168|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e down in Southbound
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.001 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.010 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e2:12 10.100.0.10'], port_security=['fa:16:3e:45:e2:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=47612a1a-e470-434b-927c-8fcd6c2fbe4e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.011 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.012 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4ad23d-aa2a-4e83-973b-a6626cf3fe94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.016 187212 INFO nova.virt.libvirt.driver [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance destroyed successfully.
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.017 187212 DEBUG nova.objects.instance [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid d95c0324-d1d3-4960-9ab7-3a2a098a9f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.029 187212 DEBUG nova.virt.libvirt.vif [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:00:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1974624987',display_name='tempest-ServersAdminTestJSON-server-1974624987',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1974624987',id=19,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-wb0rav5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:49Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=d95c0324-d1d3-4960-9ab7-3a2a098a9f7c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.030 187212 DEBUG nova.network.os_vif_util [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.031 187212 DEBUG nova.network.os_vif_util [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.031 187212 DEBUG os_vif [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.033 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.033 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47612a1a-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.035 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.036 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.038 187212 INFO os_vif [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4')
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.038 187212 INFO nova.virt.libvirt.driver [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Deleting instance files /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c_del
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.039 187212 INFO nova.virt.libvirt.driver [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Deletion of /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c_del complete
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.041 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[35c39737-4674-4c68-9b56-15e74547af12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.043 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[148a7f0b-53fa-4ac0-b16c-5e86b2833247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[47f890c4-6f8d-4e62-b357-ae3d8565273f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.082 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[838fc71a-66f5-4b5c-8379-c6a59a7f0c48]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 21, 'rx_bytes': 868, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 21, 'rx_bytes': 868, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218011, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.088 187212 INFO nova.compute.manager [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.088 187212 DEBUG oslo.service.loopingcall [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.088 187212 DEBUG nova.compute.manager [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.089 187212 DEBUG nova.network.neutron [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.099 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35d79243-1fd6-4782-913a-f34079f91440]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218012, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218012, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.101 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.103 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.105 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.106 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.106 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.106 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.108 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 47612a1a-e470-434b-927c-8fcd6c2fbe4e in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.110 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.126 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc20ca5-11b6-4a4d-a0ce-5542541c965a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.155 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bb85ea67-681a-46f2-a721-3beccb6d8738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.157 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3ba55b-9b66-4d7a-acaa-2b0a2ace2486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.189 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea8872d-1631-48d1-9bbb-a77f1f053818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.208 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bec57974-7caa-4847-954a-0d69b7536807]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 23, 'rx_bytes': 868, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 23, 'rx_bytes': 868, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218018, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.225 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8f28524e-9d87-436e-9011-c39be756601f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218019, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218019, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.226 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.228 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.229 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.230 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.230 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.230 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.230 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.357 187212 DEBUG nova.network.neutron [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Updated VIF entry in instance network info cache for port f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.358 187212 DEBUG nova.network.neutron [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Updating instance_info_cache with network_info: [{"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.388 187212 DEBUG oslo_concurrency.lockutils [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.482 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.483 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.483 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.483 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 WARNING nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 for instance with vm_state deleted and task_state None.
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-unplugged-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.485 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.485 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-unplugged-0d74b914-0dbd-4356-8304-a42943811e2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.485 187212 WARNING nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-unplugged-0d74b914-0dbd-4356-8304-a42943811e2e for instance with vm_state deleted and task_state None.
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.485 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 WARNING nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e for instance with vm_state deleted and task_state None.
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.487 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-deleted-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.487 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-deleted-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.487 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-deleted-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.510 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.609 187212 DEBUG nova.network.neutron [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.626 187212 INFO nova.compute.manager [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Took 0.54 seconds to deallocate network for instance.
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.673 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.673 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.837 187212 DEBUG nova.compute.provider_tree [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.854 187212 DEBUG nova.scheduler.client.report [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.873 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.903 187212 INFO nova.scheduler.client.report [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Deleted allocations for instance d95c0324-d1d3-4960-9ab7-3a2a098a9f7c
Dec 05 12:02:07 compute-0 nova_compute[187208]: 2025-12-05 12:02:07.959 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.102 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.102 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.102 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.103 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.103 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Processing event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.103 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.103 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.104 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.104 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.104 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] No waiting events found dispatching network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.104 187212 WARNING nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received unexpected event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 for instance with vm_state building and task_state spawning.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.105 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.105 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.105 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.105 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.106 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Processing event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.106 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.106 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.106 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.107 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.107 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] No waiting events found dispatching network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.107 187212 WARNING nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received unexpected event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 for instance with vm_state building and task_state spawning.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.107 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-unplugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.108 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.108 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.108 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.108 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] No waiting events found dispatching network-vif-unplugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.109 187212 WARNING nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received unexpected event network-vif-unplugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d for instance with vm_state deleted and task_state None.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.109 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.109 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.109 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.110 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.110 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] No waiting events found dispatching network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.110 187212 WARNING nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received unexpected event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d for instance with vm_state deleted and task_state None.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.110 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.111 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.111 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.111 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.111 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Processing event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.112 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance event wait completed in 21 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.113 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance event wait completed in 15 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.114 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.119 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.121 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936128.1202078, adc15883-b705-42dd-ac95-04f4b8964012 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.121 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] VM Resumed (Lifecycle Event)
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.126 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.129 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.134 187212 INFO nova.virt.libvirt.driver [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance spawned successfully.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.135 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.137 187212 INFO nova.virt.libvirt.driver [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance spawned successfully.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.137 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.142 187212 INFO nova.virt.libvirt.driver [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance spawned successfully.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.143 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.155 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.157 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.171 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.172 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.172 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.173 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.173 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.174 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.179 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.179 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936128.1224675, 1606eea3-5389-4437-b0f9-cfe6084d7871 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.179 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] VM Resumed (Lifecycle Event)
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.182 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.182 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.183 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.183 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.183 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.184 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.195 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.195 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.196 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.196 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.196 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.197 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.217 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.221 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.262 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.263 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936128.1253827, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.263 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Resumed (Lifecycle Event)
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.309 187212 INFO nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Took 26.85 seconds to spawn the instance on the hypervisor.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.309 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.311 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.317 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.323 187212 INFO nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Took 33.44 seconds to spawn the instance on the hypervisor.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.324 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.353 187212 INFO nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 7.65 seconds to spawn the instance on the hypervisor.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.354 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.365 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.435 187212 INFO nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Took 28.15 seconds to build instance.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.449 187212 INFO nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 8.79 seconds to build instance.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.452 187212 INFO nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Took 34.17 seconds to build instance.
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.464 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 28.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.467 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:08 compute-0 nova_compute[187208]: 2025-12-05 12:02:08.469 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 34.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.314 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.317 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.318 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.318 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.318 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.320 187212 INFO nova.compute.manager [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Terminating instance
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.321 187212 DEBUG nova.compute.manager [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:09 compute-0 kernel: tap75a214ef-2b (unregistering): left promiscuous mode
Dec 05 12:02:09 compute-0 NetworkManager[55691]: <info>  [1764936129.5562] device (tap75a214ef-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.571 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:09 compute-0 ovn_controller[95610]: 2025-12-05T12:02:09Z|00169|binding|INFO|Releasing lport 75a214ef-2b9f-4c81-bdad-de5791244b85 from this chassis (sb_readonly=0)
Dec 05 12:02:09 compute-0 ovn_controller[95610]: 2025-12-05T12:02:09Z|00170|binding|INFO|Setting lport 75a214ef-2b9f-4c81-bdad-de5791244b85 down in Southbound
Dec 05 12:02:09 compute-0 ovn_controller[95610]: 2025-12-05T12:02:09Z|00171|binding|INFO|Removing iface tap75a214ef-2b ovn-installed in OVS
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.574 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.585 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:46:fb 10.100.0.5'], port_security=['fa:16:3e:d9:46:fb 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=75a214ef-2b9f-4c81-bdad-de5791244b85) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.587 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 75a214ef-2b9f-4c81-bdad-de5791244b85 in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.589 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.592 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.612 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d34152de-007b-4f03-b61d-583f1c35232c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:09 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000010.scope: Deactivated successfully.
Dec 05 12:02:09 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000010.scope: Consumed 15.767s CPU time.
Dec 05 12:02:09 compute-0 systemd-machined[153543]: Machine qemu-16-instance-00000010 terminated.
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.652 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[312c9ef0-f242-475e-ac30-2d3b321a333a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.656 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[661ce4c5-8754-493e-bc0e-d763586b9a9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.697 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[937984b0-b985-45fc-92e9-2b6df9598c6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.724 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[47318d5c-ce5a-4bc3-9eeb-6054615967c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 16, 'tx_packets': 25, 'rx_bytes': 952, 'tx_bytes': 1194, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 16, 'tx_packets': 25, 'rx_bytes': 952, 'tx_bytes': 1194, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218033, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.741 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[429f6506-c206-4a80-8442-3423b7d16282]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218034, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218034, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.743 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.753 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.755 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.755 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.756 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.756 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.789 187212 INFO nova.virt.libvirt.driver [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance destroyed successfully.
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.790 187212 DEBUG nova.objects.instance [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 4e7aec76-673e-48b5-b183-cc9c7a95fd37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.804 187212 DEBUG nova.virt.libvirt.vif [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:00:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-720093205',display_name='tempest-ServersAdminTestJSON-server-720093205',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-720093205',id=16,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-00wbi3mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:26Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=4e7aec76-673e-48b5-b183-cc9c7a95fd37,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.804 187212 DEBUG nova.network.os_vif_util [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.805 187212 DEBUG nova.network.os_vif_util [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.805 187212 DEBUG os_vif [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.807 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75a214ef-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.811 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.814 187212 INFO os_vif [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b')
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.815 187212 INFO nova.virt.libvirt.driver [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Deleting instance files /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37_del
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.815 187212 INFO nova.virt.libvirt.driver [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Deletion of /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37_del complete
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.866 187212 INFO nova.compute.manager [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Took 0.55 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.867 187212 DEBUG oslo.service.loopingcall [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.867 187212 DEBUG nova.compute.manager [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:09 compute-0 nova_compute[187208]: 2025-12-05 12:02:09.867 187212 DEBUG nova.network.neutron [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:10 compute-0 nova_compute[187208]: 2025-12-05 12:02:10.967 187212 DEBUG nova.compute.manager [req-0bad4528-4357-4b90-8150-21eba27b1818 req-80fb344c-cb81-4d22-8d09-cbd9850dbb0c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-vif-deleted-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.085 187212 INFO nova.compute.manager [None req-48c50a20-93b6-40c9-880d-24f98e4785f8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Pausing
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.086 187212 DEBUG nova.objects.instance [None req-48c50a20-93b6-40c9-880d-24f98e4785f8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'flavor' on Instance uuid 795a269a-5af9-4e6a-bf1f-e2bb83634855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.116 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936131.1160843, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.116 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Paused (Lifecycle Event)
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.118 187212 DEBUG nova.compute.manager [None req-48c50a20-93b6-40c9-880d-24f98e4785f8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.161 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.164 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.192 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] During sync_power_state the instance has a pending task (pausing). Skip.
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.371 187212 DEBUG nova.network.neutron [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.399 187212 DEBUG nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 DEBUG nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] No waiting events found dispatching network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 WARNING nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received unexpected event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa for instance with vm_state paused and task_state None.
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] No waiting events found dispatching network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.402 187212 WARNING nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received unexpected event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e for instance with vm_state deleted and task_state None.
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.403 187212 INFO nova.compute.manager [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Took 1.54 seconds to deallocate network for instance.
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.450 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.450 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.584 187212 DEBUG nova.compute.provider_tree [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.600 187212 DEBUG nova.scheduler.client.report [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.624 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.647 187212 INFO nova.scheduler.client.report [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Deleted allocations for instance 4e7aec76-673e-48b5-b183-cc9c7a95fd37
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.720 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:11 compute-0 ovn_controller[95610]: 2025-12-05T12:02:11Z|00172|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:02:11 compute-0 ovn_controller[95610]: 2025-12-05T12:02:11Z|00173|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec 05 12:02:11 compute-0 ovn_controller[95610]: 2025-12-05T12:02:11Z|00174|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec 05 12:02:11 compute-0 ovn_controller[95610]: 2025-12-05T12:02:11Z|00175|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:02:11 compute-0 nova_compute[187208]: 2025-12-05 12:02:11.861 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:12 compute-0 NetworkManager[55691]: <info>  [1764936132.0202] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Dec 05 12:02:12 compute-0 NetworkManager[55691]: <info>  [1764936132.0218] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Dec 05 12:02:12 compute-0 nova_compute[187208]: 2025-12-05 12:02:12.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:12 compute-0 nova_compute[187208]: 2025-12-05 12:02:12.201 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:12 compute-0 ovn_controller[95610]: 2025-12-05T12:02:12Z|00176|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:02:12 compute-0 ovn_controller[95610]: 2025-12-05T12:02:12Z|00177|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec 05 12:02:12 compute-0 ovn_controller[95610]: 2025-12-05T12:02:12Z|00178|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec 05 12:02:12 compute-0 ovn_controller[95610]: 2025-12-05T12:02:12Z|00179|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:02:12 compute-0 nova_compute[187208]: 2025-12-05 12:02:12.237 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:12 compute-0 nova_compute[187208]: 2025-12-05 12:02:12.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.956 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-unplugged-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.957 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.957 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.957 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.958 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] No waiting events found dispatching network-vif-unplugged-75a214ef-2b9f-4c81-bdad-de5791244b85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.958 187212 WARNING nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received unexpected event network-vif-unplugged-75a214ef-2b9f-4c81-bdad-de5791244b85 for instance with vm_state deleted and task_state None.
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.958 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.959 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.959 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.960 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.960 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] No waiting events found dispatching network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.960 187212 WARNING nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received unexpected event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 for instance with vm_state deleted and task_state None.
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.961 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-changed-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.961 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Refreshing instance network info cache due to event network-changed-78310fa8-21e8-49e5-8b60-867d1089ad71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.961 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.962 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:13 compute-0 nova_compute[187208]: 2025-12-05 12:02:13.963 187212 DEBUG nova.network.neutron [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Refreshing network info cache for port 78310fa8-21e8-49e5-8b60-867d1089ad71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.273 187212 DEBUG nova.compute.manager [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.357 187212 INFO nova.compute.manager [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] instance snapshotting
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.358 187212 WARNING nova.compute.manager [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] trying to snapshot a non-running instance: (state: 3 expected: 1)
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.621 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.622 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.622 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.625 187212 INFO nova.compute.manager [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Terminating instance
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.626 187212 DEBUG nova.compute.manager [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:14 compute-0 kernel: tapf194d74d-a9 (unregistering): left promiscuous mode
Dec 05 12:02:14 compute-0 NetworkManager[55691]: <info>  [1764936134.6495] device (tapf194d74d-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:14 compute-0 ovn_controller[95610]: 2025-12-05T12:02:14Z|00180|binding|INFO|Releasing lport f194d74d-a9ec-4838-b35d-8393a2087ec5 from this chassis (sb_readonly=0)
Dec 05 12:02:14 compute-0 ovn_controller[95610]: 2025-12-05T12:02:14Z|00181|binding|INFO|Setting lport f194d74d-a9ec-4838-b35d-8393a2087ec5 down in Southbound
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 ovn_controller[95610]: 2025-12-05T12:02:14Z|00182|binding|INFO|Removing iface tapf194d74d-a9 ovn-installed in OVS
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.658 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.667 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:fa:14 10.100.0.14'], port_security=['fa:16:3e:d0:fa:14 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f194d74d-a9ec-4838-b35d-8393a2087ec5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.668 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f194d74d-a9ec-4838-b35d-8393a2087ec5 in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.671 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.674 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.688 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e743e2d-af6d-4559-aad9-9cd9df46ecaa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.715 187212 DEBUG nova.compute.manager [req-ba2385d3-df5f-40fb-bf8a-2f5d629cb9d6 req-12d93cdc-81ad-4c47-b3e8-668f1212b4bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-deleted-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:14 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec 05 12:02:14 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000000d.scope: Consumed 17.877s CPU time.
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.722 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e41f79d7-812e-4e64-9441-d9ef047a9288]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:14 compute-0 systemd-machined[153543]: Machine qemu-14-instance-0000000d terminated.
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.726 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[98e29474-79a7-4444-b672-0f700935d583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.751 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d5fda0a3-4aae-4ae8-add4-0520202cecd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.765 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f056fd39-0db5-406e-91ff-775b27adfed1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 16, 'tx_packets': 27, 'rx_bytes': 952, 'tx_bytes': 1278, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 16, 'tx_packets': 27, 'rx_bytes': 952, 'tx_bytes': 1278, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218066, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.780 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e40e1b0c-2ae7-4fa6-bd96-53f6d52f203d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218067, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218067, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.782 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.783 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.787 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.787 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.788 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.788 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.788 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.837 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Beginning live snapshot process
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.846 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.892 187212 INFO nova.virt.libvirt.driver [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance destroyed successfully.
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.892 187212 DEBUG nova.objects.instance [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.912 187212 DEBUG nova.virt.libvirt.vif [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:59:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1562123791',display_name='tempest-ServersAdminTestJSON-server-1562123791',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1562123791',id=13,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-vj86fqlt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:11Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.913 187212 DEBUG nova.network.os_vif_util [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.914 187212 DEBUG nova.network.os_vif_util [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.914 187212 DEBUG os_vif [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.916 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf194d74d-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.919 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.921 187212 INFO os_vif [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9')
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.922 187212 INFO nova.virt.libvirt.driver [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Deleting instance files /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa_del
Dec 05 12:02:14 compute-0 nova_compute[187208]: 2025-12-05 12:02:14.923 187212 INFO nova.virt.libvirt.driver [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Deletion of /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa_del complete
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.007 187212 INFO nova.compute.manager [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.007 187212 DEBUG oslo.service.loopingcall [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.008 187212 DEBUG nova.compute.manager [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.008 187212 DEBUG nova.network.neutron [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:15 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.019 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.082 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.083 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.137 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json -f qcow2" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.153 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.225 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.227 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:15 compute-0 podman[218089]: 2025-12-05 12:02:15.234571503 +0000 UTC m=+0.079499232 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.267 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113.delta 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.269 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:02:15 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.321 187212 DEBUG nova.virt.libvirt.guest [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.324 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.365 187212 DEBUG nova.privsep.utils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.365 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113.delta /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.521 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113.delta /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.523 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Snapshot extracted, beginning image upload
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.752 187212 DEBUG nova.network.neutron [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.772 187212 INFO nova.compute.manager [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Took 0.76 seconds to deallocate network for instance.
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.815 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.816 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.939 187212 DEBUG nova.network.neutron [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updated VIF entry in instance network info cache for port 78310fa8-21e8-49e5-8b60-867d1089ad71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.940 187212 DEBUG nova.network.neutron [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updating instance_info_cache with network_info: [{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.965 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.969 187212 DEBUG nova.compute.provider_tree [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:15 compute-0 nova_compute[187208]: 2025-12-05 12:02:15.991 187212 DEBUG nova.scheduler.client.report [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.019 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.053 187212 INFO nova.scheduler.client.report [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Deleted allocations for instance 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.181 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.420 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936121.4190943, 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.421 187212 INFO nova.compute.manager [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] VM Stopped (Lifecycle Event)
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.894 187212 DEBUG nova.compute.manager [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-changed-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.895 187212 DEBUG nova.compute.manager [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Refreshing instance network info cache due to event network-changed-c72089e0-4937-40b6-86b5-f9d6d0982058. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.895 187212 DEBUG oslo_concurrency.lockutils [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.896 187212 DEBUG oslo_concurrency.lockutils [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.896 187212 DEBUG nova.network.neutron [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Refreshing network info cache for port c72089e0-4937-40b6-86b5-f9d6d0982058 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:02:16 compute-0 nova_compute[187208]: 2025-12-05 12:02:16.901 187212 DEBUG nova.compute.manager [None req-a64b7af6-b7a9-43e6-a455-408fe8ec746e - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.880 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.881 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.881 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.881 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.881 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.883 187212 INFO nova.compute.manager [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Terminating instance
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.884 187212 DEBUG nova.compute.manager [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:17 compute-0 kernel: tap380c99a7-94 (unregistering): left promiscuous mode
Dec 05 12:02:17 compute-0 NetworkManager[55691]: <info>  [1764936137.9189] device (tap380c99a7-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:17 compute-0 ovn_controller[95610]: 2025-12-05T12:02:17Z|00183|binding|INFO|Releasing lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d from this chassis (sb_readonly=0)
Dec 05 12:02:17 compute-0 ovn_controller[95610]: 2025-12-05T12:02:17Z|00184|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d down in Southbound
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:17 compute-0 ovn_controller[95610]: 2025-12-05T12:02:17Z|00185|binding|INFO|Removing iface tap380c99a7-94 ovn-installed in OVS
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.926 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.929 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.930 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis
Dec 05 12:02:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.933 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:02:17 compute-0 ovn_controller[95610]: 2025-12-05T12:02:17Z|00186|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:02:17 compute-0 ovn_controller[95610]: 2025-12-05T12:02:17Z|00187|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec 05 12:02:17 compute-0 ovn_controller[95610]: 2025-12-05T12:02:17Z|00188|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec 05 12:02:17 compute-0 ovn_controller[95610]: 2025-12-05T12:02:17Z|00189|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec 05 12:02:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.935 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[539d8bd8-564c-495e-a041-a9eae5874dd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.936 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 namespace which is not needed anymore
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.946 187212 DEBUG nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-unplugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.946 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.946 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 DEBUG nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] No waiting events found dispatching network-vif-unplugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 WARNING nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received unexpected event network-vif-unplugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 for instance with vm_state deleted and task_state None.
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 DEBUG nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.948 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.948 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.948 187212 DEBUG nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] No waiting events found dispatching network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.948 187212 WARNING nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received unexpected event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 for instance with vm_state deleted and task_state None.
Dec 05 12:02:17 compute-0 nova_compute[187208]: 2025-12-05 12:02:17.963 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:17 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec 05 12:02:17 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000000c.scope: Consumed 12.993s CPU time.
Dec 05 12:02:17 compute-0 systemd-machined[153543]: Machine qemu-26-instance-0000000c terminated.
Dec 05 12:02:18 compute-0 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [NOTICE]   (214873) : haproxy version is 2.8.14-c23fe91
Dec 05 12:02:18 compute-0 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [NOTICE]   (214873) : path to executable is /usr/sbin/haproxy
Dec 05 12:02:18 compute-0 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [WARNING]  (214873) : Exiting Master process...
Dec 05 12:02:18 compute-0 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [ALERT]    (214873) : Current worker (214875) exited with code 143 (Terminated)
Dec 05 12:02:18 compute-0 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [WARNING]  (214873) : All workers exited. Exiting... (0)
Dec 05 12:02:18 compute-0 systemd[1]: libpod-37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0.scope: Deactivated successfully.
Dec 05 12:02:18 compute-0 podman[218154]: 2025-12-05 12:02:18.080876252 +0000 UTC m=+0.050341929 container died 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:02:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0-userdata-shm.mount: Deactivated successfully.
Dec 05 12:02:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-38bc886dbbbea2769a63b04a9e8180064790337d80822dc7c2e5b30fc62aed96-merged.mount: Deactivated successfully.
Dec 05 12:02:18 compute-0 podman[218154]: 2025-12-05 12:02:18.12525911 +0000 UTC m=+0.094724787 container cleanup 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:02:18 compute-0 systemd[1]: libpod-conmon-37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0.scope: Deactivated successfully.
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.153 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.153 187212 DEBUG nova.objects.instance [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.171 187212 DEBUG nova.virt.libvirt.vif [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:02Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.171 187212 DEBUG nova.network.os_vif_util [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.173 187212 DEBUG nova.network.os_vif_util [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.173 187212 DEBUG os_vif [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.178 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap380c99a7-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:18 compute-0 podman[218195]: 2025-12-05 12:02:18.187099996 +0000 UTC m=+0.039184010 container remove 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.238 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Snapshot image upload complete
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.240 187212 INFO nova.compute.manager [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 3.88 seconds to snapshot the instance on the hypervisor.
Dec 05 12:02:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.244 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4718333e-7696-4cba-899c-714e671a7815]: (4, ('Fri Dec  5 12:02:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 (37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0)\n37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0\nFri Dec  5 12:02:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 (37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0)\n37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.245 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.245 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef1e967-85d3-4feb-a83b-6dbc3db2f8fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.246 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.247 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:18 compute-0 kernel: tap24c61e5e-70: left promiscuous mode
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.253 187212 INFO os_vif [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.254 187212 INFO nova.virt.libvirt.driver [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deleting instance files /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.255 187212 INFO nova.virt.libvirt.driver [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deletion of /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del complete
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.262 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.266 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[533e6808-ed9f-4809-87de-eb53b1019145]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.278 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2186617-f7c0-44dd-b931-c119881e167f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.279 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[339a1f8a-5dac-4179-8633-13e121e8db73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.295 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8a1e7d-0e4d-4c85-a91f-19463211aa3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338509, 'reachable_time': 39930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218211, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:18 compute-0 systemd[1]: run-netns-ovnmeta\x2d24c61e5e\x2d7d15\x2d4019\x2db1bd\x2dd2e253f41aa5.mount: Deactivated successfully.
Dec 05 12:02:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.297 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:02:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.297 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[288960ba-7c22-4631-865a-ad0eac004871]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.313 187212 INFO nova.compute.manager [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Took 0.43 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.314 187212 DEBUG oslo.service.loopingcall [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.314 187212 DEBUG nova.compute.manager [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.314 187212 DEBUG nova.network.neutron [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.993 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936123.9929156, 1282e776-5758-493b-8f52-59839ebcd31b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:18 compute-0 nova_compute[187208]: 2025-12-05 12:02:18.994 187212 INFO nova.compute.manager [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] VM Stopped (Lifecycle Event)
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.013 187212 DEBUG nova.compute.manager [None req-cc4b3881-0770-4a6b-9ac2-777151284987 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.022 187212 DEBUG nova.network.neutron [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.044 187212 INFO nova.compute.manager [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Took 0.73 seconds to deallocate network for instance.
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.065 187212 DEBUG nova.network.neutron [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updated VIF entry in instance network info cache for port c72089e0-4937-40b6-86b5-f9d6d0982058. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.065 187212 DEBUG nova.network.neutron [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updating instance_info_cache with network_info: [{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.102 187212 DEBUG oslo_concurrency.lockutils [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.102 187212 DEBUG nova.compute.manager [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-deleted-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.107 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.107 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.233 187212 DEBUG nova.compute.provider_tree [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.261 187212 DEBUG nova.scheduler.client.report [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.292 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.311 187212 INFO nova.scheduler.client.report [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Deleted allocations for instance 982a8e69-5181-4847-bdfe-8d4de12bb2e4
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.375 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:19 compute-0 nova_compute[187208]: 2025-12-05 12:02:19.469 187212 DEBUG nova.compute.manager [req-12e67ac1-37e7-4c50-9378-6be1ee976f58 req-11084258-784b-4a9c-9e1c-48e723ce2b14 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-deleted-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:20 compute-0 ovn_controller[95610]: 2025-12-05T12:02:20Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:42:5d 10.100.0.11
Dec 05 12:02:20 compute-0 ovn_controller[95610]: 2025-12-05T12:02:20Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:42:5d 10.100.0.11
Dec 05 12:02:20 compute-0 nova_compute[187208]: 2025-12-05 12:02:20.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:21 compute-0 podman[218248]: 2025-12-05 12:02:21.217817872 +0000 UTC m=+0.061981581 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Dec 05 12:02:21 compute-0 podman[218249]: 2025-12-05 12:02:21.235661382 +0000 UTC m=+0.074495899 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.261 187212 DEBUG nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.262 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.262 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.263 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.263 187212 DEBUG nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.263 187212 WARNING nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state deleted and task_state None.
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.264 187212 DEBUG nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.264 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.264 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.264 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.265 187212 DEBUG nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.265 187212 WARNING nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state deleted and task_state None.
Dec 05 12:02:21 compute-0 ovn_controller[95610]: 2025-12-05T12:02:21Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ea:73:d9 10.100.0.11
Dec 05 12:02:21 compute-0 ovn_controller[95610]: 2025-12-05T12:02:21Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ea:73:d9 10.100.0.11
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.662 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.663 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.663 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.663 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.663 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.665 187212 INFO nova.compute.manager [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Terminating instance
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.666 187212 DEBUG nova.compute.manager [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:21 compute-0 kernel: tapf6fc1ec5-ea (unregistering): left promiscuous mode
Dec 05 12:02:21 compute-0 NetworkManager[55691]: <info>  [1764936141.6846] device (tapf6fc1ec5-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:21 compute-0 ovn_controller[95610]: 2025-12-05T12:02:21Z|00190|binding|INFO|Releasing lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa from this chassis (sb_readonly=0)
Dec 05 12:02:21 compute-0 ovn_controller[95610]: 2025-12-05T12:02:21Z|00191|binding|INFO|Setting lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa down in Southbound
Dec 05 12:02:21 compute-0 ovn_controller[95610]: 2025-12-05T12:02:21Z|00192|binding|INFO|Removing iface tapf6fc1ec5-ea ovn-installed in OVS
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.693 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.702 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:9d:39 10.100.0.12'], port_security=['fa:16:3e:5f:9d:39 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '795a269a-5af9-4e6a-bf1f-e2bb83634855', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.703 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis
Dec 05 12:02:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.705 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:02:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.706 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ff319d56-7013-4f30-b7e9-6014c597bc35]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.707 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.709 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:21 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Dec 05 12:02:21 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000001b.scope: Consumed 3.363s CPU time.
Dec 05 12:02:21 compute-0 systemd-machined[153543]: Machine qemu-31-instance-0000001b terminated.
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.926 187212 INFO nova.virt.libvirt.driver [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance destroyed successfully.
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.927 187212 DEBUG nova.objects.instance [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid 795a269a-5af9-4e6a-bf1f-e2bb83634855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.959 187212 DEBUG nova.virt.libvirt.vif [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1602902542',display_name='tempest-ImagesTestJSON-server-1602902542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1602902542',id=27,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-5xs2qi13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:18Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=795a269a-5af9-4e6a-bf1f-e2bb83634855,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.960 187212 DEBUG nova.network.os_vif_util [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.961 187212 DEBUG nova.network.os_vif_util [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.961 187212 DEBUG os_vif [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.963 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:21 compute-0 nova_compute[187208]: 2025-12-05 12:02:21.964 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6fc1ec5-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.005 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.008 187212 INFO os_vif [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea')
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.009 187212 INFO nova.virt.libvirt.driver [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Deleting instance files /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855_del
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.010 187212 INFO nova.virt.libvirt.driver [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Deletion of /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855_del complete
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.014 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936127.013532, d95c0324-d1d3-4960-9ab7-3a2a098a9f7c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.014 187212 INFO nova.compute.manager [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] VM Stopped (Lifecycle Event)
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.056 187212 DEBUG nova.compute.manager [None req-bf31b66c-d366-4ea7-9bdb-e76be5da22f0 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.068 187212 INFO nova.compute.manager [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.068 187212 DEBUG oslo.service.loopingcall [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.069 187212 DEBUG nova.compute.manager [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.069 187212 DEBUG nova.network.neutron [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:22 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [NOTICE]   (217946) : haproxy version is 2.8.14-c23fe91
Dec 05 12:02:22 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [NOTICE]   (217946) : path to executable is /usr/sbin/haproxy
Dec 05 12:02:22 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [WARNING]  (217946) : Exiting Master process...
Dec 05 12:02:22 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [WARNING]  (217946) : Exiting Master process...
Dec 05 12:02:22 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [ALERT]    (217946) : Current worker (217948) exited with code 143 (Terminated)
Dec 05 12:02:22 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [WARNING]  (217946) : All workers exited. Exiting... (0)
Dec 05 12:02:22 compute-0 systemd[1]: libpod-946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3.scope: Deactivated successfully.
Dec 05 12:02:22 compute-0 podman[218312]: 2025-12-05 12:02:22.181255342 +0000 UTC m=+0.378509333 container died 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 05 12:02:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-55714bde8320c7ed878ac9913f0bc25c165a8c6293c219aeaa49515054bdec12-merged.mount: Deactivated successfully.
Dec 05 12:02:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3-userdata-shm.mount: Deactivated successfully.
Dec 05 12:02:22 compute-0 podman[218312]: 2025-12-05 12:02:22.227109341 +0000 UTC m=+0.424363322 container cleanup 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 05 12:02:22 compute-0 systemd[1]: libpod-conmon-946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3.scope: Deactivated successfully.
Dec 05 12:02:22 compute-0 podman[218356]: 2025-12-05 12:02:22.296767581 +0000 UTC m=+0.048033853 container remove 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 12:02:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.301 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9973b4f4-26ee-413f-b875-4d6e4b61fc69]: (4, ('Fri Dec  5 12:02:21 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3)\n946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3\nFri Dec  5 12:02:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3)\n946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.307 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cb84d507-e922-4d99-b274-02e3f5739354]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.309 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.311 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:22 compute-0 kernel: tap41b3b495-c0: left promiscuous mode
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.317 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[85de475a-f15e-4455-ad81-6eaaed22bff3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.326 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.340 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac3de163-e614-4fc6-a83f-88170d2676e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.342 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e91589d1-2f82-4123-8133-f6ec6a6bfd80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.357 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3c73c474-df8a-48e1-9e14-2ecdb1ede8d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350633, 'reachable_time': 28597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218372, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:22 compute-0 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec 05 12:02:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.359 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:02:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.359 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[092bff9e-c70c-4988-8719-ed41f00a5c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.516 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.671 187212 DEBUG nova.network.neutron [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.694 187212 INFO nova.compute.manager [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 0.62 seconds to deallocate network for instance.
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.747 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.748 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.838 187212 DEBUG nova.compute.provider_tree [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.853 187212 DEBUG nova.scheduler.client.report [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.877 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:22 compute-0 nova_compute[187208]: 2025-12-05 12:02:22.930 187212 INFO nova.scheduler.client.report [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance 795a269a-5af9-4e6a-bf1f-e2bb83634855
Dec 05 12:02:23 compute-0 nova_compute[187208]: 2025-12-05 12:02:23.008 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:23 compute-0 nova_compute[187208]: 2025-12-05 12:02:23.014 187212 DEBUG nova.compute.manager [req-9a764540-51d3-42c2-b7b6-cb8c0f7495a1 req-2541d075-80b0-47a1-99dd-b4b47ad87f0b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-deleted-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:24 compute-0 nova_compute[187208]: 2025-12-05 12:02:24.560 187212 DEBUG nova.compute.manager [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-unplugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:24 compute-0 nova_compute[187208]: 2025-12-05 12:02:24.561 187212 DEBUG oslo_concurrency.lockutils [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:24 compute-0 nova_compute[187208]: 2025-12-05 12:02:24.561 187212 DEBUG oslo_concurrency.lockutils [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:24 compute-0 nova_compute[187208]: 2025-12-05 12:02:24.561 187212 DEBUG oslo_concurrency.lockutils [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:24 compute-0 nova_compute[187208]: 2025-12-05 12:02:24.562 187212 DEBUG nova.compute.manager [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] No waiting events found dispatching network-vif-unplugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:24 compute-0 nova_compute[187208]: 2025-12-05 12:02:24.562 187212 WARNING nova.compute.manager [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received unexpected event network-vif-unplugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa for instance with vm_state deleted and task_state None.
Dec 05 12:02:24 compute-0 nova_compute[187208]: 2025-12-05 12:02:24.782 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936129.7816968, 4e7aec76-673e-48b5-b183-cc9c7a95fd37 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:24 compute-0 nova_compute[187208]: 2025-12-05 12:02:24.783 187212 INFO nova.compute.manager [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] VM Stopped (Lifecycle Event)
Dec 05 12:02:24 compute-0 nova_compute[187208]: 2025-12-05 12:02:24.807 187212 DEBUG nova.compute.manager [None req-4e58248a-0e7a-46f1-94b7-9a7ddbdf5bb7 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:25 compute-0 ovn_controller[95610]: 2025-12-05T12:02:25Z|00193|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec 05 12:02:25 compute-0 ovn_controller[95610]: 2025-12-05T12:02:25Z|00194|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.008 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.676 187212 DEBUG nova.compute.manager [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.676 187212 DEBUG oslo_concurrency.lockutils [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.677 187212 DEBUG oslo_concurrency.lockutils [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.677 187212 DEBUG oslo_concurrency.lockutils [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.677 187212 DEBUG nova.compute.manager [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] No waiting events found dispatching network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.677 187212 WARNING nova.compute.manager [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received unexpected event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa for instance with vm_state deleted and task_state None.
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.778 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.779 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.798 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.867 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.867 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.873 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:02:26 compute-0 nova_compute[187208]: 2025-12-05 12:02:26.873 187212 INFO nova.compute.claims [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.029 187212 DEBUG nova.compute.provider_tree [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.045 187212 DEBUG nova.scheduler.client.report [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.071 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.072 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.090 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:27 compute-0 podman[218375]: 2025-12-05 12:02:27.205378296 +0000 UTC m=+0.056969488 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.229 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.229 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:02:27 compute-0 podman[218376]: 2025-12-05 12:02:27.234141168 +0000 UTC m=+0.084636739 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.244 187212 INFO nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.260 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.346 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.347 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.347 187212 INFO nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Creating image(s)
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.348 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.348 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.349 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.360 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.416 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.417 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.417 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.428 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.481 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.482 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.516 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.517 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.518 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.534 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.575 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.576 187212 DEBUG nova.virt.disk.api [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.576 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.636 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.637 187212 DEBUG nova.virt.disk.api [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.637 187212 DEBUG nova.objects.instance [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.653 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.654 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Ensure instance console log exists: /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.654 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.655 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.655 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:27 compute-0 nova_compute[187208]: 2025-12-05 12:02:27.782 187212 DEBUG nova.policy [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:02:27 compute-0 sshd-session[218373]: Received disconnect from 43.225.159.82 port 44684:11:  [preauth]
Dec 05 12:02:27 compute-0 sshd-session[218373]: Disconnected from authenticating user root 43.225.159.82 port 44684 [preauth]
Dec 05 12:02:29 compute-0 nova_compute[187208]: 2025-12-05 12:02:29.604 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Successfully created port: 82089bf4-207e-4880-b8ff-9bf09a4ac3fb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:02:29 compute-0 nova_compute[187208]: 2025-12-05 12:02:29.891 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936134.8898897, 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:29 compute-0 nova_compute[187208]: 2025-12-05 12:02:29.891 187212 INFO nova.compute.manager [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] VM Stopped (Lifecycle Event)
Dec 05 12:02:29 compute-0 nova_compute[187208]: 2025-12-05 12:02:29.913 187212 DEBUG nova.compute.manager [None req-b9aadafc-d44f-46b1-a0a7-e63dece5f2aa - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:30 compute-0 ovn_controller[95610]: 2025-12-05T12:02:30Z|00195|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec 05 12:02:30 compute-0 ovn_controller[95610]: 2025-12-05T12:02:30Z|00196|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.100 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.101 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.101 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.101 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.102 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.103 187212 INFO nova.compute.manager [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Terminating instance
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.104 187212 DEBUG nova.compute.manager [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.104 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 kernel: tap78310fa8-21 (unregistering): left promiscuous mode
Dec 05 12:02:30 compute-0 NetworkManager[55691]: <info>  [1764936150.1340] device (tap78310fa8-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.144 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 ovn_controller[95610]: 2025-12-05T12:02:30Z|00197|binding|INFO|Releasing lport 78310fa8-21e8-49e5-8b60-867d1089ad71 from this chassis (sb_readonly=0)
Dec 05 12:02:30 compute-0 ovn_controller[95610]: 2025-12-05T12:02:30Z|00198|binding|INFO|Setting lport 78310fa8-21e8-49e5-8b60-867d1089ad71 down in Southbound
Dec 05 12:02:30 compute-0 ovn_controller[95610]: 2025-12-05T12:02:30Z|00199|binding|INFO|Removing iface tap78310fa8-21 ovn-installed in OVS
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.148 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.153 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:42:5d 10.100.0.11'], port_security=['fa:16:3e:c8:42:5d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'adc15883-b705-42dd-ac95-04f4b8964012', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '342e6d694cf6482c9f1b7557a17bce60', 'neutron:revision_number': '4', 'neutron:security_group_ids': '710ea28e-d1ba-4c63-a751-16b460b2129b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a85cd729-c72e-4d3c-b444-ff0b42d436ff, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=78310fa8-21e8-49e5-8b60-867d1089ad71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.155 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 78310fa8-21e8-49e5-8b60-867d1089ad71 in datapath 393d33f9-2dde-4fb5-b5db-3f0fb98d4637 unbound from our chassis
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.157 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 393d33f9-2dde-4fb5-b5db-3f0fb98d4637, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.158 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8f5b3a-f6cb-4a79-b07c-b020e01db895]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.158 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 namespace which is not needed anymore
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000019.scope: Deactivated successfully.
Dec 05 12:02:30 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000019.scope: Consumed 13.526s CPU time.
Dec 05 12:02:30 compute-0 systemd-machined[153543]: Machine qemu-29-instance-00000019 terminated.
Dec 05 12:02:30 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [NOTICE]   (217354) : haproxy version is 2.8.14-c23fe91
Dec 05 12:02:30 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [NOTICE]   (217354) : path to executable is /usr/sbin/haproxy
Dec 05 12:02:30 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [WARNING]  (217354) : Exiting Master process...
Dec 05 12:02:30 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [ALERT]    (217354) : Current worker (217357) exited with code 143 (Terminated)
Dec 05 12:02:30 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [WARNING]  (217354) : All workers exited. Exiting... (0)
Dec 05 12:02:30 compute-0 systemd[1]: libpod-f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751.scope: Deactivated successfully.
Dec 05 12:02:30 compute-0 podman[218464]: 2025-12-05 12:02:30.298935978 +0000 UTC m=+0.044294746 container died f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.327 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751-userdata-shm.mount: Deactivated successfully.
Dec 05 12:02:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-26091b88884c53d07a76d03c6c9c66adb5d232a7c306c3b78dafe02bf1e95c96-merged.mount: Deactivated successfully.
Dec 05 12:02:30 compute-0 podman[218464]: 2025-12-05 12:02:30.345164689 +0000 UTC m=+0.090523427 container cleanup f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 12:02:30 compute-0 systemd[1]: libpod-conmon-f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751.scope: Deactivated successfully.
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.381 187212 INFO nova.virt.libvirt.driver [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance destroyed successfully.
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.382 187212 DEBUG nova.objects.instance [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'resources' on Instance uuid adc15883-b705-42dd-ac95-04f4b8964012 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.395 187212 DEBUG nova.virt.libvirt.vif [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-555517467',id=25,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-hjkfnf9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=adc15883-b705-42dd-ac95-04f4b8964012,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.395 187212 DEBUG nova.network.os_vif_util [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.396 187212 DEBUG nova.network.os_vif_util [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.397 187212 DEBUG os_vif [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.398 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.398 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap78310fa8-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.400 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.401 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.403 187212 INFO os_vif [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21')
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.404 187212 INFO nova.virt.libvirt.driver [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Deleting instance files /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012_del
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.405 187212 INFO nova.virt.libvirt.driver [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Deletion of /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012_del complete
Dec 05 12:02:30 compute-0 podman[218515]: 2025-12-05 12:02:30.419873593 +0000 UTC m=+0.045918343 container remove f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:02:30 compute-0 podman[218479]: 2025-12-05 12:02:30.420631934 +0000 UTC m=+0.102689704 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.425 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a1bcd4-2b18-45ce-a2a1-efa64eb07322]: (4, ('Fri Dec  5 12:02:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 (f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751)\nf764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751\nFri Dec  5 12:02:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 (f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751)\nf764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.427 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[24c2d4f4-a800-420a-9690-b7b8983ca63c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.428 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap393d33f9-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.430 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 kernel: tap393d33f9-20: left promiscuous mode
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.431 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.435 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[afd47196-fd8a-4a32-bd5a-de296488c5fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.443 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.448 187212 INFO nova.compute.manager [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.449 187212 DEBUG oslo.service.loopingcall [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.449 187212 DEBUG nova.compute.manager [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:30 compute-0 nova_compute[187208]: 2025-12-05 12:02:30.449 187212 DEBUG nova.network.neutron [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.452 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[67a1e61b-34cf-4118-841b-0b964c02a615]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.453 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2cea19e8-ff15-4ac0-b9bc-0fe5d52096de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.469 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ad284364-93ee-46d6-ad08-906fb866e5d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348665, 'reachable_time': 44019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218546, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.471 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:02:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.471 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[11a414d8-54c2-4469-bd44-b92e198814cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:30 compute-0 systemd[1]: run-netns-ovnmeta\x2d393d33f9\x2d2dde\x2d4fb5\x2db5db\x2d3f0fb98d4637.mount: Deactivated successfully.
Dec 05 12:02:31 compute-0 nova_compute[187208]: 2025-12-05 12:02:31.075 187212 DEBUG nova.compute.manager [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-unplugged-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:31 compute-0 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG oslo_concurrency.lockutils [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:31 compute-0 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG oslo_concurrency.lockutils [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:31 compute-0 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG oslo_concurrency.lockutils [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:31 compute-0 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG nova.compute.manager [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] No waiting events found dispatching network-vif-unplugged-78310fa8-21e8-49e5-8b60-867d1089ad71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:31 compute-0 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG nova.compute.manager [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-unplugged-78310fa8-21e8-49e5-8b60-867d1089ad71 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:02:31 compute-0 sshd-session[218441]: Invalid user admin from 45.140.17.124 port 62592
Dec 05 12:02:32 compute-0 sshd-session[218441]: Connection reset by invalid user admin 45.140.17.124 port 62592 [preauth]
Dec 05 12:02:32 compute-0 nova_compute[187208]: 2025-12-05 12:02:32.524 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:32 compute-0 nova_compute[187208]: 2025-12-05 12:02:32.566 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:32 compute-0 nova_compute[187208]: 2025-12-05 12:02:32.871 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Successfully updated port: 82089bf4-207e-4880-b8ff-9bf09a4ac3fb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:02:32 compute-0 nova_compute[187208]: 2025-12-05 12:02:32.885 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:32 compute-0 nova_compute[187208]: 2025-12-05 12:02:32.886 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:32 compute-0 nova_compute[187208]: 2025-12-05 12:02:32.886 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.153 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936138.15153, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.153 187212 INFO nova.compute.manager [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Stopped (Lifecycle Event)
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.176 187212 DEBUG nova.compute.manager [None req-4b354250-7e89-455a-b0ca-09794c64b7c5 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.419 187212 DEBUG nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 DEBUG nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] No waiting events found dispatching network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 WARNING nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received unexpected event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 for instance with vm_state active and task_state deleting.
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.421 187212 DEBUG nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-changed-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.421 187212 DEBUG nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Refreshing instance network info cache due to event network-changed-82089bf4-207e-4880-b8ff-9bf09a4ac3fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.421 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:33 compute-0 nova_compute[187208]: 2025-12-05 12:02:33.446 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.059 187212 DEBUG nova.network.neutron [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.086 187212 INFO nova.compute.manager [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Took 3.64 seconds to deallocate network for instance.
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.130 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.130 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.233 187212 DEBUG nova.compute.provider_tree [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.262 187212 DEBUG nova.scheduler.client.report [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.284 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.350 187212 INFO nova.scheduler.client.report [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Deleted allocations for instance adc15883-b705-42dd-ac95-04f4b8964012
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.426 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.873 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.874 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.874 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.874 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.874 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.875 187212 INFO nova.compute.manager [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Terminating instance
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.876 187212 DEBUG nova.compute.manager [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:34 compute-0 kernel: tapc72089e0-49 (unregistering): left promiscuous mode
Dec 05 12:02:34 compute-0 NetworkManager[55691]: <info>  [1764936154.9026] device (tapc72089e0-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:34 compute-0 ovn_controller[95610]: 2025-12-05T12:02:34Z|00200|binding|INFO|Releasing lport c72089e0-4937-40b6-86b5-f9d6d0982058 from this chassis (sb_readonly=0)
Dec 05 12:02:34 compute-0 ovn_controller[95610]: 2025-12-05T12:02:34Z|00201|binding|INFO|Setting lport c72089e0-4937-40b6-86b5-f9d6d0982058 down in Southbound
Dec 05 12:02:34 compute-0 ovn_controller[95610]: 2025-12-05T12:02:34Z|00202|binding|INFO|Removing iface tapc72089e0-49 ovn-installed in OVS
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.923 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:73:d9 10.100.0.11'], port_security=['fa:16:3e:ea:73:d9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1606eea3-5389-4437-b0f9-cfe6084d7871', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3cd52d70d1a4be8ae891298ff7e1018', 'neutron:revision_number': '4', 'neutron:security_group_ids': '753f16cd-17e0-4f5a-8936-b01e8b5b8119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1ba8a60-bda5-4c97-91b2-1ae7ea8aa092, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c72089e0-4937-40b6-86b5-f9d6d0982058) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.925 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c72089e0-4937-40b6-86b5-f9d6d0982058 in datapath 904b3233-fdc6-4df0-b02a-f30a1e47627b unbound from our chassis
Dec 05 12:02:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.927 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 904b3233-fdc6-4df0-b02a-f30a1e47627b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:02:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.928 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e0fe1fab-7e91-455e-9385-a6746539a05a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.930 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b namespace which is not needed anymore
Dec 05 12:02:34 compute-0 nova_compute[187208]: 2025-12-05 12:02:34.934 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:34 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Dec 05 12:02:34 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000001a.scope: Consumed 12.719s CPU time.
Dec 05 12:02:34 compute-0 systemd-machined[153543]: Machine qemu-30-instance-0000001a terminated.
Dec 05 12:02:35 compute-0 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [NOTICE]   (217539) : haproxy version is 2.8.14-c23fe91
Dec 05 12:02:35 compute-0 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [NOTICE]   (217539) : path to executable is /usr/sbin/haproxy
Dec 05 12:02:35 compute-0 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [WARNING]  (217539) : Exiting Master process...
Dec 05 12:02:35 compute-0 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [WARNING]  (217539) : Exiting Master process...
Dec 05 12:02:35 compute-0 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [ALERT]    (217539) : Current worker (217541) exited with code 143 (Terminated)
Dec 05 12:02:35 compute-0 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [WARNING]  (217539) : All workers exited. Exiting... (0)
Dec 05 12:02:35 compute-0 systemd[1]: libpod-98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1.scope: Deactivated successfully.
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:02:35 compute-0 podman[218573]: 2025-12-05 12:02:35.062075941 +0000 UTC m=+0.044523213 container died 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 12:02:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1-userdata-shm.mount: Deactivated successfully.
Dec 05 12:02:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-9f782e29042a4d6587770f14d2c44c7361a37482c8593077cf3a706cbf5d68ff-merged.mount: Deactivated successfully.
Dec 05 12:02:35 compute-0 NetworkManager[55691]: <info>  [1764936155.0974] manager: (tapc72089e0-49): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Dec 05 12:02:35 compute-0 podman[218573]: 2025-12-05 12:02:35.108337992 +0000 UTC m=+0.090785254 container cleanup 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:02:35 compute-0 systemd[1]: libpod-conmon-98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1.scope: Deactivated successfully.
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.131 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.132 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.149 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.155 187212 INFO nova.virt.libvirt.driver [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance destroyed successfully.
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.156 187212 DEBUG nova.objects.instance [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lazy-loading 'resources' on Instance uuid 1606eea3-5389-4437-b0f9-cfe6084d7871 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.176 187212 DEBUG nova.virt.libvirt.vif [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-306695219',display_name='tempest-ServersTestManualDisk-server-306695219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-306695219',id=26,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBUGvAJW8rYe/hjaW/hFZe4neO1wzdrge/WiC/SnDk7t8/AXKetmZ8zo2NHECOEnhI/cR+zSyaxyLqYdEo4m6l7dGZQlwDucN9SIoLiq2LpSC0tXmPTDFsuOTXYjC2rzw==',key_name='tempest-keypair-2064130855',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3cd52d70d1a4be8ae891298ff7e1018',ramdisk_id='',reservation_id='r-w3qpedx4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-1916815153',owner_user_name='tempest-ServersTestManualDisk-1916815153-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ff53b25ec85543eeb2bdea04a6eeaac4',uuid=1606eea3-5389-4437-b0f9-cfe6084d7871,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.176 187212 DEBUG nova.network.os_vif_util [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converting VIF {"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.177 187212 DEBUG nova.network.os_vif_util [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.177 187212 DEBUG os_vif [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.179 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc72089e0-49, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:35 compute-0 podman[218615]: 2025-12-05 12:02:35.18316417 +0000 UTC m=+0.044509043 container remove 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.186 187212 INFO os_vif [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49')
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.187 187212 INFO nova.virt.libvirt.driver [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Deleting instance files /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871_del
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.188 187212 INFO nova.virt.libvirt.driver [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Deletion of /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871_del complete
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.188 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c3dc55b8-381e-49ea-8236-a2cc6661a01f]: (4, ('Fri Dec  5 12:02:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b (98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1)\n98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1\nFri Dec  5 12:02:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b (98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1)\n98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.192 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[467066ee-eb7c-4989-830d-61dff951a6bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.193 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap904b3233-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:35 compute-0 kernel: tap904b3233-f0: left promiscuous mode
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.195 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.206 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.209 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c12c75f7-c5e9-4d2c-afed-6bbed9665ccf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.218 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.218 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.224 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.224 187212 INFO nova.compute.claims [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.233 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[71ff184b-da5f-465b-86d4-e13d144c1ab6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.235 187212 INFO nova.compute.manager [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.235 187212 DEBUG oslo.service.loopingcall [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.235 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84468348-a198-4008-8d79-b69054dd61d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.235 187212 DEBUG nova.compute.manager [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.235 187212 DEBUG nova.network.neutron [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.252 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bab32d45-1bb6-43eb-86d6-93778f0edc74]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349256, 'reachable_time': 30173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218635, 'error': None, 'target': 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.255 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.255 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[4dcfa2a7-ccc6-4b65-9ab1-b250489116b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d904b3233\x2dfdc6\x2d4df0\x2db02a\x2df30a1e47627b.mount: Deactivated successfully.
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.354 187212 DEBUG nova.compute.provider_tree [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.369 187212 DEBUG nova.scheduler.client.report [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.393 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.394 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:02:35 compute-0 sshd-session[218547]: Connection reset by authenticating user root 45.140.17.124 port 28344 [preauth]
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.441 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.442 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.465 187212 INFO nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.503 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.613 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.615 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.615 187212 INFO nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Creating image(s)
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.616 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.616 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.617 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.630 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.662 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Updating instance_info_cache with network_info: [{"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.666 187212 DEBUG nova.compute.manager [req-5f9fdda8-f52c-4d14-99eb-1c9167fd6a7f req-b15a3d5b-418e-41ec-b05f-40016a081e39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-deleted-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.692 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.693 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance network_info: |[{"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.693 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.693 187212 DEBUG nova.network.neutron [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Refreshing network info cache for port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.697 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start _get_guest_xml network_info=[{"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.700 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.701 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.702 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.718 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.755 187212 WARNING nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.766 187212 DEBUG nova.virt.libvirt.host [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.766 187212 DEBUG nova.virt.libvirt.host [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.770 187212 DEBUG nova.virt.libvirt.host [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.770 187212 DEBUG nova.virt.libvirt.host [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.771 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.771 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.777 187212 DEBUG nova.virt.libvirt.vif [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1705515982',display_name='tempest-ImagesTestJSON-server-1705515982',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1705515982',id=28,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-9ph9qh0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:27Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=9efa988a-19ae-440a-8a56-0bac68cb3c9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.777 187212 DEBUG nova.network.os_vif_util [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.778 187212 DEBUG nova.network.os_vif_util [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.779 187212 DEBUG nova.objects.instance [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.789 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.789 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.810 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <uuid>9efa988a-19ae-440a-8a56-0bac68cb3c9e</uuid>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <name>instance-0000001c</name>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesTestJSON-server-1705515982</nova:name>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:02:35</nova:creationTime>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:02:35 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:02:35 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:02:35 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:02:35 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:02:35 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:02:35 compute-0 nova_compute[187208]:         <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec 05 12:02:35 compute-0 nova_compute[187208]:         <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:02:35 compute-0 nova_compute[187208]:         <nova:port uuid="82089bf4-207e-4880-b8ff-9bf09a4ac3fb">
Dec 05 12:02:35 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <system>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <entry name="serial">9efa988a-19ae-440a-8a56-0bac68cb3c9e</entry>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <entry name="uuid">9efa988a-19ae-440a-8a56-0bac68cb3c9e</entry>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     </system>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <os>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   </os>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <features>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   </features>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.config"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:53:25:56"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <target dev="tap82089bf4-20"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/console.log" append="off"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <video>
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     </video>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:02:35 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:02:35 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:02:35 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:02:35 compute-0 nova_compute[187208]: </domain>
Dec 05 12:02:35 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.812 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Preparing to wait for external event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.813 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.813 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.813 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.814 187212 DEBUG nova.virt.libvirt.vif [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1705515982',display_name='tempest-ImagesTestJSON-server-1705515982',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1705515982',id=28,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-9ph9qh0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:27Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=9efa988a-19ae-440a-8a56-0bac68cb3c9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.815 187212 DEBUG nova.network.os_vif_util [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.815 187212 DEBUG nova.network.os_vif_util [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.816 187212 DEBUG os_vif [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.816 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.817 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.817 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.820 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.821 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82089bf4-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.821 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82089bf4-20, col_values=(('external_ids', {'iface-id': '82089bf4-207e-4880-b8ff-9bf09a4ac3fb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:25:56', 'vm-uuid': '9efa988a-19ae-440a-8a56-0bac68cb3c9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:35 compute-0 NetworkManager[55691]: <info>  [1764936155.8244] manager: (tap82089bf4-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.827 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.831 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.832 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.851 187212 INFO os_vif [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20')
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.889 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.890 187212 DEBUG nova.virt.disk.api [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Checking if we can resize image /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.891 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.891 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.893 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.909 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.936 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.936 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.936 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:53:25:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.937 187212 INFO nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Using config drive
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.944 187212 DEBUG nova.policy [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '79758a6c7516459bb1907270241d266a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '342e6d694cf6482c9f1b7557a17bce60', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.960 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.961 187212 DEBUG nova.virt.disk.api [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Cannot resize image /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.962 187212 DEBUG nova.objects.instance [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'migration_context' on Instance uuid bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.978 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.979 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.980 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.980 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.981 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:35 compute-0 nova_compute[187208]: 2025-12-05 12:02:35.981 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.006 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.007 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.050 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.051 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.064 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.121 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.126 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.127 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.128 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.145 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.200 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.201 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.eph0 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.233 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.eph0 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.234 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.234 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.297 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.298 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.298 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Ensure instance console log exists: /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.299 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.299 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.299 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.461 187212 DEBUG nova.network.neutron [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.477 187212 INFO nova.compute.manager [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Took 1.24 seconds to deallocate network for instance.
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.591 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.592 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.683 187212 DEBUG nova.compute.provider_tree [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.698 187212 DEBUG nova.scheduler.client.report [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.720 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.757 187212 INFO nova.scheduler.client.report [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Deleted allocations for instance 1606eea3-5389-4437-b0f9-cfe6084d7871
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.837 187212 DEBUG nova.compute.manager [req-fac4c2fb-eeca-4675-85cf-78e81a664d28 req-94089a6b-0687-4204-abd7-2ac25bc5073b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-deleted-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.840 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.967s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:36.897 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.902 187212 INFO nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Creating config drive at /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.config
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.906 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ersl1ph execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.925 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936141.9238977, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.926 187212 INFO nova.compute.manager [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Stopped (Lifecycle Event)
Dec 05 12:02:36 compute-0 nova_compute[187208]: 2025-12-05 12:02:36.951 187212 DEBUG nova.compute.manager [None req-a41ee703-2c98-4540-9e13-a6d8a24dd1a5 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.034 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ersl1ph" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:02:37 compute-0 kernel: tap82089bf4-20: entered promiscuous mode
Dec 05 12:02:37 compute-0 ovn_controller[95610]: 2025-12-05T12:02:37Z|00203|binding|INFO|Claiming lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb for this chassis.
Dec 05 12:02:37 compute-0 systemd-udevd[218553]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:02:37 compute-0 NetworkManager[55691]: <info>  [1764936157.1267] manager: (tap82089bf4-20): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Dec 05 12:02:37 compute-0 ovn_controller[95610]: 2025-12-05T12:02:37Z|00204|binding|INFO|82089bf4-207e-4880-b8ff-9bf09a4ac3fb: Claiming fa:16:3e:53:25:56 10.100.0.7
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.136 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:25:56 10.100.0.7'], port_security=['fa:16:3e:53:25:56 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9efa988a-19ae-440a-8a56-0bac68cb3c9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=82089bf4-207e-4880-b8ff-9bf09a4ac3fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.138 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.140 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.144 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:37 compute-0 ovn_controller[95610]: 2025-12-05T12:02:37Z|00205|binding|INFO|Setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb ovn-installed in OVS
Dec 05 12:02:37 compute-0 ovn_controller[95610]: 2025-12-05T12:02:37Z|00206|binding|INFO|Setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb up in Southbound
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.147 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:37 compute-0 NetworkManager[55691]: <info>  [1764936157.1542] device (tap82089bf4-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:02:37 compute-0 NetworkManager[55691]: <info>  [1764936157.1552] device (tap82089bf4-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.150 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[272335ef-db9f-4114-81e5-c34568bb286b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.151 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.157 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.157 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a305e5aa-00f4-49a4-9b12-6f1a8aafb1b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.158 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b5283076-864c-4047-bebf-4e9f57a6eeeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.170 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b13267c0-97f5-4d14-9a01-4063a7518ae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 systemd-machined[153543]: New machine qemu-32-instance-0000001c.
Dec 05 12:02:37 compute-0 systemd[1]: Started Virtual Machine qemu-32-instance-0000001c.
Dec 05 12:02:37 compute-0 podman[218681]: 2025-12-05 12:02:37.196327132 +0000 UTC m=+0.079320067 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.199 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54957442-e824-44f5-83fc-4e8eb45ac0f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.237 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[49157da1-adc4-473a-8a41-116aa4b91abe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.246 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca04bc0-7b2a-4059-b72c-1d86fb13393f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 NetworkManager[55691]: <info>  [1764936157.2467] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.272 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3d8938c0-5699-4433-9479-16a6e7ac08c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.274 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc15d67-69f8-45de-bf83-ebefba5ea24b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 NetworkManager[55691]: <info>  [1764936157.2943] device (tap41b3b495-c0): carrier: link connected
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.299 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[81e0ae70-2e41-4435-a08a-5920da16755b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f557dd8-b923-4118-b6e4-3f8afd4386b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353785, 'reachable_time': 36100, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218743, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.330 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[86a6c34e-30cb-4647-ab5e-1acb7bb5c3dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353785, 'tstamp': 353785}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218744, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.344 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff6eddd-beb1-460b-a29f-d07310db2ae3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353785, 'reachable_time': 36100, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218745, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.367 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[69615094-f217-47d8-bcd4-76a095b3ef98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.417 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6b5c9de8-ceab-4749-ac42-f60d6093b68c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.419 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.419 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.420 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:37 compute-0 NetworkManager[55691]: <info>  [1764936157.4234] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.422 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:37 compute-0 kernel: tap41b3b495-c0: entered promiscuous mode
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.437 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.439 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.440 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:37 compute-0 ovn_controller[95610]: 2025-12-05T12:02:37Z|00207|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.451 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.453 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.454 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78194e8f-ce7b-49f1-9eb1-ce959e418412]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.455 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:02:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.456 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.526 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:37 compute-0 sshd-session[218636]: Connection reset by authenticating user root 45.140.17.124 port 28356 [preauth]
Dec 05 12:02:37 compute-0 podman[218779]: 2025-12-05 12:02:37.806638144 +0000 UTC m=+0.049942777 container create 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.828 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936157.8281279, 9efa988a-19ae-440a-8a56-0bac68cb3c9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.829 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] VM Started (Lifecycle Event)
Dec 05 12:02:37 compute-0 systemd[1]: Started libpod-conmon-5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5.scope.
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.848 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.852 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936157.828551, 9efa988a-19ae-440a-8a56-0bac68cb3c9e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.852 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] VM Paused (Lifecycle Event)
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.867 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:37 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.870 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df6dbe72f381b99c1a982824974be4cad9c714499ad9f3e0b3b9579fdd54f357/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:02:37 compute-0 podman[218779]: 2025-12-05 12:02:37.779488069 +0000 UTC m=+0.022792712 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:02:37 compute-0 podman[218779]: 2025-12-05 12:02:37.887119843 +0000 UTC m=+0.130424476 container init 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 12:02:37 compute-0 nova_compute[187208]: 2025-12-05 12:02:37.890 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:37 compute-0 podman[218779]: 2025-12-05 12:02:37.892839436 +0000 UTC m=+0.136144069 container start 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:02:37 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [NOTICE]   (218801) : New worker (218803) forked
Dec 05 12:02:37 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [NOTICE]   (218801) : Loading success.
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.092 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.093 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.184 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.221 187212 DEBUG nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-unplugged-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.222 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.222 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.223 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.223 187212 DEBUG nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] No waiting events found dispatching network-vif-unplugged-c72089e0-4937-40b6-86b5-f9d6d0982058 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.223 187212 WARNING nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received unexpected event network-vif-unplugged-c72089e0-4937-40b6-86b5-f9d6d0982058 for instance with vm_state deleted and task_state None.
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.224 187212 DEBUG nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.224 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.224 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.225 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.225 187212 DEBUG nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] No waiting events found dispatching network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.225 187212 WARNING nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received unexpected event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 for instance with vm_state deleted and task_state None.
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.253 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.254 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.308 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.318 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.318 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.332 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.404 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.405 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.410 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.410 187212 INFO nova.compute.claims [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.497 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.498 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5596MB free_disk=73.29545211791992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.498 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.590 187212 DEBUG nova.compute.provider_tree [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.644 187212 DEBUG nova.scheduler.client.report [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.678 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.679 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.681 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.732 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.733 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.746 187212 INFO nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.761 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.763 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 9efa988a-19ae-440a-8a56-0bac68cb3c9e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.764 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.764 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.764 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.764 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.840 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.841 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.841 187212 INFO nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Creating image(s)
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.842 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.842 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.843 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.856 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.882 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.900 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.924 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.925 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.926 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.927 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.928 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:38 compute-0 nova_compute[187208]: 2025-12-05 12:02:38.944 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.026 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.027 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.069 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.070 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.070 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.131 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.133 187212 DEBUG nova.virt.disk.api [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Checking if we can resize image /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.134 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.188 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.190 187212 DEBUG nova.virt.disk.api [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Cannot resize image /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.190 187212 DEBUG nova.objects.instance [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lazy-loading 'migration_context' on Instance uuid fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.206 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.207 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Ensure instance console log exists: /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.207 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.208 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.208 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.216 187212 DEBUG nova.network.neutron [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Updated VIF entry in instance network info cache for port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.217 187212 DEBUG nova.network.neutron [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Updating instance_info_cache with network_info: [{"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.233 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.556 187212 DEBUG nova.policy [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e4a0640c63a14775b62a4d40c4860519', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6a2f4fffdace4b2fa0e0b6cdfc1055f5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.788 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Successfully created port: b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.926 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.927 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:02:39 compute-0 nova_compute[187208]: 2025-12-05 12:02:39.927 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:02:40 compute-0 sshd-session[218812]: Connection reset by authenticating user root 45.140.17.124 port 28370 [preauth]
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.333 187212 DEBUG nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.334 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.334 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.334 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.335 187212 DEBUG nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Processing event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.335 187212 DEBUG nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.335 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.336 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.336 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.336 187212 DEBUG nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] No waiting events found dispatching network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.336 187212 WARNING nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received unexpected event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb for instance with vm_state building and task_state spawning.
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.338 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.343 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.343 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936160.3430412, 9efa988a-19ae-440a-8a56-0bac68cb3c9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.344 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] VM Resumed (Lifecycle Event)
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.347 187212 INFO nova.virt.libvirt.driver [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance spawned successfully.
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.348 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.368 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.370 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.375 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.378 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.378 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.379 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.379 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.380 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.380 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.413 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.455 187212 INFO nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 13.11 seconds to spawn the instance on the hypervisor.
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.455 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.529 187212 INFO nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 13.68 seconds to build instance.
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.549 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:40 compute-0 nova_compute[187208]: 2025-12-05 12:02:40.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:41 compute-0 nova_compute[187208]: 2025-12-05 12:02:41.601 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Successfully created port: d067fc33-ba4d-48f6-98f5-51ebca4adbc5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:02:41 compute-0 sshd-session[218836]: Connection reset by authenticating user root 45.140.17.124 port 28374 [preauth]
Dec 05 12:02:41 compute-0 nova_compute[187208]: 2025-12-05 12:02:41.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:42 compute-0 nova_compute[187208]: 2025-12-05 12:02:42.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:02:42 compute-0 nova_compute[187208]: 2025-12-05 12:02:42.529 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:42 compute-0 nova_compute[187208]: 2025-12-05 12:02:42.902 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Successfully updated port: b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:02:42 compute-0 nova_compute[187208]: 2025-12-05 12:02:42.918 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:42 compute-0 nova_compute[187208]: 2025-12-05 12:02:42.919 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquired lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:42 compute-0 nova_compute[187208]: 2025-12-05 12:02:42.919 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:02:43 compute-0 nova_compute[187208]: 2025-12-05 12:02:43.214 187212 DEBUG nova.compute.manager [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-changed-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:43 compute-0 nova_compute[187208]: 2025-12-05 12:02:43.214 187212 DEBUG nova.compute.manager [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Refreshing instance network info cache due to event network-changed-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:02:43 compute-0 nova_compute[187208]: 2025-12-05 12:02:43.215 187212 DEBUG oslo_concurrency.lockutils [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:43 compute-0 nova_compute[187208]: 2025-12-05 12:02:43.258 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.181 187212 DEBUG oslo_concurrency.lockutils [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.182 187212 DEBUG oslo_concurrency.lockutils [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.182 187212 DEBUG nova.compute.manager [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.186 187212 DEBUG nova.compute.manager [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.186 187212 DEBUG nova.objects.instance [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'flavor' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.213 187212 DEBUG nova.virt.libvirt.driver [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.532 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Successfully updated port: d067fc33-ba4d-48f6-98f5-51ebca4adbc5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.552 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.553 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquired lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.553 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:02:44 compute-0 ovn_controller[95610]: 2025-12-05T12:02:44Z|00208|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.626 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:44 compute-0 ovn_controller[95610]: 2025-12-05T12:02:44Z|00209|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:44 compute-0 nova_compute[187208]: 2025-12-05 12:02:44.915 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.030 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updating instance_info_cache with network_info: [{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.067 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Releasing lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.067 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance network_info: |[{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.068 187212 DEBUG oslo_concurrency.lockutils [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.068 187212 DEBUG nova.network.neutron [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Refreshing network info cache for port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.072 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start _get_guest_xml network_info=[{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 1, 'encrypted': False, 'device_name': '/dev/vdb', 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio'}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.076 187212 WARNING nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.080 187212 DEBUG nova.virt.libvirt.host [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.081 187212 DEBUG nova.virt.libvirt.host [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.084 187212 DEBUG nova.virt.libvirt.host [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.084 187212 DEBUG nova.virt.libvirt.host [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.085 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.085 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:01:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={hw_rng:allowed='True'},flavorid='307317883',id=25,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_1-1528646215',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.086 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.086 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.086 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.087 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.087 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.087 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.088 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.088 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.088 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.089 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.093 187212 DEBUG nova.virt.libvirt.vif [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(25),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-479694898',id=29,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=25,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-70canao4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.093 187212 DEBUG nova.network.os_vif_util [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.094 187212 DEBUG nova.network.os_vif_util [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.095 187212 DEBUG nova.objects.instance [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'pci_devices' on Instance uuid bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.118 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <uuid>bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77</uuid>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <name>instance-0000001d</name>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-479694898</nova:name>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:02:45</nova:creationTime>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <nova:flavor name="tempest-flavor_with_ephemeral_1-1528646215">
Dec 05 12:02:45 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:02:45 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:02:45 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:02:45 compute-0 nova_compute[187208]:         <nova:ephemeral>1</nova:ephemeral>
Dec 05 12:02:45 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:02:45 compute-0 nova_compute[187208]:         <nova:user uuid="79758a6c7516459bb1907270241d266a">tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member</nova:user>
Dec 05 12:02:45 compute-0 nova_compute[187208]:         <nova:project uuid="342e6d694cf6482c9f1b7557a17bce60">tempest-ServersWithSpecificFlavorTestJSON-1976479976</nova:project>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:02:45 compute-0 nova_compute[187208]:         <nova:port uuid="b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff">
Dec 05 12:02:45 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <system>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <entry name="serial">bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77</entry>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <entry name="uuid">bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77</entry>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </system>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <os>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   </os>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <features>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   </features>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.eph0"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <target dev="vdb" bus="virtio"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.config"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:2e:33:fe"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <target dev="tapb5ee44c8-34"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/console.log" append="off"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <video>
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </video>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:02:45 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:02:45 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:02:45 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:02:45 compute-0 nova_compute[187208]: </domain>
Dec 05 12:02:45 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.120 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Preparing to wait for external event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.120 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.121 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.121 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.122 187212 DEBUG nova.virt.libvirt.vif [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(25),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-479694898',id=29,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=25,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-70canao4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.122 187212 DEBUG nova.network.os_vif_util [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.123 187212 DEBUG nova.network.os_vif_util [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.123 187212 DEBUG os_vif [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.124 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.124 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.124 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.127 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.127 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5ee44c8-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.128 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5ee44c8-34, col_values=(('external_ids', {'iface-id': 'b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:33:fe', 'vm-uuid': 'bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:45 compute-0 NetworkManager[55691]: <info>  [1764936165.1307] manager: (tapb5ee44c8-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.133 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.141 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.142 187212 INFO os_vif [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34')
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.367 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.368 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.368 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.368 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No VIF found with MAC fa:16:3e:2e:33:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.369 187212 INFO nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Using config drive
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.379 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936150.3783724, adc15883-b705-42dd-ac95-04f4b8964012 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.380 187212 INFO nova.compute.manager [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] VM Stopped (Lifecycle Event)
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.398 187212 DEBUG nova.compute.manager [None req-e5e43849-eccc-47f7-b43b-c1a21d7eaffe - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.901 187212 INFO nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Creating config drive at /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.config
Dec 05 12:02:45 compute-0 nova_compute[187208]: 2025-12-05 12:02:45.907 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpextcoanm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.035 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpextcoanm" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:46 compute-0 kernel: tapb5ee44c8-34: entered promiscuous mode
Dec 05 12:02:46 compute-0 NetworkManager[55691]: <info>  [1764936166.1260] manager: (tapb5ee44c8-34): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.137 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.141 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:46 compute-0 ovn_controller[95610]: 2025-12-05T12:02:46Z|00210|binding|INFO|Claiming lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff for this chassis.
Dec 05 12:02:46 compute-0 ovn_controller[95610]: 2025-12-05T12:02:46Z|00211|binding|INFO|b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff: Claiming fa:16:3e:2e:33:fe 10.100.0.9
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.151 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:33:fe 10.100.0.9'], port_security=['fa:16:3e:2e:33:fe 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '342e6d694cf6482c9f1b7557a17bce60', 'neutron:revision_number': '2', 'neutron:security_group_ids': '710ea28e-d1ba-4c63-a751-16b460b2129b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a85cd729-c72e-4d3c-b444-ff0b42d436ff, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.152 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff in datapath 393d33f9-2dde-4fb5-b5db-3f0fb98d4637 bound to our chassis
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.154 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.167 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bca352b9-6192-4896-abaa-f5d153379bc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.168 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap393d33f9-21 in ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.171 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap393d33f9-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.171 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[febee430-1b05-4cad-97d1-36cddfbf0d25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.173 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[974287bd-cabb-48b0-8bb0-5cdecbcc302b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 systemd-udevd[218869]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.184 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[4ebbf65a-3ea3-4beb-9780-394d24abe194]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 systemd-machined[153543]: New machine qemu-33-instance-0000001d.
Dec 05 12:02:46 compute-0 ovn_controller[95610]: 2025-12-05T12:02:46Z|00212|binding|INFO|Setting lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff ovn-installed in OVS
Dec 05 12:02:46 compute-0 ovn_controller[95610]: 2025-12-05T12:02:46Z|00213|binding|INFO|Setting lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff up in Southbound
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.196 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:46 compute-0 NetworkManager[55691]: <info>  [1764936166.2006] device (tapb5ee44c8-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:02:46 compute-0 systemd[1]: Started Virtual Machine qemu-33-instance-0000001d.
Dec 05 12:02:46 compute-0 NetworkManager[55691]: <info>  [1764936166.2014] device (tapb5ee44c8-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.208 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[887b6b1e-b724-4571-9643-8b9e5775bf0f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 podman[218850]: 2025-12-05 12:02:46.218590917 +0000 UTC m=+0.107007608 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.241 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[509f4b85-1858-4abd-a6de-f23579b4a14a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.262 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4d2fb4-494e-4174-9bfb-4f15515c8eb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 NetworkManager[55691]: <info>  [1764936166.2653] manager: (tap393d33f9-20): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.295 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2b3ec3fc-e25e-416a-850a-23a8256664c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.299 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[02ab0d77-8d4f-49a5-bffe-2a37f39a6b7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 NetworkManager[55691]: <info>  [1764936166.3213] device (tap393d33f9-20): carrier: link connected
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.327 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[55136a13-d4b8-40b6-a343-e842315e32e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.342 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f16fde2-cedb-482b-b686-c73a97093988]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap393d33f9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:b1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354688, 'reachable_time': 36852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218909, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.359 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72b76ae2-6228-480e-aa05-c133b6fd235a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe23:b198'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 354688, 'tstamp': 354688}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218910, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.373 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3557c0-d83a-46e0-a87f-fb3e504258ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap393d33f9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:b1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354688, 'reachable_time': 36852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218911, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.399 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c037f1-da00-461e-92e7-0ebed0b93076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.458 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c978c579-7032-4e29-bb61-2c962f9655f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.460 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap393d33f9-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.460 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.460 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap393d33f9-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:46 compute-0 kernel: tap393d33f9-20: entered promiscuous mode
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.462 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:46 compute-0 NetworkManager[55691]: <info>  [1764936166.4654] manager: (tap393d33f9-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.467 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap393d33f9-20, col_values=(('external_ids', {'iface-id': '4f5e3c8a-5273-4414-820c-16ae051153f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:46 compute-0 ovn_controller[95610]: 2025-12-05T12:02:46Z|00214|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.472 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.472 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8152e76d-7945-4402-b84a-d01feea87b77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.473 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:02:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.474 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'env', 'PROCESS_TAG=haproxy-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.479 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.508 187212 DEBUG nova.compute.manager [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received event network-changed-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.509 187212 DEBUG nova.compute.manager [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Refreshing instance network info cache due to event network-changed-d067fc33-ba4d-48f6-98f5-51ebca4adbc5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.510 187212 DEBUG oslo_concurrency.lockutils [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.555 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936166.554855, bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.555 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] VM Started (Lifecycle Event)
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.574 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.587 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936166.5555856, bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.587 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] VM Paused (Lifecycle Event)
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.609 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.612 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.630 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.868 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Updating instance_info_cache with network_info: [{"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:46 compute-0 podman[218950]: 2025-12-05 12:02:46.882049557 +0000 UTC m=+0.108418817 container create 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 05 12:02:46 compute-0 podman[218950]: 2025-12-05 12:02:46.794828236 +0000 UTC m=+0.021197506 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:02:46 compute-0 systemd[1]: Started libpod-conmon-1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5.scope.
Dec 05 12:02:46 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:02:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61c2e9c401d8673a2e5697d0fcec7ebf4e56c09bb215df29178f1e818f6cd815/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.961 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Releasing lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.961 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance network_info: |[{"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.962 187212 DEBUG oslo_concurrency.lockutils [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.962 187212 DEBUG nova.network.neutron [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Refreshing network info cache for port d067fc33-ba4d-48f6-98f5-51ebca4adbc5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.965 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start _get_guest_xml network_info=[{"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.970 187212 WARNING nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.976 187212 DEBUG nova.virt.libvirt.host [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.976 187212 DEBUG nova.virt.libvirt.host [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.981 187212 DEBUG nova.virt.libvirt.host [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.982 187212 DEBUG nova.virt.libvirt.host [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.982 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.982 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.982 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.984 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.984 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.984 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.987 187212 DEBUG nova.virt.libvirt.vif [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1938885940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1938885940',id=30,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a2f4fffdace4b2fa0e0b6cdfc1055f5',ramdisk_id='',reservation_id='r-n0y57b39',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-44253202',owner_user_name='tempest-InstanceActionsV221TestJSON-44253202-project-member
'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:38Z,user_data=None,user_id='e4a0640c63a14775b62a4d40c4860519',uuid=fe8aefc3-96cb-4d4e-a684-1453a7df2fa1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.988 187212 DEBUG nova.network.os_vif_util [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converting VIF {"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.988 187212 DEBUG nova.network.os_vif_util [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:46 compute-0 nova_compute[187208]: 2025-12-05 12:02:46.989 187212 DEBUG nova.objects.instance [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.004 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <uuid>fe8aefc3-96cb-4d4e-a684-1453a7df2fa1</uuid>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <name>instance-0000001e</name>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <nova:name>tempest-InstanceActionsV221TestJSON-server-1938885940</nova:name>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:02:46</nova:creationTime>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:02:47 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:02:47 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:02:47 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:02:47 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:02:47 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:02:47 compute-0 nova_compute[187208]:         <nova:user uuid="e4a0640c63a14775b62a4d40c4860519">tempest-InstanceActionsV221TestJSON-44253202-project-member</nova:user>
Dec 05 12:02:47 compute-0 nova_compute[187208]:         <nova:project uuid="6a2f4fffdace4b2fa0e0b6cdfc1055f5">tempest-InstanceActionsV221TestJSON-44253202</nova:project>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:02:47 compute-0 nova_compute[187208]:         <nova:port uuid="d067fc33-ba4d-48f6-98f5-51ebca4adbc5">
Dec 05 12:02:47 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <system>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <entry name="serial">fe8aefc3-96cb-4d4e-a684-1453a7df2fa1</entry>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <entry name="uuid">fe8aefc3-96cb-4d4e-a684-1453a7df2fa1</entry>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     </system>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <os>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   </os>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <features>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   </features>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.config"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:cf:98:15"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <target dev="tapd067fc33-ba"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/console.log" append="off"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <video>
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     </video>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:02:47 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:02:47 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:02:47 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:02:47 compute-0 nova_compute[187208]: </domain>
Dec 05 12:02:47 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.004 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Preparing to wait for external event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.004 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.005 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.005 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.005 187212 DEBUG nova.virt.libvirt.vif [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1938885940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1938885940',id=30,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a2f4fffdace4b2fa0e0b6cdfc1055f5',ramdisk_id='',reservation_id='r-n0y57b39',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-44253202',owner_user_name='tempest-InstanceActionsV221TestJSON-44253202-proj
ect-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:38Z,user_data=None,user_id='e4a0640c63a14775b62a4d40c4860519',uuid=fe8aefc3-96cb-4d4e-a684-1453a7df2fa1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.006 187212 DEBUG nova.network.os_vif_util [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converting VIF {"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.006 187212 DEBUG nova.network.os_vif_util [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.006 187212 DEBUG os_vif [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.007 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.007 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.008 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.011 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.011 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd067fc33-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.012 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd067fc33-ba, col_values=(('external_ids', {'iface-id': 'd067fc33-ba4d-48f6-98f5-51ebca4adbc5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:98:15', 'vm-uuid': 'fe8aefc3-96cb-4d4e-a684-1453a7df2fa1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.013 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:47 compute-0 NetworkManager[55691]: <info>  [1764936167.0143] manager: (tapd067fc33-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.016 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.020 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.021 187212 INFO os_vif [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba')
Dec 05 12:02:47 compute-0 podman[218950]: 2025-12-05 12:02:47.035951713 +0000 UTC m=+0.262321003 container init 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:02:47 compute-0 podman[218950]: 2025-12-05 12:02:47.041571964 +0000 UTC m=+0.267941224 container start 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:02:47 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [NOTICE]   (218972) : New worker (218974) forked
Dec 05 12:02:47 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [NOTICE]   (218972) : Loading success.
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.077 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.078 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.078 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] No VIF found with MAC fa:16:3e:cf:98:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.079 187212 INFO nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Using config drive
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.530 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.612 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.612 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.628 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.722 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.722 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.728 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.728 187212 INFO nova.compute.claims [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.901 187212 DEBUG nova.compute.provider_tree [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.906 187212 INFO nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Creating config drive at /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.config
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.910 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeyo1hah9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.933 187212 DEBUG nova.scheduler.client.report [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.959 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:47 compute-0 nova_compute[187208]: 2025-12-05 12:02:47.959 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.005 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.005 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.030 187212 INFO nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.041 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeyo1hah9" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.048 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:02:48 compute-0 kernel: tapd067fc33-ba: entered promiscuous mode
Dec 05 12:02:48 compute-0 NetworkManager[55691]: <info>  [1764936168.1057] manager: (tapd067fc33-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Dec 05 12:02:48 compute-0 systemd-udevd[218902]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:48 compute-0 ovn_controller[95610]: 2025-12-05T12:02:48Z|00215|binding|INFO|Claiming lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 for this chassis.
Dec 05 12:02:48 compute-0 ovn_controller[95610]: 2025-12-05T12:02:48Z|00216|binding|INFO|d067fc33-ba4d-48f6-98f5-51ebca4adbc5: Claiming fa:16:3e:cf:98:15 10.100.0.10
Dec 05 12:02:48 compute-0 NetworkManager[55691]: <info>  [1764936168.1251] device (tapd067fc33-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:02:48 compute-0 NetworkManager[55691]: <info>  [1764936168.1264] device (tapd067fc33-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:02:48 compute-0 systemd-machined[153543]: New machine qemu-34-instance-0000001e.
Dec 05 12:02:48 compute-0 ovn_controller[95610]: 2025-12-05T12:02:48Z|00217|binding|INFO|Setting lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 ovn-installed in OVS
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:48 compute-0 systemd[1]: Started Virtual Machine qemu-34-instance-0000001e.
Dec 05 12:02:48 compute-0 ovn_controller[95610]: 2025-12-05T12:02:48Z|00218|binding|INFO|Setting lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 up in Southbound
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.439 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:98:15 10.100.0.10'], port_security=['fa:16:3e:cf:98:15 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fe8aefc3-96cb-4d4e-a684-1453a7df2fa1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a2f4fffdace4b2fa0e0b6cdfc1055f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69ff7ffc-62fc-4ff2-b5ba-0e716613e8dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50afd2c5-83ef-4c4d-9a1d-616d6eca472d, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d067fc33-ba4d-48f6-98f5-51ebca4adbc5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.440 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d067fc33-ba4d-48f6-98f5-51ebca4adbc5 in datapath 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 bound to our chassis
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.443 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.449 187212 DEBUG nova.network.neutron [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updated VIF entry in instance network info cache for port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.450 187212 DEBUG nova.network.neutron [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updating instance_info_cache with network_info: [{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.453 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[52e09350-476e-46d2-85cb-6062702988cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.454 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7364e4f7-51 in ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.457 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7364e4f7-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.457 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d928020d-85e0-487e-b654-007c0c7b2fa0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.459 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5a5f5b16-e529-4d5c-899c-2c7a0d70935a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.474 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[cbdfc457-35c2-4ae6-852f-e06fe635308b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.491 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c788a826-39f9-49d1-81d5-64c05d0c3a9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.505 187212 DEBUG oslo_concurrency.lockutils [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.518 187212 DEBUG nova.policy [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b5f1bf811e6c42d699922035de0b538c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '55d3be64e01442ca8f492d2f3e10d1cc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.520 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc8ac7b-61e0-40b7-bc1a-ad546574366d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.524 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.525 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.525 187212 INFO nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Creating image(s)
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.526 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.526 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.527 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:48 compute-0 NetworkManager[55691]: <info>  [1764936168.5275] manager: (tap7364e4f7-50): new Veth device (/org/freedesktop/NetworkManager/Devices/92)
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.526 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[654af35f-a83a-46df-b4ec-75378276e0d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.548 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936168.5221484, fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.548 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] VM Started (Lifecycle Event)
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.550 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.567 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b69bee7f-e811-43f0-8885-067db3924e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.572 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[be88aeb8-e09b-497b-a9c1-ac1815feeb99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.582 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.587 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936168.5224235, fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.588 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] VM Paused (Lifecycle Event)
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.596 187212 DEBUG nova.compute.manager [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.596 187212 DEBUG oslo_concurrency.lockutils [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.596 187212 DEBUG oslo_concurrency.lockutils [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.596 187212 DEBUG oslo_concurrency.lockutils [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.597 187212 DEBUG nova.compute.manager [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Processing event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.597 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.602 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:02:48 compute-0 NetworkManager[55691]: <info>  [1764936168.6038] device (tap7364e4f7-50): carrier: link connected
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.607 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.611 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.612 187212 INFO nova.virt.libvirt.driver [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance spawned successfully.
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.613 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.621 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.621 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.622 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.610 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3d1f26-325b-409b-b979-df9f2d7f54ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.634 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.641 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f65194e-0e3e-4f86-a2b6-d55647b39633]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7364e4f7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:e4:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354916, 'reachable_time': 36098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219028, 'error': None, 'target': 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.659 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.659 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936168.6013017, bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.660 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] VM Resumed (Lifecycle Event)
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.661 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6d508e-c766-4451-b01c-0dc4edd9a2b1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:e450'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 354916, 'tstamp': 354916}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219030, 'error': None, 'target': 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.669 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.669 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.670 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.670 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.670 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.671 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.680 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[737b6254-820a-4338-a5d6-9b8103bd58ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7364e4f7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:e4:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354916, 'reachable_time': 36098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219031, 'error': None, 'target': 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.682 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.692 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.699 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.699 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.717 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[333667aa-17ec-4d35-adbf-6afa872e64b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.723 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.735 187212 INFO nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Took 13.12 seconds to spawn the instance on the hypervisor.
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.736 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.741 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.743 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.743 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.773 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8541ba2d-f688-4346-a7c2-77f0a6c287ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.774 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7364e4f7-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.775 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.775 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7364e4f7-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.777 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:48 compute-0 NetworkManager[55691]: <info>  [1764936168.7785] manager: (tap7364e4f7-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Dec 05 12:02:48 compute-0 kernel: tap7364e4f7-50: entered promiscuous mode
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.781 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7364e4f7-50, col_values=(('external_ids', {'iface-id': '4371716d-4b21-4191-b690-7541d0a79660'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.783 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:48 compute-0 ovn_controller[95610]: 2025-12-05T12:02:48Z|00219|binding|INFO|Releasing lport 4371716d-4b21-4191-b690-7541d0a79660 from this chassis (sb_readonly=0)
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.784 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.785 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dbcd3118-5806-45e6-9865-702296e75f39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.786 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7.pid.haproxy
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:02:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.787 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'env', 'PROCESS_TAG=haproxy-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.808 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.810 187212 DEBUG nova.virt.disk.api [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Checking if we can resize image /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.810 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.832 187212 INFO nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Took 13.63 seconds to build instance.
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.867 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.884 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.887 187212 DEBUG nova.virt.disk.api [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Cannot resize image /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.888 187212 DEBUG nova.objects.instance [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lazy-loading 'migration_context' on Instance uuid d70544d6-04e3-4b2a-914a-72db3052216a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.900 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.900 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Ensure instance console log exists: /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.901 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.901 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:48 compute-0 nova_compute[187208]: 2025-12-05 12:02:48.901 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:49 compute-0 podman[219074]: 2025-12-05 12:02:49.185063239 +0000 UTC m=+0.061894469 container create 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 12:02:49 compute-0 systemd[1]: Started libpod-conmon-3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f.scope.
Dec 05 12:02:49 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:02:49 compute-0 podman[219074]: 2025-12-05 12:02:49.152281202 +0000 UTC m=+0.029112452 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:02:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b94271b6f3b062c4a7ae06b3c93c6c47e697dfeccf693d310815cf2eb398a9d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:02:49 compute-0 podman[219074]: 2025-12-05 12:02:49.281945576 +0000 UTC m=+0.158776806 container init 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:02:49 compute-0 podman[219074]: 2025-12-05 12:02:49.290574492 +0000 UTC m=+0.167405722 container start 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:02:49 compute-0 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [NOTICE]   (219094) : New worker (219096) forked
Dec 05 12:02:49 compute-0 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [NOTICE]   (219094) : Loading success.
Dec 05 12:02:49 compute-0 nova_compute[187208]: 2025-12-05 12:02:49.448 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Successfully created port: 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:02:49 compute-0 nova_compute[187208]: 2025-12-05 12:02:49.576 187212 DEBUG nova.network.neutron [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Updated VIF entry in instance network info cache for port d067fc33-ba4d-48f6-98f5-51ebca4adbc5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:02:49 compute-0 nova_compute[187208]: 2025-12-05 12:02:49.576 187212 DEBUG nova.network.neutron [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Updating instance_info_cache with network_info: [{"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:49 compute-0 nova_compute[187208]: 2025-12-05 12:02:49.604 187212 DEBUG oslo_concurrency.lockutils [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:50 compute-0 nova_compute[187208]: 2025-12-05 12:02:50.154 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936155.1529152, 1606eea3-5389-4437-b0f9-cfe6084d7871 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:50 compute-0 nova_compute[187208]: 2025-12-05 12:02:50.155 187212 INFO nova.compute.manager [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] VM Stopped (Lifecycle Event)
Dec 05 12:02:50 compute-0 nova_compute[187208]: 2025-12-05 12:02:50.181 187212 DEBUG nova.compute.manager [None req-ea223c4b-90e6-4dac-ab6b-16464321c2e0 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:50 compute-0 nova_compute[187208]: 2025-12-05 12:02:50.933 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Successfully updated port: 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:02:50 compute-0 nova_compute[187208]: 2025-12-05 12:02:50.948 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:50 compute-0 nova_compute[187208]: 2025-12-05 12:02:50.948 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquired lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:50 compute-0 nova_compute[187208]: 2025-12-05 12:02:50.948 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.106 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.106 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.186 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.271 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.272 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.281 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.281 187212 INFO nova.compute.claims [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.488 187212 DEBUG nova.compute.provider_tree [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.508 187212 DEBUG nova.scheduler.client.report [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.532 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.533 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.604 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.605 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.626 187212 INFO nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.645 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.652 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.747 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.749 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.749 187212 INFO nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Creating image(s)
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.750 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.751 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.752 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.768 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.827 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.828 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.829 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.841 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.900 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.901 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.936 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.937 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.937 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.995 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.996 187212 DEBUG nova.virt.disk.api [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Checking if we can resize image /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:02:51 compute-0 nova_compute[187208]: 2025-12-05 12:02:51.997 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.057 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.058 187212 DEBUG nova.virt.disk.api [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Cannot resize image /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.059 187212 DEBUG nova.objects.instance [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'migration_context' on Instance uuid 2f42f732-65c6-4c4a-9332-47098d7350b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.075 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.076 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Ensure instance console log exists: /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.076 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.076 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.077 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:52 compute-0 podman[219136]: 2025-12-05 12:02:52.209722672 +0000 UTC m=+0.055034193 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 12:02:52 compute-0 podman[219135]: 2025-12-05 12:02:52.224038261 +0000 UTC m=+0.074385785 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.222 187212 DEBUG nova.policy [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4671f6c82ea049fab3a314ecf45b7656', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.234 187212 DEBUG nova.compute.manager [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 DEBUG oslo_concurrency.lockutils [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 DEBUG oslo_concurrency.lockutils [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 DEBUG oslo_concurrency.lockutils [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 DEBUG nova.compute.manager [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] No waiting events found dispatching network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 WARNING nova.compute.manager [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received unexpected event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff for instance with vm_state active and task_state None.
Dec 05 12:02:52 compute-0 ovn_controller[95610]: 2025-12-05T12:02:52Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:53:25:56 10.100.0.7
Dec 05 12:02:52 compute-0 ovn_controller[95610]: 2025-12-05T12:02:52Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:25:56 10.100.0.7
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.531 187212 DEBUG nova.compute.manager [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.532 187212 DEBUG oslo_concurrency.lockutils [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.532 187212 DEBUG oslo_concurrency.lockutils [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.532 187212 DEBUG oslo_concurrency.lockutils [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.532 187212 DEBUG nova.compute.manager [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Processing event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.533 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.536 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.540 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936172.5399163, fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.541 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] VM Resumed (Lifecycle Event)
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.543 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.547 187212 INFO nova.virt.libvirt.driver [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance spawned successfully.
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.547 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.567 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.574 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.577 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.578 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.578 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.578 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.579 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.579 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.613 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.657 187212 INFO nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Took 13.82 seconds to spawn the instance on the hypervisor.
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.658 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.757 187212 INFO nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Took 14.37 seconds to build instance.
Dec 05 12:02:52 compute-0 nova_compute[187208]: 2025-12-05 12:02:52.779 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.311 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updating instance_info_cache with network_info: [{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.335 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Releasing lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.335 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance network_info: |[{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.338 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start _get_guest_xml network_info=[{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.347 187212 WARNING nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.362 187212 DEBUG nova.virt.libvirt.host [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.363 187212 DEBUG nova.virt.libvirt.host [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.369 187212 DEBUG nova.virt.libvirt.host [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.370 187212 DEBUG nova.virt.libvirt.host [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.371 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.371 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.372 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.372 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.372 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.372 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.373 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.373 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.373 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.374 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.374 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.375 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.380 187212 DEBUG nova.virt.libvirt.vif [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1222437752',display_name='tempest-ImagesOneServerTestJSON-server-1222437752',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1222437752',id=31,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55d3be64e01442ca8f492d2f3e10d1cc',ramdisk_id='',reservation_id='r-zbitw7u9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1350277374',owner_user_name='tempest-ImagesOneServe
rTestJSON-1350277374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:48Z,user_data=None,user_id='b5f1bf811e6c42d699922035de0b538c',uuid=d70544d6-04e3-4b2a-914a-72db3052216a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.380 187212 DEBUG nova.network.os_vif_util [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converting VIF {"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.382 187212 DEBUG nova.network.os_vif_util [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.383 187212 DEBUG nova.objects.instance [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lazy-loading 'pci_devices' on Instance uuid d70544d6-04e3-4b2a-914a-72db3052216a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.399 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <uuid>d70544d6-04e3-4b2a-914a-72db3052216a</uuid>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <name>instance-0000001f</name>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesOneServerTestJSON-server-1222437752</nova:name>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:02:53</nova:creationTime>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:02:53 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:02:53 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:02:53 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:02:53 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:02:53 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:02:53 compute-0 nova_compute[187208]:         <nova:user uuid="b5f1bf811e6c42d699922035de0b538c">tempest-ImagesOneServerTestJSON-1350277374-project-member</nova:user>
Dec 05 12:02:53 compute-0 nova_compute[187208]:         <nova:project uuid="55d3be64e01442ca8f492d2f3e10d1cc">tempest-ImagesOneServerTestJSON-1350277374</nova:project>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:02:53 compute-0 nova_compute[187208]:         <nova:port uuid="99a1ab7f-bf64-4cc9-846c-9748ff4a93dc">
Dec 05 12:02:53 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <system>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <entry name="serial">d70544d6-04e3-4b2a-914a-72db3052216a</entry>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <entry name="uuid">d70544d6-04e3-4b2a-914a-72db3052216a</entry>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     </system>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <os>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   </os>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <features>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   </features>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.config"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:a9:8a:0c"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <target dev="tap99a1ab7f-bf"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/console.log" append="off"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <video>
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     </video>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:02:53 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:02:53 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:02:53 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:02:53 compute-0 nova_compute[187208]: </domain>
Dec 05 12:02:53 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.400 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Preparing to wait for external event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.409 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.410 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.410 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.411 187212 DEBUG nova.virt.libvirt.vif [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1222437752',display_name='tempest-ImagesOneServerTestJSON-server-1222437752',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1222437752',id=31,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55d3be64e01442ca8f492d2f3e10d1cc',ramdisk_id='',reservation_id='r-zbitw7u9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1350277374',owner_user_name='tempest-Imag
esOneServerTestJSON-1350277374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:48Z,user_data=None,user_id='b5f1bf811e6c42d699922035de0b538c',uuid=d70544d6-04e3-4b2a-914a-72db3052216a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.411 187212 DEBUG nova.network.os_vif_util [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converting VIF {"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.412 187212 DEBUG nova.network.os_vif_util [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.414 187212 DEBUG os_vif [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.415 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.416 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.416 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.421 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.422 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99a1ab7f-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.422 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap99a1ab7f-bf, col_values=(('external_ids', {'iface-id': '99a1ab7f-bf64-4cc9-846c-9748ff4a93dc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:8a:0c', 'vm-uuid': 'd70544d6-04e3-4b2a-914a-72db3052216a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.424 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:53 compute-0 NetworkManager[55691]: <info>  [1764936173.4253] manager: (tap99a1ab7f-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.427 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.431 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.432 187212 INFO os_vif [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf')
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.491 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.491 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.491 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] No VIF found with MAC fa:16:3e:a9:8a:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.492 187212 INFO nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Using config drive
Dec 05 12:02:53 compute-0 nova_compute[187208]: 2025-12-05 12:02:53.549 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Successfully created port: b785a426-63ba-453e-95dc-3aa63f9f75a9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.005 187212 INFO nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Creating config drive at /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.config
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.011 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpautor3nt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.138 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpautor3nt" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:02:54 compute-0 kernel: tap99a1ab7f-bf: entered promiscuous mode
Dec 05 12:02:54 compute-0 NetworkManager[55691]: <info>  [1764936174.2070] manager: (tap99a1ab7f-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Dec 05 12:02:54 compute-0 ovn_controller[95610]: 2025-12-05T12:02:54Z|00220|binding|INFO|Claiming lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc for this chassis.
Dec 05 12:02:54 compute-0 ovn_controller[95610]: 2025-12-05T12:02:54Z|00221|binding|INFO|99a1ab7f-bf64-4cc9-846c-9748ff4a93dc: Claiming fa:16:3e:a9:8a:0c 10.100.0.12
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.213 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.224 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:8a:0c 10.100.0.12'], port_security=['fa:16:3e:a9:8a:0c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd70544d6-04e3-4b2a-914a-72db3052216a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39702279-01de-4f4b-bc33-58c8c6f673e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55d3be64e01442ca8f492d2f3e10d1cc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7da5af47-2519-44c3-bc78-6f5347e93e10', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94b6aed5-905a-43ff-81d8-6adfe368f476, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.227 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc in datapath 39702279-01de-4f4b-bc33-58c8c6f673e3 bound to our chassis
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.230 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 39702279-01de-4f4b-bc33-58c8c6f673e3
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.241 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d80db8c2-899f-4621-a519-f023130ccca3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.243 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap39702279-01 in ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.247 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap39702279-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.247 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[46b13f17-7080-4e82-a12f-3c68257176d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.248 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bedc39ee-fc8c-4e2b-b129-49004f3dae3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 systemd-udevd[219189]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.263 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[55952aac-c8db-4a11-83c1-632ca82dd112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 NetworkManager[55691]: <info>  [1764936174.2667] device (tap99a1ab7f-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:02:54 compute-0 NetworkManager[55691]: <info>  [1764936174.2678] device (tap99a1ab7f-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.277 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:54 compute-0 ovn_controller[95610]: 2025-12-05T12:02:54Z|00222|binding|INFO|Setting lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc ovn-installed in OVS
Dec 05 12:02:54 compute-0 ovn_controller[95610]: 2025-12-05T12:02:54Z|00223|binding|INFO|Setting lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc up in Southbound
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.282 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0dff8c46-75b0-479f-bcf1-acee21c915c0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.283 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:54 compute-0 systemd-machined[153543]: New machine qemu-35-instance-0000001f.
Dec 05 12:02:54 compute-0 systemd[1]: Started Virtual Machine qemu-35-instance-0000001f.
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.324 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6184b2-4f5a-4509-a25e-ff5b6193a066]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.330 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8cc81d-9054-419a-a04a-f53df8548b48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 NetworkManager[55691]: <info>  [1764936174.3314] manager: (tap39702279-00): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.373 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c5207d-663e-4fef-a6b9-a80ffa994d94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.376 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[66677081-35c9-421e-90a2-446541e14336]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 NetworkManager[55691]: <info>  [1764936174.3996] device (tap39702279-00): carrier: link connected
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.404 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[de805b68-ca91-49a8-9a6f-9de202e99489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.441 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0dafb4e8-3a1a-4e40-a85c-7e01f6214745]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap39702279-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:7b:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355496, 'reachable_time': 44993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219224, 'error': None, 'target': 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.454 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92ce0d99-77d3-4e6b-af35-a6131441989a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:7bef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 355496, 'tstamp': 355496}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219227, 'error': None, 'target': 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.475 187212 DEBUG nova.virt.libvirt.driver [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.476 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[548a3c49-21fd-44d9-94df-4acb28392592]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap39702279-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:7b:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355496, 'reachable_time': 44993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219231, 'error': None, 'target': 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.506 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[31ea6568-2900-40a8-a973-992d609bf4d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.552 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6e5a02-f83d-4ff5-b906-f83fe94c2a2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39702279-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39702279-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:54 compute-0 NetworkManager[55691]: <info>  [1764936174.5581] manager: (tap39702279-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Dec 05 12:02:54 compute-0 kernel: tap39702279-00: entered promiscuous mode
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.557 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.560 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap39702279-00, col_values=(('external_ids', {'iface-id': '55380907-78ff-4f14-8b9a-7ccb714bf36a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:54 compute-0 ovn_controller[95610]: 2025-12-05T12:02:54Z|00224|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.578 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/39702279-01de-4f4b-bc33-58c8c6f673e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/39702279-01de-4f4b-bc33-58c8c6f673e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.579 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43d38983-b2d1-43be-9394-5098de7b5f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.580 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-39702279-01de-4f4b-bc33-58c8c6f673e3
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/39702279-01de-4f4b-bc33-58c8c6f673e3.pid.haproxy
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 39702279-01de-4f4b-bc33-58c8c6f673e3
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:02:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.580 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'env', 'PROCESS_TAG=haproxy-39702279-01de-4f4b-bc33-58c8c6f673e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/39702279-01de-4f4b-bc33-58c8c6f673e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.725 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936174.7248533, d70544d6-04e3-4b2a-914a-72db3052216a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.725 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] VM Started (Lifecycle Event)
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.747 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.752 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936174.725062, d70544d6-04e3-4b2a-914a-72db3052216a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.752 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] VM Paused (Lifecycle Event)
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.775 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.777 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:02:54 compute-0 nova_compute[187208]: 2025-12-05 12:02:54.803 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:02:54 compute-0 podman[219264]: 2025-12-05 12:02:54.982128251 +0000 UTC m=+0.053423917 container create 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:02:55 compute-0 systemd[1]: Started libpod-conmon-004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff.scope.
Dec 05 12:02:55 compute-0 podman[219264]: 2025-12-05 12:02:54.95688135 +0000 UTC m=+0.028177046 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:02:55 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:02:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc8314cf85594aa36784fe5dbf1012ba087261d9468b0e3431f9bb9c756a87d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:02:55 compute-0 podman[219264]: 2025-12-05 12:02:55.072623306 +0000 UTC m=+0.143919002 container init 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:02:55 compute-0 podman[219264]: 2025-12-05 12:02:55.077644229 +0000 UTC m=+0.148939895 container start 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:02:55 compute-0 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [NOTICE]   (219283) : New worker (219285) forked
Dec 05 12:02:55 compute-0 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [NOTICE]   (219283) : Loading success.
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.160 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.161 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.161 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.161 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.162 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.163 187212 INFO nova.compute.manager [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Terminating instance
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.166 187212 DEBUG nova.compute.manager [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:02:55 compute-0 kernel: tapd067fc33-ba (unregistering): left promiscuous mode
Dec 05 12:02:55 compute-0 NetworkManager[55691]: <info>  [1764936175.1925] device (tapd067fc33-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:55 compute-0 ovn_controller[95610]: 2025-12-05T12:02:55Z|00225|binding|INFO|Releasing lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 from this chassis (sb_readonly=0)
Dec 05 12:02:55 compute-0 ovn_controller[95610]: 2025-12-05T12:02:55Z|00226|binding|INFO|Setting lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 down in Southbound
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.198 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:55 compute-0 ovn_controller[95610]: 2025-12-05T12:02:55Z|00227|binding|INFO|Removing iface tapd067fc33-ba ovn-installed in OVS
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.205 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:98:15 10.100.0.10'], port_security=['fa:16:3e:cf:98:15 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fe8aefc3-96cb-4d4e-a684-1453a7df2fa1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a2f4fffdace4b2fa0e0b6cdfc1055f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69ff7ffc-62fc-4ff2-b5ba-0e716613e8dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50afd2c5-83ef-4c4d-9a1d-616d6eca472d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d067fc33-ba4d-48f6-98f5-51ebca4adbc5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.206 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d067fc33-ba4d-48f6-98f5-51ebca4adbc5 in datapath 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 unbound from our chassis
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.208 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.209 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc6030a-3ca7-48a2-a7f2-851ac5d1fffb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.210 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 namespace which is not needed anymore
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.209 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.251 187212 DEBUG nova.compute.manager [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-changed-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:55 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Dec 05 12:02:55 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000001e.scope: Consumed 2.909s CPU time.
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.252 187212 DEBUG nova.compute.manager [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Refreshing instance network info cache due to event network-changed-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.253 187212 DEBUG oslo_concurrency.lockutils [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.254 187212 DEBUG oslo_concurrency.lockutils [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.255 187212 DEBUG nova.network.neutron [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Refreshing network info cache for port 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:02:55 compute-0 systemd-machined[153543]: Machine qemu-34-instance-0000001e terminated.
Dec 05 12:02:55 compute-0 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [NOTICE]   (219094) : haproxy version is 2.8.14-c23fe91
Dec 05 12:02:55 compute-0 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [NOTICE]   (219094) : path to executable is /usr/sbin/haproxy
Dec 05 12:02:55 compute-0 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [WARNING]  (219094) : Exiting Master process...
Dec 05 12:02:55 compute-0 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [ALERT]    (219094) : Current worker (219096) exited with code 143 (Terminated)
Dec 05 12:02:55 compute-0 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [WARNING]  (219094) : All workers exited. Exiting... (0)
Dec 05 12:02:55 compute-0 systemd[1]: libpod-3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f.scope: Deactivated successfully.
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.343 187212 DEBUG nova.compute.manager [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.344 187212 DEBUG oslo_concurrency.lockutils [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.345 187212 DEBUG oslo_concurrency.lockutils [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:55 compute-0 podman[219314]: 2025-12-05 12:02:55.345689656 +0000 UTC m=+0.048206798 container died 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.345 187212 DEBUG oslo_concurrency.lockutils [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.346 187212 DEBUG nova.compute.manager [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] No waiting events found dispatching network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.346 187212 WARNING nova.compute.manager [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received unexpected event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 for instance with vm_state active and task_state deleting.
Dec 05 12:02:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f-userdata-shm.mount: Deactivated successfully.
Dec 05 12:02:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-3b94271b6f3b062c4a7ae06b3c93c6c47e697dfeccf693d310815cf2eb398a9d-merged.mount: Deactivated successfully.
Dec 05 12:02:55 compute-0 podman[219314]: 2025-12-05 12:02:55.392472582 +0000 UTC m=+0.094989714 container cleanup 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec 05 12:02:55 compute-0 systemd[1]: libpod-conmon-3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f.scope: Deactivated successfully.
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.461 187212 INFO nova.virt.libvirt.driver [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance destroyed successfully.
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.462 187212 DEBUG nova.objects.instance [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lazy-loading 'resources' on Instance uuid fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.477 187212 DEBUG nova.virt.libvirt.vif [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1938885940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1938885940',id=30,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6a2f4fffdace4b2fa0e0b6cdfc1055f5',ramdisk_id='',reservation_id='r-n0y57b39',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner
_project_name='tempest-InstanceActionsV221TestJSON-44253202',owner_user_name='tempest-InstanceActionsV221TestJSON-44253202-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:52Z,user_data=None,user_id='e4a0640c63a14775b62a4d40c4860519',uuid=fe8aefc3-96cb-4d4e-a684-1453a7df2fa1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.477 187212 DEBUG nova.network.os_vif_util [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converting VIF {"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.479 187212 DEBUG nova.network.os_vif_util [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.479 187212 DEBUG os_vif [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.482 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.482 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd067fc33-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.485 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.488 187212 INFO os_vif [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba')
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.489 187212 INFO nova.virt.libvirt.driver [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Deleting instance files /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1_del
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.489 187212 INFO nova.virt.libvirt.driver [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Deletion of /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1_del complete
Dec 05 12:02:55 compute-0 podman[219352]: 2025-12-05 12:02:55.501264009 +0000 UTC m=+0.085915345 container remove 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.509 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[19513d5b-3614-4192-8da7-8e0718f680d7]: (4, ('Fri Dec  5 12:02:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 (3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f)\n3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f\nFri Dec  5 12:02:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 (3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f)\n3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.514 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ccc435bc-7e02-4831-80a2-b193c78bf3ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.515 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7364e4f7-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.517 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:55 compute-0 kernel: tap7364e4f7-50: left promiscuous mode
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.521 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d84e912a-e46b-4934-aa97-5ff975cd9c8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.544 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3f72f5-a5bf-4e15-be97-e3fbdd6433c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.545 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6deec074-4923-46ba-b989-e70ba7f8ed68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.551 187212 INFO nova.compute.manager [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.552 187212 DEBUG oslo.service.loopingcall [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.553 187212 DEBUG nova.compute.manager [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:02:55 compute-0 nova_compute[187208]: 2025-12-05 12:02:55.553 187212 DEBUG nova.network.neutron [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.571 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5733b884-37b3-4df7-82e4-0bff7789446d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354907, 'reachable_time': 17533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219372, 'error': None, 'target': 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.573 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:02:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.574 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[eb3de0c1-2c55-4469-9e9d-230fd3b54725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d7364e4f7\x2d59f6\x2d4a8e\x2d95d7\x2ddd1efe1ab7f7.mount: Deactivated successfully.
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.523 187212 DEBUG nova.network.neutron [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updated VIF entry in instance network info cache for port 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.523 187212 DEBUG nova.network.neutron [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updating instance_info_cache with network_info: [{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.547 187212 DEBUG oslo_concurrency.lockutils [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.565 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:56 compute-0 NetworkManager[55691]: <info>  [1764936176.5694] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Dec 05 12:02:56 compute-0 NetworkManager[55691]: <info>  [1764936176.5706] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.598 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Successfully updated port: b785a426-63ba-453e-95dc-3aa63f9f75a9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.615 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.615 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.615 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:56 compute-0 ovn_controller[95610]: 2025-12-05T12:02:56Z|00228|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:02:56 compute-0 ovn_controller[95610]: 2025-12-05T12:02:56Z|00229|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec 05 12:02:56 compute-0 ovn_controller[95610]: 2025-12-05T12:02:56Z|00230|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:56 compute-0 kernel: tap82089bf4-20 (unregistering): left promiscuous mode
Dec 05 12:02:56 compute-0 NetworkManager[55691]: <info>  [1764936176.8273] device (tap82089bf4-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.832 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:56 compute-0 ovn_controller[95610]: 2025-12-05T12:02:56Z|00231|binding|INFO|Releasing lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb from this chassis (sb_readonly=0)
Dec 05 12:02:56 compute-0 ovn_controller[95610]: 2025-12-05T12:02:56Z|00232|binding|INFO|Setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb down in Southbound
Dec 05 12:02:56 compute-0 ovn_controller[95610]: 2025-12-05T12:02:56Z|00233|binding|INFO|Removing iface tap82089bf4-20 ovn-installed in OVS
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.834 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.845 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:25:56 10.100.0.7'], port_security=['fa:16:3e:53:25:56 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9efa988a-19ae-440a-8a56-0bac68cb3c9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=82089bf4-207e-4880-b8ff-9bf09a4ac3fb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.846 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.847 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.849 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:02:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.850 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bdac87a9-0e95-4272-aa40-7b45ffc1b5be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.851 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore
Dec 05 12:02:56 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Dec 05 12:02:56 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000001c.scope: Consumed 12.859s CPU time.
Dec 05 12:02:56 compute-0 systemd-machined[153543]: Machine qemu-32-instance-0000001c terminated.
Dec 05 12:02:56 compute-0 nova_compute[187208]: 2025-12-05 12:02:56.967 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:02:56 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [NOTICE]   (218801) : haproxy version is 2.8.14-c23fe91
Dec 05 12:02:56 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [NOTICE]   (218801) : path to executable is /usr/sbin/haproxy
Dec 05 12:02:56 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [WARNING]  (218801) : Exiting Master process...
Dec 05 12:02:56 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [ALERT]    (218801) : Current worker (218803) exited with code 143 (Terminated)
Dec 05 12:02:56 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [WARNING]  (218801) : All workers exited. Exiting... (0)
Dec 05 12:02:56 compute-0 systemd[1]: libpod-5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5.scope: Deactivated successfully.
Dec 05 12:02:56 compute-0 podman[219395]: 2025-12-05 12:02:56.985037561 +0000 UTC m=+0.052556492 container died 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5-userdata-shm.mount: Deactivated successfully.
Dec 05 12:02:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-df6dbe72f381b99c1a982824974be4cad9c714499ad9f3e0b3b9579fdd54f357-merged.mount: Deactivated successfully.
Dec 05 12:02:57 compute-0 podman[219395]: 2025-12-05 12:02:57.043149521 +0000 UTC m=+0.110668442 container cleanup 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 12:02:57 compute-0 systemd[1]: libpod-conmon-5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5.scope: Deactivated successfully.
Dec 05 12:02:57 compute-0 kernel: tap82089bf4-20: entered promiscuous mode
Dec 05 12:02:57 compute-0 systemd-udevd[219214]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:02:57 compute-0 ovn_controller[95610]: 2025-12-05T12:02:57Z|00234|binding|INFO|Claiming lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb for this chassis.
Dec 05 12:02:57 compute-0 ovn_controller[95610]: 2025-12-05T12:02:57Z|00235|binding|INFO|82089bf4-207e-4880-b8ff-9bf09a4ac3fb: Claiming fa:16:3e:53:25:56 10.100.0.7
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.068 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 NetworkManager[55691]: <info>  [1764936177.0690] manager: (tap82089bf4-20): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Dec 05 12:02:57 compute-0 kernel: tap82089bf4-20 (unregistering): left promiscuous mode
Dec 05 12:02:57 compute-0 ovn_controller[95610]: 2025-12-05T12:02:57Z|00236|binding|INFO|Setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb ovn-installed in OVS
Dec 05 12:02:57 compute-0 ovn_controller[95610]: 2025-12-05T12:02:57Z|00237|if_status|INFO|Not setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb down as sb is readonly
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.093 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 podman[219428]: 2025-12-05 12:02:57.121832179 +0000 UTC m=+0.050334399 container remove 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.128 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e72e3c02-bd24-4cd5-8ddb-701b9b13ba56]: (4, ('Fri Dec  5 12:02:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5)\n5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5\nFri Dec  5 12:02:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5)\n5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.130 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6da58949-93d3-402a-9e02-09819381cb68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.131 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.132 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 kernel: tap41b3b495-c0: left promiscuous mode
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.147 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.151 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3c22d8e7-e5e0-4739-adb9-6827387750e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.167 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ad222c77-6f28-4769-bca3-a140830b9082]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.169 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f44ac454-f43f-4e82-9dfb-022e03f1b6c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.182 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e6e4f4-d7d8-478c-bbe6-6ce850318ff2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353779, 'reachable_time': 38401, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219456, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.184 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.184 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a80faedf-a9b1-4ac9-add0-ca34fa20d15f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec 05 12:02:57 compute-0 ovn_controller[95610]: 2025-12-05T12:02:57Z|00238|binding|INFO|Releasing lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb from this chassis (sb_readonly=0)
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.323 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:25:56 10.100.0.7'], port_security=['fa:16:3e:53:25:56 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9efa988a-19ae-440a-8a56-0bac68cb3c9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=82089bf4-207e-4880-b8ff-9bf09a4ac3fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.324 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.326 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.339 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.339 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:25:56 10.100.0.7'], port_security=['fa:16:3e:53:25:56 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9efa988a-19ae-440a-8a56-0bac68cb3c9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=82089bf4-207e-4880-b8ff-9bf09a4ac3fb) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.339 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[662291bc-93ff-47e2-85bc-66e3b1fd1c39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.341 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.343 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.343 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c4cfd6a1-8840-4e1d-8df4-4f76d7aeb5d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.343 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6039f0b5-d917-4cb1-9478-65be7cc93ea2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.356 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[6deb6055-8941-4fdc-bb60-091e4cd60cf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.368 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0db11547-e025-46ab-b8ef-96982e732441]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.397 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[164d9ff9-ee75-499a-9267-5a38201ad610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.432 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3088a0d2-9057-436a-8a6d-42ff8ee8bcef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 NetworkManager[55691]: <info>  [1764936177.4339] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/101)
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.469 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9f7fff7f-b1d6-419c-9b4f-ba2d50b3c635]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.472 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[aec30d88-f539-4d9c-ad36-51abcc5be7c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 systemd-udevd[219515]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:02:57 compute-0 NetworkManager[55691]: <info>  [1764936177.4951] device (tap41b3b495-c0): carrier: link connected
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.503 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1abeb798-bd34-43c1-a2c5-27693604031f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.505 187212 INFO nova.virt.libvirt.driver [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance shutdown successfully after 13 seconds.
Dec 05 12:02:57 compute-0 podman[219459]: 2025-12-05 12:02:57.509847812 +0000 UTC m=+0.131809296 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.510 187212 INFO nova.virt.libvirt.driver [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance destroyed successfully.
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.510 187212 DEBUG nova.objects.instance [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:57 compute-0 podman[219461]: 2025-12-05 12:02:57.517268504 +0000 UTC m=+0.135237444 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.518 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[61711658-da08-4de3-a0bd-41c68bcaff7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355805, 'reachable_time': 21360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219535, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.527 187212 DEBUG nova.compute.manager [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.533 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b5e546-53fa-4051-be52-c9e09cbdaa92]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 355805, 'tstamp': 355805}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219536, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.534 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.549 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[123090fc-2700-46b1-96ec-03fb22432a3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355805, 'reachable_time': 21360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219537, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.571 187212 DEBUG oslo_concurrency.lockutils [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.573 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4fcccf-64e4-4057-9a89-dde768a2a304]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.627 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eb664645-d0b5-49fe-8679-32f6b2697995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.629 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.629 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.629 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.631 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 NetworkManager[55691]: <info>  [1764936177.6323] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Dec 05 12:02:57 compute-0 kernel: tap41b3b495-c0: entered promiscuous mode
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.637 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.637 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 ovn_controller[95610]: 2025-12-05T12:02:57Z|00239|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.655 187212 DEBUG nova.compute.manager [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-changed-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.655 187212 DEBUG nova.compute.manager [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Refreshing instance network info cache due to event network-changed-b785a426-63ba-453e-95dc-3aa63f9f75a9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.655 187212 DEBUG oslo_concurrency.lockutils [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.656 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.657 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.657 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c7b386-6f02-44f7-a5d9-a85daafd08de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.658 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:02:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.659 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.967 187212 DEBUG nova.network.neutron [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:57 compute-0 nova_compute[187208]: 2025-12-05 12:02:57.989 187212 INFO nova.compute.manager [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Took 2.44 seconds to deallocate network for instance.
Dec 05 12:02:58 compute-0 podman[219572]: 2025-12-05 12:02:58.000245399 +0000 UTC m=+0.054998062 container create 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:02:58 compute-0 systemd[1]: Started libpod-conmon-8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24.scope.
Dec 05 12:02:58 compute-0 nova_compute[187208]: 2025-12-05 12:02:58.056 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:58 compute-0 nova_compute[187208]: 2025-12-05 12:02:58.057 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:58 compute-0 podman[219572]: 2025-12-05 12:02:57.971505858 +0000 UTC m=+0.026258531 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:02:58 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:02:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e4710660d98a71ae1e5a3da48dbc488f07c954d543d7bb420eaeff7c7de543/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:02:58 compute-0 podman[219572]: 2025-12-05 12:02:58.098704751 +0000 UTC m=+0.153457414 container init 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:02:58 compute-0 podman[219572]: 2025-12-05 12:02:58.104910339 +0000 UTC m=+0.159662982 container start 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:02:58 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [NOTICE]   (219591) : New worker (219593) forked
Dec 05 12:02:58 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [NOTICE]   (219591) : Loading success.
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.170 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.173 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.174 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c04d690b-14a1-4dc6-a604-8ae881e3e4cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.176 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore
Dec 05 12:02:58 compute-0 nova_compute[187208]: 2025-12-05 12:02:58.215 187212 DEBUG nova.compute.provider_tree [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:02:58 compute-0 nova_compute[187208]: 2025-12-05 12:02:58.230 187212 DEBUG nova.scheduler.client.report [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:02:58 compute-0 nova_compute[187208]: 2025-12-05 12:02:58.258 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:58 compute-0 nova_compute[187208]: 2025-12-05 12:02:58.283 187212 INFO nova.scheduler.client.report [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Deleted allocations for instance fe8aefc3-96cb-4d4e-a684-1453a7df2fa1
Dec 05 12:02:58 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [NOTICE]   (219591) : haproxy version is 2.8.14-c23fe91
Dec 05 12:02:58 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [NOTICE]   (219591) : path to executable is /usr/sbin/haproxy
Dec 05 12:02:58 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [WARNING]  (219591) : Exiting Master process...
Dec 05 12:02:58 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [WARNING]  (219591) : Exiting Master process...
Dec 05 12:02:58 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [ALERT]    (219591) : Current worker (219593) exited with code 143 (Terminated)
Dec 05 12:02:58 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [WARNING]  (219591) : All workers exited. Exiting... (0)
Dec 05 12:02:58 compute-0 systemd[1]: libpod-8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24.scope: Deactivated successfully.
Dec 05 12:02:58 compute-0 podman[219619]: 2025-12-05 12:02:58.316971306 +0000 UTC m=+0.049465094 container died 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:02:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24-userdata-shm.mount: Deactivated successfully.
Dec 05 12:02:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-47e4710660d98a71ae1e5a3da48dbc488f07c954d543d7bb420eaeff7c7de543-merged.mount: Deactivated successfully.
Dec 05 12:02:58 compute-0 podman[219619]: 2025-12-05 12:02:58.356548665 +0000 UTC m=+0.089042463 container cleanup 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:02:58 compute-0 nova_compute[187208]: 2025-12-05 12:02:58.363 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:58 compute-0 systemd[1]: libpod-conmon-8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24.scope: Deactivated successfully.
Dec 05 12:02:58 compute-0 podman[219649]: 2025-12-05 12:02:58.414446769 +0000 UTC m=+0.037838132 container remove 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.419 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[380d1215-a7f4-433a-a2a2-79359fd95416]: (4, ('Fri Dec  5 12:02:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24)\n8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24\nFri Dec  5 12:02:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24)\n8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.422 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed96107c-2bbb-4659-8baa-678042bdc084]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.423 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:58 compute-0 nova_compute[187208]: 2025-12-05 12:02:58.425 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:58 compute-0 kernel: tap41b3b495-c0: left promiscuous mode
Dec 05 12:02:58 compute-0 nova_compute[187208]: 2025-12-05 12:02:58.441 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.445 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72758f80-0660-4d67-b3cc-ea6d12dfaf02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.463 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5709e252-b7e7-4a26-8591-7b76f2b04f48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.464 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[739d74f7-6031-45e8-a624-88e6067c0dd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.477 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9e86514e-37ba-417d-859e-3f0321402d6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355795, 'reachable_time': 39271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219667, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.480 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:02:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.480 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[afc492e0-00d7-4029-bfd9-34a73c2f1a9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:02:58 compute-0 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.016 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Updating instance_info_cache with network_info: [{"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.046 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.047 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance network_info: |[{"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.047 187212 DEBUG oslo_concurrency.lockutils [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.048 187212 DEBUG nova.network.neutron [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Refreshing network info cache for port b785a426-63ba-453e-95dc-3aa63f9f75a9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.051 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start _get_guest_xml network_info=[{"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.056 187212 WARNING nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.072 187212 DEBUG nova.virt.libvirt.host [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.073 187212 DEBUG nova.virt.libvirt.host [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.082 187212 DEBUG nova.virt.libvirt.host [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.083 187212 DEBUG nova.virt.libvirt.host [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.084 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.084 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.084 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.086 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.086 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.086 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.090 187212 DEBUG nova.virt.libvirt.vif [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1246678959',display_name='tempest-DeleteServersTestJSON-server-1246678959',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1246678959',id=32,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-1e9h9ypl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:51Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=2f42f732-65c6-4c4a-9332-47098d7350b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.091 187212 DEBUG nova.network.os_vif_util [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.091 187212 DEBUG nova.network.os_vif_util [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.093 187212 DEBUG nova.objects.instance [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f42f732-65c6-4c4a-9332-47098d7350b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.109 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <uuid>2f42f732-65c6-4c4a-9332-47098d7350b9</uuid>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <name>instance-00000020</name>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <nova:name>tempest-DeleteServersTestJSON-server-1246678959</nova:name>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:02:59</nova:creationTime>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:02:59 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:02:59 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:02:59 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:02:59 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:02:59 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:02:59 compute-0 nova_compute[187208]:         <nova:user uuid="ff425b7b04144f93a2c15e3a347fc15c">tempest-DeleteServersTestJSON-554028480-project-member</nova:user>
Dec 05 12:02:59 compute-0 nova_compute[187208]:         <nova:project uuid="4671f6c82ea049fab3a314ecf45b7656">tempest-DeleteServersTestJSON-554028480</nova:project>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:02:59 compute-0 nova_compute[187208]:         <nova:port uuid="b785a426-63ba-453e-95dc-3aa63f9f75a9">
Dec 05 12:02:59 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <system>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <entry name="serial">2f42f732-65c6-4c4a-9332-47098d7350b9</entry>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <entry name="uuid">2f42f732-65c6-4c4a-9332-47098d7350b9</entry>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     </system>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <os>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   </os>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <features>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   </features>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.config"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:4e:26:bd"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <target dev="tapb785a426-63"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/console.log" append="off"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <video>
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     </video>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:02:59 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:02:59 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:02:59 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:02:59 compute-0 nova_compute[187208]: </domain>
Dec 05 12:02:59 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.109 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Preparing to wait for external event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.109 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.110 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.110 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.111 187212 DEBUG nova.virt.libvirt.vif [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1246678959',display_name='tempest-DeleteServersTestJSON-server-1246678959',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1246678959',id=32,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-1e9h9ypl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:51Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=2f42f732-65c6-4c4a-9332-47098d7350b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.111 187212 DEBUG nova.network.os_vif_util [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.111 187212 DEBUG nova.network.os_vif_util [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.112 187212 DEBUG os_vif [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.112 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.113 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.113 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.116 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.116 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb785a426-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.117 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb785a426-63, col_values=(('external_ids', {'iface-id': 'b785a426-63ba-453e-95dc-3aa63f9f75a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:26:bd', 'vm-uuid': '2f42f732-65c6-4c4a-9332-47098d7350b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:59 compute-0 NetworkManager[55691]: <info>  [1764936179.1197] manager: (tapb785a426-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.129 187212 INFO os_vif [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63')
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.187 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.188 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.188 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No VIF found with MAC fa:16:3e:4e:26:bd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.189 187212 INFO nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Using config drive
Dec 05 12:02:59 compute-0 nova_compute[187208]: 2025-12-05 12:02:59.196 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.161 187212 INFO nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Creating config drive at /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.config
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.166 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmskh5vso execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.290 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmskh5vso" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:00 compute-0 NetworkManager[55691]: <info>  [1764936180.3430] manager: (tapb785a426-63): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Dec 05 12:03:00 compute-0 systemd-udevd[219533]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:00 compute-0 kernel: tapb785a426-63: entered promiscuous mode
Dec 05 12:03:00 compute-0 ovn_controller[95610]: 2025-12-05T12:03:00Z|00240|binding|INFO|Claiming lport b785a426-63ba-453e-95dc-3aa63f9f75a9 for this chassis.
Dec 05 12:03:00 compute-0 ovn_controller[95610]: 2025-12-05T12:03:00Z|00241|binding|INFO|b785a426-63ba-453e-95dc-3aa63f9f75a9: Claiming fa:16:3e:4e:26:bd 10.100.0.9
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.347 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:00 compute-0 NetworkManager[55691]: <info>  [1764936180.3626] device (tapb785a426-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:03:00 compute-0 NetworkManager[55691]: <info>  [1764936180.3640] device (tapb785a426-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:03:00 compute-0 ovn_controller[95610]: 2025-12-05T12:03:00Z|00242|binding|INFO|Setting lport b785a426-63ba-453e-95dc-3aa63f9f75a9 ovn-installed in OVS
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.375 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.381 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:00 compute-0 ovn_controller[95610]: 2025-12-05T12:03:00Z|00243|binding|INFO|Setting lport b785a426-63ba-453e-95dc-3aa63f9f75a9 up in Southbound
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.419 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:26:bd 10.100.0.9'], port_security=['fa:16:3e:4e:26:bd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2f42f732-65c6-4c4a-9332-47098d7350b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b785a426-63ba-453e-95dc-3aa63f9f75a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.420 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b785a426-63ba-453e-95dc-3aa63f9f75a9 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 bound to our chassis
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.423 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.433 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9b87fb-8f17-496f-834f-cb3214b3ddd6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.434 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7360f84-b1 in ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.436 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7360f84-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.436 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b719c047-b8e1-4b60-8ee5-07f2457a1483]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.437 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e5fdbe6a-1291-4a30-9184-17bc18e50145]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.447 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[39ac8c3e-c7e4-4880-8062-9b3e6d1ec5a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.460 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f596888-25e9-446e-9d56-79368c4d300a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 systemd[1]: Started Virtual Machine qemu-36-instance-00000020.
Dec 05 12:03:00 compute-0 systemd-machined[153543]: New machine qemu-36-instance-00000020.
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.498 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[53f077f3-abda-4c06-abca-1c6c76375e69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 NetworkManager[55691]: <info>  [1764936180.5051] manager: (tapd7360f84-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/105)
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.503 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[027a8405-71dd-4769-9926-f1f96c947a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 systemd-udevd[219744]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.551 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[50625570-e2ae-4889-ae6b-bd23c5cef478]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.555 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[64413629-bf9e-46b2-b8e5-46fa7d4e7058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 NetworkManager[55691]: <info>  [1764936180.5782] device (tapd7360f84-b0): carrier: link connected
Dec 05 12:03:00 compute-0 podman[219718]: 2025-12-05 12:03:00.581697263 +0000 UTC m=+0.104719852 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.584 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3828a0a1-b3c6-4ded-ae08-b07c7de5aa70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.602 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[898d3411-2677-440b-83b8-fd1b44db997c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356114, 'reachable_time': 43986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219770, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.616 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[999acab2-f81a-46f1-885e-3b21dd5f3594]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:2b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 356114, 'tstamp': 356114}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219771, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.632 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[833385d3-be34-4b3c-b869-103a13012601]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356114, 'reachable_time': 43986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219772, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.664 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6959e2-fb9d-4703-9ae9-80d0fa3f6597]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.716 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d299b329-87f3-460a-9e16-8638d73fde87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.717 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.717 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.717 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7360f84-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.719 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:00 compute-0 NetworkManager[55691]: <info>  [1764936180.7201] manager: (tapd7360f84-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Dec 05 12:03:00 compute-0 kernel: tapd7360f84-b0: entered promiscuous mode
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.727 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7360f84-b0, col_values=(('external_ids', {'iface-id': 'd85bc323-c3ce-47e3-ac1f-d5f27467a4e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.731 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:00 compute-0 ovn_controller[95610]: 2025-12-05T12:03:00Z|00244|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.745 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.746 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.747 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6afb1ff5-bb7f-4213-8e63-1a1ccd7c4eee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.749 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:03:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.749 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'env', 'PROCESS_TAG=haproxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.779 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936180.7791884, 2f42f732-65c6-4c4a-9332-47098d7350b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.781 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] VM Started (Lifecycle Event)
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.805 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.810 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936180.7792728, 2f42f732-65c6-4c4a-9332-47098d7350b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.811 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] VM Paused (Lifecycle Event)
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.827 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.830 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:00 compute-0 nova_compute[187208]: 2025-12-05 12:03:00.845 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:01 compute-0 podman[219816]: 2025-12-05 12:03:01.1049533 +0000 UTC m=+0.051759000 container create ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:03:01 compute-0 systemd[1]: Started libpod-conmon-ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3.scope.
Dec 05 12:03:01 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:03:01 compute-0 podman[219816]: 2025-12-05 12:03:01.076766544 +0000 UTC m=+0.023572044 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:03:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85ca59d6a34c179f3b81febc919aaa04cecc14ab33813fa899326030a0893116/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:03:01 compute-0 ovn_controller[95610]: 2025-12-05T12:03:01Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2e:33:fe 10.100.0.9
Dec 05 12:03:01 compute-0 ovn_controller[95610]: 2025-12-05T12:03:01Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:33:fe 10.100.0.9
Dec 05 12:03:01 compute-0 podman[219816]: 2025-12-05 12:03:01.371550895 +0000 UTC m=+0.318356415 container init ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:03:01 compute-0 podman[219816]: 2025-12-05 12:03:01.377858235 +0000 UTC m=+0.324663725 container start ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.410 187212 DEBUG nova.compute.manager [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-changed-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.410 187212 DEBUG nova.compute.manager [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Refreshing instance network info cache due to event network-changed-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.411 187212 DEBUG oslo_concurrency.lockutils [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.411 187212 DEBUG oslo_concurrency.lockutils [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.411 187212 DEBUG nova.network.neutron [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Refreshing network info cache for port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:03:01 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [NOTICE]   (219834) : New worker (219836) forked
Dec 05 12:03:01 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [NOTICE]   (219834) : Loading success.
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.807 187212 DEBUG nova.compute.manager [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.830 187212 DEBUG nova.compute.manager [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.830 187212 DEBUG oslo_concurrency.lockutils [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.830 187212 DEBUG oslo_concurrency.lockutils [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.831 187212 DEBUG oslo_concurrency.lockutils [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.831 187212 DEBUG nova.compute.manager [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Processing event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.831 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.834 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936181.8345957, 2f42f732-65c6-4c4a-9332-47098d7350b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.835 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] VM Resumed (Lifecycle Event)
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.836 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.839 187212 INFO nova.virt.libvirt.driver [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance spawned successfully.
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.839 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.879 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.881 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.882 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.882 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.882 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.883 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.883 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.887 187212 INFO nova.compute.manager [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] instance snapshotting
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.887 187212 WARNING nova.compute.manager [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] trying to snapshot a non-running instance: (state: 4 expected: 1)
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.889 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.929 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.981 187212 INFO nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Took 10.23 seconds to spawn the instance on the hypervisor.
Dec 05 12:03:01 compute-0 nova_compute[187208]: 2025-12-05 12:03:01.981 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.073 187212 INFO nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Took 10.82 seconds to build instance.
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.088 187212 DEBUG nova.network.neutron [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Updated VIF entry in instance network info cache for port b785a426-63ba-453e-95dc-3aa63f9f75a9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.088 187212 DEBUG nova.network.neutron [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Updating instance_info_cache with network_info: [{"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.092 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.986s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.115 187212 DEBUG oslo_concurrency.lockutils [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.207 187212 INFO nova.virt.libvirt.driver [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Beginning cold snapshot process
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.434 187212 DEBUG nova.privsep.utils [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.435 187212 DEBUG oslo_concurrency.processutils [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk /var/lib/nova/instances/snapshots/tmpv7bua9ay/621f542e4f4a4ada9eb46bd90c685ad0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.536 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.727 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.834 187212 DEBUG oslo_concurrency.processutils [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk /var/lib/nova/instances/snapshots/tmpv7bua9ay/621f542e4f4a4ada9eb46bd90c685ad0" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:02 compute-0 nova_compute[187208]: 2025-12-05 12:03:02.835 187212 INFO nova.virt.libvirt.driver [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Snapshot extracted, beginning image upload
Dec 05 12:03:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:03.010 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:03.011 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:03.012 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:03 compute-0 nova_compute[187208]: 2025-12-05 12:03:03.714 187212 DEBUG nova.network.neutron [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updated VIF entry in instance network info cache for port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:03:03 compute-0 nova_compute[187208]: 2025-12-05 12:03:03.715 187212 DEBUG nova.network.neutron [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updating instance_info_cache with network_info: [{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:03 compute-0 nova_compute[187208]: 2025-12-05 12:03:03.741 187212 DEBUG oslo_concurrency.lockutils [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:03 compute-0 nova_compute[187208]: 2025-12-05 12:03:03.742 187212 DEBUG nova.compute.manager [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received event network-vif-deleted-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.251 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.252 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.252 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Processing event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] No waiting events found dispatching network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 WARNING nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received unexpected event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc for instance with vm_state building and task_state spawning.
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-unplugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] No waiting events found dispatching network-vif-unplugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 WARNING nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received unexpected event network-vif-unplugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb for instance with vm_state stopped and task_state image_uploading.
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.256 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.256 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.256 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] No waiting events found dispatching network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.256 187212 WARNING nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received unexpected event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb for instance with vm_state stopped and task_state image_uploading.
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.258 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.263 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.264 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936184.2641354, d70544d6-04e3-4b2a-914a-72db3052216a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.264 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] VM Resumed (Lifecycle Event)
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.271 187212 INFO nova.virt.libvirt.driver [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance spawned successfully.
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.271 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.290 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.293 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.304 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.305 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.306 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.306 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.306 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.307 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.321 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.386 187212 INFO nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 15.86 seconds to spawn the instance on the hypervisor.
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.387 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.503 187212 INFO nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 16.81 seconds to build instance.
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.563 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.564 187212 DEBUG nova.compute.manager [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.565 187212 DEBUG oslo_concurrency.lockutils [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.565 187212 DEBUG oslo_concurrency.lockutils [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.565 187212 DEBUG oslo_concurrency.lockutils [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.565 187212 DEBUG nova.compute.manager [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] No waiting events found dispatching network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.566 187212 WARNING nova.compute.manager [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received unexpected event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 for instance with vm_state active and task_state None.
Dec 05 12:03:04 compute-0 nova_compute[187208]: 2025-12-05 12:03:04.610 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.043 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.043 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.053 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.054 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.054 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.054 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.055 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.056 187212 INFO nova.compute.manager [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Terminating instance
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.057 187212 DEBUG nova.compute.manager [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.066 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:03:05 compute-0 kernel: tapb785a426-63 (unregistering): left promiscuous mode
Dec 05 12:03:05 compute-0 NetworkManager[55691]: <info>  [1764936185.0834] device (tapb785a426-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:03:05 compute-0 ovn_controller[95610]: 2025-12-05T12:03:05Z|00245|binding|INFO|Releasing lport b785a426-63ba-453e-95dc-3aa63f9f75a9 from this chassis (sb_readonly=0)
Dec 05 12:03:05 compute-0 ovn_controller[95610]: 2025-12-05T12:03:05Z|00246|binding|INFO|Setting lport b785a426-63ba-453e-95dc-3aa63f9f75a9 down in Southbound
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.091 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:05 compute-0 ovn_controller[95610]: 2025-12-05T12:03:05Z|00247|binding|INFO|Removing iface tapb785a426-63 ovn-installed in OVS
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.102 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:26:bd 10.100.0.9'], port_security=['fa:16:3e:4e:26:bd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2f42f732-65c6-4c4a-9332-47098d7350b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b785a426-63ba-453e-95dc-3aa63f9f75a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.103 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b785a426-63ba-453e-95dc-3aa63f9f75a9 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 unbound from our chassis
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.105 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[29c48909-5bcb-499b-8ba4-f41f925353ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.107 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace which is not needed anymore
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:05 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000020.scope: Deactivated successfully.
Dec 05 12:03:05 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000020.scope: Consumed 3.541s CPU time.
Dec 05 12:03:05 compute-0 systemd-machined[153543]: Machine qemu-36-instance-00000020 terminated.
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.167 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.168 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.177 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.178 187212 INFO nova.compute.claims [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:03:05 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [NOTICE]   (219834) : haproxy version is 2.8.14-c23fe91
Dec 05 12:03:05 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [NOTICE]   (219834) : path to executable is /usr/sbin/haproxy
Dec 05 12:03:05 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [WARNING]  (219834) : Exiting Master process...
Dec 05 12:03:05 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [ALERT]    (219834) : Current worker (219836) exited with code 143 (Terminated)
Dec 05 12:03:05 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [WARNING]  (219834) : All workers exited. Exiting... (0)
Dec 05 12:03:05 compute-0 systemd[1]: libpod-ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3.scope: Deactivated successfully.
Dec 05 12:03:05 compute-0 podman[219880]: 2025-12-05 12:03:05.247432842 +0000 UTC m=+0.050471983 container died ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:03:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3-userdata-shm.mount: Deactivated successfully.
Dec 05 12:03:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-85ca59d6a34c179f3b81febc919aaa04cecc14ab33813fa899326030a0893116-merged.mount: Deactivated successfully.
Dec 05 12:03:05 compute-0 podman[219880]: 2025-12-05 12:03:05.29184194 +0000 UTC m=+0.094881061 container cleanup ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:03:05 compute-0 systemd[1]: libpod-conmon-ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3.scope: Deactivated successfully.
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.333 187212 INFO nova.virt.libvirt.driver [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance destroyed successfully.
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.335 187212 DEBUG nova.objects.instance [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'resources' on Instance uuid 2f42f732-65c6-4c4a-9332-47098d7350b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.350 187212 DEBUG nova.virt.libvirt.vif [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1246678959',display_name='tempest-DeleteServersTestJSON-server-1246678959',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1246678959',id=32,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-1e9h9ypl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:02Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=2f42f732-65c6-4c4a-9332-47098d7350b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.351 187212 DEBUG nova.network.os_vif_util [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.351 187212 DEBUG nova.network.os_vif_util [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.352 187212 DEBUG os_vif [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.353 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.354 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb785a426-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.364 187212 INFO nova.virt.libvirt.driver [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Snapshot image upload complete
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.365 187212 INFO nova.compute.manager [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 3.48 seconds to snapshot the instance on the hypervisor.
Dec 05 12:03:05 compute-0 podman[219921]: 2025-12-05 12:03:05.36607631 +0000 UTC m=+0.051016048 container remove ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.371 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.372 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[811f5e61-d239-4e98-801e-529142779747]: (4, ('Fri Dec  5 12:03:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3)\nba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3\nFri Dec  5 12:03:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3)\nba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.373 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.374 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a29da469-e358-4869-922d-5661f131d5bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.375 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.375 187212 INFO os_vif [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63')
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.376 187212 INFO nova.virt.libvirt.driver [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Deleting instance files /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9_del
Dec 05 12:03:05 compute-0 kernel: tapd7360f84-b0: left promiscuous mode
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.376 187212 INFO nova.virt.libvirt.driver [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Deletion of /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9_del complete
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.378 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.393 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.396 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[23132262-e872-4766-ac60-9c2af0ff5d4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.397 187212 DEBUG nova.compute.provider_tree [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.421 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c0c190-8495-4f1a-b31c-6e75c73fc957]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.422 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c34ee1-fa11-44fe-b865-717b4c250121]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.431 187212 DEBUG nova.scheduler.client.report [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.436 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[476b3b1c-34d4-4807-bef0-bdad5cd61db2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356105, 'reachable_time': 17391, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219946, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:05 compute-0 systemd[1]: run-netns-ovnmeta\x2dd7360f84\x2dbcd5\x2d4e64\x2dbf43\x2d1fdbd8215a70.mount: Deactivated successfully.
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.442 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:03:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.442 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[860e3b71-f644-4c2c-a6c1-0b60d8faadf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.442 187212 INFO nova.compute.manager [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Took 0.39 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.443 187212 DEBUG oslo.service.loopingcall [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.443 187212 DEBUG nova.compute.manager [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.443 187212 DEBUG nova.network.neutron [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.456 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.457 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.497 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.497 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.514 187212 INFO nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.529 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.627 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.629 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.629 187212 INFO nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Creating image(s)
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.629 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.630 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.631 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.644 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.704 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.705 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.706 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.718 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.779 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.780 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.818 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.819 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.820 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.896 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.900 187212 DEBUG nova.virt.disk.api [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Checking if we can resize image /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.901 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.927 187212 DEBUG nova.policy [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.963 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.964 187212 DEBUG nova.virt.disk.api [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Cannot resize image /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.964 187212 DEBUG nova.objects.instance [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'migration_context' on Instance uuid d2085dd9-2ebd-4804-99c1-3b15cbd216f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.982 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.983 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Ensure instance console log exists: /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.983 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.984 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:05 compute-0 nova_compute[187208]: 2025-12-05 12:03:05.984 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.539 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.587 187212 DEBUG nova.network.neutron [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.615 187212 INFO nova.compute.manager [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Took 2.17 seconds to deallocate network for instance.
Dec 05 12:03:07 compute-0 ovn_controller[95610]: 2025-12-05T12:03:07Z|00248|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec 05 12:03:07 compute-0 ovn_controller[95610]: 2025-12-05T12:03:07Z|00249|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.659 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.661 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:07 compute-0 ovn_controller[95610]: 2025-12-05T12:03:07Z|00250|binding|INFO|Removing iface tap82089bf4-20 ovn-installed in OVS
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.729 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.832 187212 DEBUG nova.compute.provider_tree [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.853 187212 DEBUG nova.scheduler.client.report [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.882 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.908 187212 INFO nova.scheduler.client.report [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Deleted allocations for instance 2f42f732-65c6-4c4a-9332-47098d7350b9
Dec 05 12:03:07 compute-0 nova_compute[187208]: 2025-12-05 12:03:07.991 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:08 compute-0 nova_compute[187208]: 2025-12-05 12:03:08.085 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Successfully created port: 0a11e563-2be9-4ce9-af51-7d29b586e233 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:03:08 compute-0 podman[219962]: 2025-12-05 12:03:08.20646988 +0000 UTC m=+0.058903503 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:03:08 compute-0 nova_compute[187208]: 2025-12-05 12:03:08.357 187212 DEBUG nova.compute.manager [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-unplugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:08 compute-0 nova_compute[187208]: 2025-12-05 12:03:08.359 187212 DEBUG oslo_concurrency.lockutils [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:08 compute-0 nova_compute[187208]: 2025-12-05 12:03:08.359 187212 DEBUG oslo_concurrency.lockutils [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:08 compute-0 nova_compute[187208]: 2025-12-05 12:03:08.359 187212 DEBUG oslo_concurrency.lockutils [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:08 compute-0 nova_compute[187208]: 2025-12-05 12:03:08.360 187212 DEBUG nova.compute.manager [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] No waiting events found dispatching network-vif-unplugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:08 compute-0 nova_compute[187208]: 2025-12-05 12:03:08.360 187212 WARNING nova.compute.manager [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received unexpected event network-vif-unplugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 for instance with vm_state deleted and task_state None.
Dec 05 12:03:08 compute-0 nova_compute[187208]: 2025-12-05 12:03:08.692 187212 DEBUG nova.compute.manager [req-1a510430-9571-4c81-b112-eb6985dd49cd req-00a56ab8-e4ee-488f-b129-5b7634a09313 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-deleted-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.272 187212 DEBUG nova.compute.manager [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.322 187212 INFO nova.compute.manager [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] instance snapshotting
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.583 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Beginning live snapshot process
Dec 05 12:03:09 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.739 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.808 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.809 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.878 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.891 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.953 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.955 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.993 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868.delta 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:09 compute-0 nova_compute[187208]: 2025-12-05 12:03:09.995 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.057 187212 DEBUG nova.virt.libvirt.guest [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.062 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.274 187212 DEBUG nova.privsep.utils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.276 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868.delta /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.432 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936175.431474, fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.434 187212 INFO nova.compute.manager [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] VM Stopped (Lifecycle Event)
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.455 187212 DEBUG nova.compute.manager [None req-787eb86c-dcfc-42f5-aa36-29c5c3821099 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.604 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868.delta /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868" returned: 0 in 0.328s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.606 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Snapshot extracted, beginning image upload
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.655 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Successfully updated port: 0a11e563-2be9-4ce9-af51-7d29b586e233 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.679 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.680 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquired lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:10 compute-0 nova_compute[187208]: 2025-12-05 12:03:10.680 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:03:11 compute-0 nova_compute[187208]: 2025-12-05 12:03:11.019 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.062 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.118 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936177.117115, 9efa988a-19ae-440a-8a56-0bac68cb3c9e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.119 187212 INFO nova.compute.manager [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] VM Stopped (Lifecycle Event)
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.147 187212 DEBUG nova.compute.manager [None req-2707a026-2eec-4940-9471-39323d98146d - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.152 187212 DEBUG nova.compute.manager [None req-2707a026-2eec-4940-9471-39323d98146d - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.515 187212 DEBUG nova.compute.manager [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.516 187212 DEBUG oslo_concurrency.lockutils [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.516 187212 DEBUG oslo_concurrency.lockutils [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.516 187212 DEBUG oslo_concurrency.lockutils [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.516 187212 DEBUG nova.compute.manager [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] No waiting events found dispatching network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.517 187212 WARNING nova.compute.manager [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received unexpected event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 for instance with vm_state deleted and task_state None.
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.541 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.574 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Updating instance_info_cache with network_info: [{"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.605 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Releasing lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.605 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance network_info: |[{"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.608 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start _get_guest_xml network_info=[{"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.611 187212 WARNING nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.619 187212 DEBUG nova.virt.libvirt.host [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.620 187212 DEBUG nova.virt.libvirt.host [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.623 187212 DEBUG nova.virt.libvirt.host [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.623 187212 DEBUG nova.virt.libvirt.host [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.624 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.624 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.625 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.625 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.626 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.626 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.626 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.627 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.627 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.627 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.628 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.628 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.632 187212 DEBUG nova.virt.libvirt.vif [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-835443144',display_name='tempest-ImagesOneServerNegativeTestJSON-server-835443144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-835443144',id=33,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-ijey2289',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_
name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:05Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=d2085dd9-2ebd-4804-99c1-3b15cbd216f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.632 187212 DEBUG nova.network.os_vif_util [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.634 187212 DEBUG nova.network.os_vif_util [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.635 187212 DEBUG nova.objects.instance [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'pci_devices' on Instance uuid d2085dd9-2ebd-4804-99c1-3b15cbd216f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.650 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <uuid>d2085dd9-2ebd-4804-99c1-3b15cbd216f8</uuid>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <name>instance-00000021</name>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-835443144</nova:name>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:03:12</nova:creationTime>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:03:12 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:03:12 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:03:12 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:03:12 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:03:12 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:03:12 compute-0 nova_compute[187208]:         <nova:user uuid="3ee170bdfdd343189ee1da01bdb80be6">tempest-ImagesOneServerNegativeTestJSON-661137252-project-member</nova:user>
Dec 05 12:03:12 compute-0 nova_compute[187208]:         <nova:project uuid="79895287bd1d488c842f6013729a1f81">tempest-ImagesOneServerNegativeTestJSON-661137252</nova:project>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:03:12 compute-0 nova_compute[187208]:         <nova:port uuid="0a11e563-2be9-4ce9-af51-7d29b586e233">
Dec 05 12:03:12 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <system>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <entry name="serial">d2085dd9-2ebd-4804-99c1-3b15cbd216f8</entry>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <entry name="uuid">d2085dd9-2ebd-4804-99c1-3b15cbd216f8</entry>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     </system>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <os>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   </os>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <features>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   </features>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.config"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:f2:70:f2"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <target dev="tap0a11e563-2b"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/console.log" append="off"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <video>
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     </video>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:03:12 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:03:12 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:03:12 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:03:12 compute-0 nova_compute[187208]: </domain>
Dec 05 12:03:12 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.651 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Preparing to wait for external event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.651 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.652 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.652 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.652 187212 DEBUG nova.virt.libvirt.vif [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-835443144',display_name='tempest-ImagesOneServerNegativeTestJSON-server-835443144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-835443144',id=33,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-ijey2289',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:05Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=d2085dd9-2ebd-4804-99c1-3b15cbd216f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.653 187212 DEBUG nova.network.os_vif_util [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.653 187212 DEBUG nova.network.os_vif_util [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.654 187212 DEBUG os_vif [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.655 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.655 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.657 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.657 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a11e563-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.658 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0a11e563-2b, col_values=(('external_ids', {'iface-id': '0a11e563-2be9-4ce9-af51-7d29b586e233', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:70:f2', 'vm-uuid': 'd2085dd9-2ebd-4804-99c1-3b15cbd216f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.659 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:12 compute-0 NetworkManager[55691]: <info>  [1764936192.6607] manager: (tap0a11e563-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.661 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.668 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.669 187212 INFO os_vif [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b')
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.717 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.717 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.718 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No VIF found with MAC fa:16:3e:f2:70:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.718 187212 INFO nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Using config drive
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.720 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.720 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.720 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.721 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.721 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.722 187212 INFO nova.compute.manager [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Terminating instance
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.723 187212 DEBUG nova.compute.manager [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.727 187212 INFO nova.virt.libvirt.driver [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance destroyed successfully.
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.728 187212 DEBUG nova.objects.instance [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.746 187212 DEBUG nova.virt.libvirt.vif [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1705515982',display_name='tempest-ImagesTestJSON-server-1705515982',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1705515982',id=28,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-9ph9qh0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:05Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=9efa988a-19ae-440a-8a56-0bac68cb3c9e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.746 187212 DEBUG nova.network.os_vif_util [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.747 187212 DEBUG nova.network.os_vif_util [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.748 187212 DEBUG os_vif [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.749 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.750 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82089bf4-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.751 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.758 187212 INFO os_vif [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20')
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.759 187212 INFO nova.virt.libvirt.driver [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Deleting instance files /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e_del
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.767 187212 INFO nova.virt.libvirt.driver [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Deletion of /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e_del complete
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.811 187212 DEBUG nova.compute.manager [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-changed-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.811 187212 DEBUG nova.compute.manager [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Refreshing instance network info cache due to event network-changed-0a11e563-2be9-4ce9-af51-7d29b586e233. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.811 187212 DEBUG oslo_concurrency.lockutils [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.812 187212 DEBUG oslo_concurrency.lockutils [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.812 187212 DEBUG nova.network.neutron [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Refreshing network info cache for port 0a11e563-2be9-4ce9-af51-7d29b586e233 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.833 187212 INFO nova.compute.manager [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 0.11 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.834 187212 DEBUG oslo.service.loopingcall [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.835 187212 DEBUG nova.compute.manager [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:12 compute-0 nova_compute[187208]: 2025-12-05 12:03:12.835 187212 DEBUG nova.network.neutron [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:13 compute-0 nova_compute[187208]: 2025-12-05 12:03:13.701 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Snapshot image upload complete
Dec 05 12:03:13 compute-0 nova_compute[187208]: 2025-12-05 12:03:13.702 187212 INFO nova.compute.manager [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 4.38 seconds to snapshot the instance on the hypervisor.
Dec 05 12:03:13 compute-0 nova_compute[187208]: 2025-12-05 12:03:13.737 187212 INFO nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Creating config drive at /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.config
Dec 05 12:03:13 compute-0 nova_compute[187208]: 2025-12-05 12:03:13.745 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp42r5frbt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:13 compute-0 nova_compute[187208]: 2025-12-05 12:03:13.873 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp42r5frbt" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:13 compute-0 kernel: tap0a11e563-2b: entered promiscuous mode
Dec 05 12:03:13 compute-0 ovn_controller[95610]: 2025-12-05T12:03:13Z|00251|binding|INFO|Claiming lport 0a11e563-2be9-4ce9-af51-7d29b586e233 for this chassis.
Dec 05 12:03:13 compute-0 ovn_controller[95610]: 2025-12-05T12:03:13Z|00252|binding|INFO|0a11e563-2be9-4ce9-af51-7d29b586e233: Claiming fa:16:3e:f2:70:f2 10.100.0.12
Dec 05 12:03:13 compute-0 NetworkManager[55691]: <info>  [1764936193.9327] manager: (tap0a11e563-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Dec 05 12:03:13 compute-0 nova_compute[187208]: 2025-12-05 12:03:13.932 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.947 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:70:f2 10.100.0.12'], port_security=['fa:16:3e:f2:70:f2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd2085dd9-2ebd-4804-99c1-3b15cbd216f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d064000-316c-46a7-a23c-1dc26318b6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79895287bd1d488c842f6013729a1f81', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e1ec2415-6840-4cf9-b5ac-efaf1a9c9a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3804b014-203a-4c47-b0bb-7634579c4ec4, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0a11e563-2be9-4ce9-af51-7d29b586e233) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.949 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0a11e563-2be9-4ce9-af51-7d29b586e233 in datapath 5d064000-316c-46a7-a23c-1dc26318b6a4 bound to our chassis
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.952 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5d064000-316c-46a7-a23c-1dc26318b6a4
Dec 05 12:03:13 compute-0 ovn_controller[95610]: 2025-12-05T12:03:13Z|00253|binding|INFO|Setting lport 0a11e563-2be9-4ce9-af51-7d29b586e233 ovn-installed in OVS
Dec 05 12:03:13 compute-0 ovn_controller[95610]: 2025-12-05T12:03:13Z|00254|binding|INFO|Setting lport 0a11e563-2be9-4ce9-af51-7d29b586e233 up in Southbound
Dec 05 12:03:13 compute-0 nova_compute[187208]: 2025-12-05 12:03:13.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.962 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[859e41c1-1655-4466-9f76-b3c3e6cc9db6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.963 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5d064000-31 in ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:03:13 compute-0 systemd-udevd[220031]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.965 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5d064000-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.965 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f19c3456-297b-4d2f-b2fa-8918e3f4bd53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.966 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5890975b-423a-4312-a132-1a4b595fd80d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:13 compute-0 NetworkManager[55691]: <info>  [1764936193.9782] device (tap0a11e563-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:03:13 compute-0 systemd-machined[153543]: New machine qemu-37-instance-00000021.
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.977 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[6227a224-997f-4b35-805b-308ac445ce31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:13 compute-0 NetworkManager[55691]: <info>  [1764936193.9796] device (tap0a11e563-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:03:13 compute-0 systemd[1]: Started Virtual Machine qemu-37-instance-00000021.
Dec 05 12:03:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.990 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd6eb71-c305-4c85-84df-242ab645d3bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.011 187212 DEBUG nova.network.neutron [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.030 187212 INFO nova.compute.manager [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 1.19 seconds to deallocate network for instance.
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.031 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e38b47eb-0741-462d-886e-a7f8f82a8eeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.036 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[63e4fcde-19d0-40e1-8fa2-b29f4727d8d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 NetworkManager[55691]: <info>  [1764936194.0375] manager: (tap5d064000-30): new Veth device (/org/freedesktop/NetworkManager/Devices/109)
Dec 05 12:03:14 compute-0 systemd-udevd[220035]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.042 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.043 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[209c5e36-290c-4bf8-bb21-e1f04c3d0ed8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.069 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e643bd-eae0-415c-885e-1c83210dcf3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.073 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:03:14 compute-0 NetworkManager[55691]: <info>  [1764936194.0890] device (tap5d064000-30): carrier: link connected
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.093 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1f0107-b1de-484d-adab-5055f8f21145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.094 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.095 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.108 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aedca57e-910e-40d2-a8c6-dac2c9cb5381]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357465, 'reachable_time': 41440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220064, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.123 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2efd67a-70d1-4a9a-a9ee-74323826e893]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:6d24'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 357465, 'tstamp': 357465}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220065, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.138 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1f461bd5-14ac-414b-b137-8bde8c36aa1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357465, 'reachable_time': 41440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220066, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.143 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.172 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ac3cfe-6204-4804-8e31-d086257e115b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.231 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4529b8-f846-4aa3-ab3b-ee1d6b40f967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.233 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d064000-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.234 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.236 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d064000-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:14 compute-0 NetworkManager[55691]: <info>  [1764936194.2948] manager: (tap5d064000-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Dec 05 12:03:14 compute-0 kernel: tap5d064000-30: entered promiscuous mode
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.296 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.299 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5d064000-30, col_values=(('external_ids', {'iface-id': '1b49f23e-d835-4ef5-82b9-a339d97fd4cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:14 compute-0 ovn_controller[95610]: 2025-12-05T12:03:14Z|00255|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.302 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.302 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.303 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f1779df-9682-4cce-99d8-c0b2fdd150d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.304 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-5d064000-316c-46a7-a23c-1dc26318b6a4
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 5d064000-316c-46a7-a23c-1dc26318b6a4
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:03:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.305 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'env', 'PROCESS_TAG=haproxy-5d064000-316c-46a7-a23c-1dc26318b6a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5d064000-316c-46a7-a23c-1dc26318b6a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.332 187212 DEBUG nova.compute.provider_tree [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.352 187212 DEBUG nova.scheduler.client.report [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.374 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.376 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.382 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.382 187212 INFO nova.compute.claims [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.420 187212 INFO nova.scheduler.client.report [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance 9efa988a-19ae-440a-8a56-0bac68cb3c9e
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.490 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936194.4897525, d2085dd9-2ebd-4804-99c1-3b15cbd216f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.490 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] VM Started (Lifecycle Event)
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.523 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.526 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.536 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936194.4922235, d2085dd9-2ebd-4804-99c1-3b15cbd216f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.537 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] VM Paused (Lifecycle Event)
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.560 187212 DEBUG nova.compute.provider_tree [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.571 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.574 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.587 187212 DEBUG nova.scheduler.client.report [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.613 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.617 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.617 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.675 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.676 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:03:14 compute-0 podman[220103]: 2025-12-05 12:03:14.690842485 +0000 UTC m=+0.049191376 container create 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.694 187212 INFO nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.718 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:03:14 compute-0 systemd[1]: Started libpod-conmon-592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c.scope.
Dec 05 12:03:14 compute-0 podman[220103]: 2025-12-05 12:03:14.664343668 +0000 UTC m=+0.022692589 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:03:14 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:03:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f72b34854b6787c4cd40b26e6f22d36e2f45382e4691a696a9fc490f51c1bb73/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:03:14 compute-0 podman[220103]: 2025-12-05 12:03:14.791802239 +0000 UTC m=+0.150151150 container init 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:03:14 compute-0 podman[220103]: 2025-12-05 12:03:14.79848132 +0000 UTC m=+0.156830211 container start 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.803 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.804 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.805 187212 INFO nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Creating image(s)
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.805 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.805 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.806 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.820 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:14 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [NOTICE]   (220123) : New worker (220125) forked
Dec 05 12:03:14 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [NOTICE]   (220123) : Loading success.
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.895 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.896 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.897 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.910 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.930 187212 DEBUG nova.network.neutron [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Updated VIF entry in instance network info cache for port 0a11e563-2be9-4ce9-af51-7d29b586e233. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.932 187212 DEBUG nova.network.neutron [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Updating instance_info_cache with network_info: [{"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.957 187212 DEBUG nova.policy [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4671f6c82ea049fab3a314ecf45b7656', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.968 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:14 compute-0 nova_compute[187208]: 2025-12-05 12:03:14.969 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.004 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.006 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.007 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.074 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.074 187212 DEBUG nova.virt.disk.api [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Checking if we can resize image /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.075 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.095 187212 DEBUG oslo_concurrency.lockutils [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.141 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.142 187212 DEBUG nova.virt.disk.api [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Cannot resize image /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.142 187212 DEBUG nova.objects.instance [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'migration_context' on Instance uuid 478fa005-452c-4e37-a919-63bb734a3c5c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.156 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.157 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Ensure instance console log exists: /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.157 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.158 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.158 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:15 compute-0 nova_compute[187208]: 2025-12-05 12:03:15.945 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Successfully created port: 7022c257-2ab5-436e-9757-387e9de66b18 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.827 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.829 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.829 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.830 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.830 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.832 187212 INFO nova.compute.manager [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Terminating instance
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.833 187212 DEBUG nova.compute.manager [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.895 187212 DEBUG nova.compute.manager [req-c3ce9f82-67ca-4282-9ba5-1d1f1bc8894b req-dc37c75e-b531-4262-bcbe-ec6037bcfd7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-deleted-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:16 compute-0 kernel: tapb5ee44c8-34 (unregistering): left promiscuous mode
Dec 05 12:03:16 compute-0 NetworkManager[55691]: <info>  [1764936196.9297] device (tapb5ee44c8-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:16 compute-0 ovn_controller[95610]: 2025-12-05T12:03:16Z|00256|binding|INFO|Releasing lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff from this chassis (sb_readonly=0)
Dec 05 12:03:16 compute-0 ovn_controller[95610]: 2025-12-05T12:03:16Z|00257|binding|INFO|Setting lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff down in Southbound
Dec 05 12:03:16 compute-0 ovn_controller[95610]: 2025-12-05T12:03:16Z|00258|binding|INFO|Removing iface tapb5ee44c8-34 ovn-installed in OVS
Dec 05 12:03:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.952 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:33:fe 10.100.0.9'], port_security=['fa:16:3e:2e:33:fe 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '342e6d694cf6482c9f1b7557a17bce60', 'neutron:revision_number': '4', 'neutron:security_group_ids': '710ea28e-d1ba-4c63-a751-16b460b2129b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a85cd729-c72e-4d3c-b444-ff0b42d436ff, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.954 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff in datapath 393d33f9-2dde-4fb5-b5db-3f0fb98d4637 unbound from our chassis
Dec 05 12:03:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.956 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 393d33f9-2dde-4fb5-b5db-3f0fb98d4637, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.957 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e3679b-44e1-4437-a056-42e4369dff79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.958 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 namespace which is not needed anymore
Dec 05 12:03:16 compute-0 nova_compute[187208]: 2025-12-05 12:03:16.959 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:16 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Dec 05 12:03:16 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000001d.scope: Consumed 13.206s CPU time.
Dec 05 12:03:16 compute-0 systemd-machined[153543]: Machine qemu-33-instance-0000001d terminated.
Dec 05 12:03:17 compute-0 podman[220167]: 2025-12-05 12:03:17.026727235 +0000 UTC m=+0.070598557 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.090 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.091 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.092 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:17 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [NOTICE]   (218972) : haproxy version is 2.8.14-c23fe91
Dec 05 12:03:17 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [NOTICE]   (218972) : path to executable is /usr/sbin/haproxy
Dec 05 12:03:17 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [WARNING]  (218972) : Exiting Master process...
Dec 05 12:03:17 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [WARNING]  (218972) : Exiting Master process...
Dec 05 12:03:17 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [ALERT]    (218972) : Current worker (218974) exited with code 143 (Terminated)
Dec 05 12:03:17 compute-0 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [WARNING]  (218972) : All workers exited. Exiting... (0)
Dec 05 12:03:17 compute-0 systemd[1]: libpod-1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5.scope: Deactivated successfully.
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.108 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:03:17 compute-0 podman[220210]: 2025-12-05 12:03:17.114281506 +0000 UTC m=+0.047715594 container died 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.121 187212 INFO nova.virt.libvirt.driver [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance destroyed successfully.
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.121 187212 DEBUG nova.objects.instance [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'resources' on Instance uuid bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.145 187212 DEBUG nova.virt.libvirt.vif [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(25),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-479694898',id=29,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=25,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-70canao4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.147 187212 DEBUG nova.network.os_vif_util [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.148 187212 DEBUG nova.network.os_vif_util [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.148 187212 DEBUG os_vif [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.150 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5ee44c8-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.151 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.153 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.155 187212 INFO os_vif [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34')
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.156 187212 INFO nova.virt.libvirt.driver [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Deleting instance files /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77_del
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.157 187212 INFO nova.virt.libvirt.driver [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Deletion of /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77_del complete
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.181 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.182 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.189 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.190 187212 INFO nova.compute.claims [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:03:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5-userdata-shm.mount: Deactivated successfully.
Dec 05 12:03:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-61c2e9c401d8673a2e5697d0fcec7ebf4e56c09bb215df29178f1e818f6cd815-merged.mount: Deactivated successfully.
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.241 187212 INFO nova.compute.manager [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Took 0.41 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.242 187212 DEBUG oslo.service.loopingcall [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.242 187212 DEBUG nova.compute.manager [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.242 187212 DEBUG nova.network.neutron [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:17 compute-0 podman[220210]: 2025-12-05 12:03:17.248440838 +0000 UTC m=+0.181874936 container cleanup 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 12:03:17 compute-0 systemd[1]: libpod-conmon-1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5.scope: Deactivated successfully.
Dec 05 12:03:17 compute-0 podman[220261]: 2025-12-05 12:03:17.308508554 +0000 UTC m=+0.038340146 container remove 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.312 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Successfully updated port: 7022c257-2ab5-436e-9757-387e9de66b18 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:03:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[187c5310-6073-4d07-9479-5a6d2bc97470]: (4, ('Fri Dec  5 12:03:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 (1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5)\n1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5\nFri Dec  5 12:03:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 (1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5)\n1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.316 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9713152f-80dd-4f2c-a8b1-ab53510e2012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.317 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap393d33f9-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:17 compute-0 kernel: tap393d33f9-20: left promiscuous mode
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.330 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.330 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.331 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0143b9aa-6cb0-439e-8519-5d66b15e2844]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.356 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0652fc-58cc-4e40-ae61-53d8b6920aef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.359 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1484b01c-839c-4b03-b929-f4959e58ce14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.372 187212 DEBUG nova.compute.provider_tree [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.384 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91d65e43-153a-4d15-88a9-1c9d414693d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354680, 'reachable_time': 37899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220276, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d393d33f9\x2d2dde\x2d4fb5\x2db5db\x2d3f0fb98d4637.mount: Deactivated successfully.
Dec 05 12:03:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.388 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:03:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.388 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2aecd1-d47e-4fa2-8bda-03099d94f26a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.389 187212 DEBUG nova.scheduler.client.report [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.416 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.417 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.459 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.459 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.481 187212 INFO nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.498 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.543 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:17 compute-0 ovn_controller[95610]: 2025-12-05T12:03:17Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:8a:0c 10.100.0.12
Dec 05 12:03:17 compute-0 ovn_controller[95610]: 2025-12-05T12:03:17Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:8a:0c 10.100.0.12
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.588 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.589 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.590 187212 INFO nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Creating image(s)
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.590 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.590 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.591 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.607 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.674 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.675 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.676 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.688 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.716 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.748 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.749 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.770 187212 DEBUG nova.policy [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '18569d5748e8448fbd1bcbf5d37ff5f6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70409a2f9710408cb377a61250853fbd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.785 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.785 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.786 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.860 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.861 187212 DEBUG nova.virt.disk.api [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Checking if we can resize image /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.861 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.919 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.921 187212 DEBUG nova.virt.disk.api [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Cannot resize image /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.922 187212 DEBUG nova.objects.instance [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lazy-loading 'migration_context' on Instance uuid 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.939 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.940 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Ensure instance console log exists: /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.940 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.940 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:17 compute-0 nova_compute[187208]: 2025-12-05 12:03:17.941 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:18 compute-0 nova_compute[187208]: 2025-12-05 12:03:18.888 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:18 compute-0 nova_compute[187208]: 2025-12-05 12:03:18.888 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:18 compute-0 nova_compute[187208]: 2025-12-05 12:03:18.908 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:03:18 compute-0 nova_compute[187208]: 2025-12-05 12:03:18.985 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:18 compute-0 nova_compute[187208]: 2025-12-05 12:03:18.985 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:18 compute-0 nova_compute[187208]: 2025-12-05 12:03:18.986 187212 DEBUG nova.network.neutron [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:18 compute-0 nova_compute[187208]: 2025-12-05 12:03:18.996 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:03:18 compute-0 nova_compute[187208]: 2025-12-05 12:03:18.996 187212 INFO nova.compute.claims [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.010 187212 INFO nova.compute.manager [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Took 1.77 seconds to deallocate network for instance.
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.093 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.364 187212 DEBUG nova.compute.provider_tree [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.383 187212 DEBUG nova.scheduler.client.report [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.411 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.412 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.414 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.433 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.506 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.507 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.534 187212 INFO nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.558 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Successfully created port: a35b6b13-07bc-4c91-aaf5-231163a6ea44 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.574 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.668 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [{"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.685 187212 DEBUG nova.compute.provider_tree [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.701 187212 DEBUG nova.scheduler.client.report [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.711 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.712 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance network_info: |[{"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.717 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start _get_guest_xml network_info=[{"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.722 187212 WARNING nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.729 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.735 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.737 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.738 187212 INFO nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Creating image(s)
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.739 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.740 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.741 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.762 187212 DEBUG nova.virt.libvirt.host [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.764 187212 DEBUG nova.virt.libvirt.host [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.766 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.794 187212 INFO nova.scheduler.client.report [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Deleted allocations for instance bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.801 187212 DEBUG nova.virt.libvirt.host [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.802 187212 DEBUG nova.virt.libvirt.host [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.802 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.803 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.803 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.803 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.805 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.805 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.809 187212 DEBUG nova.virt.libvirt.vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1018127156',display_name='tempest-DeleteServersTestJSON-server-1018127156',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1018127156',id=34,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-oru1ft1x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:14Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=478fa005-452c-4e37-a919-63bb734a3c5c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.810 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.811 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.812 187212 DEBUG nova.objects.instance [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid 478fa005-452c-4e37-a919-63bb734a3c5c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.832 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <uuid>478fa005-452c-4e37-a919-63bb734a3c5c</uuid>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <name>instance-00000022</name>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <nova:name>tempest-DeleteServersTestJSON-server-1018127156</nova:name>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:03:19</nova:creationTime>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:03:19 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:03:19 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:03:19 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:03:19 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:03:19 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:03:19 compute-0 nova_compute[187208]:         <nova:user uuid="ff425b7b04144f93a2c15e3a347fc15c">tempest-DeleteServersTestJSON-554028480-project-member</nova:user>
Dec 05 12:03:19 compute-0 nova_compute[187208]:         <nova:project uuid="4671f6c82ea049fab3a314ecf45b7656">tempest-DeleteServersTestJSON-554028480</nova:project>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:03:19 compute-0 nova_compute[187208]:         <nova:port uuid="7022c257-2ab5-436e-9757-387e9de66b18">
Dec 05 12:03:19 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <system>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <entry name="serial">478fa005-452c-4e37-a919-63bb734a3c5c</entry>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <entry name="uuid">478fa005-452c-4e37-a919-63bb734a3c5c</entry>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     </system>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <os>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   </os>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <features>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   </features>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.config"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:57:8e:0e"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <target dev="tap7022c257-2a"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/console.log" append="off"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <video>
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     </video>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:03:19 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:03:19 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:03:19 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:03:19 compute-0 nova_compute[187208]: </domain>
Dec 05 12:03:19 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.832 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Preparing to wait for external event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.833 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.833 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.833 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.834 187212 DEBUG nova.virt.libvirt.vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1018127156',display_name='tempest-DeleteServersTestJSON-server-1018127156',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1018127156',id=34,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-oru1ft1x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:14Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=478fa005-452c-4e37-a919-63bb734a3c5c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.834 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.835 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.835 187212 DEBUG os_vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.838 187212 DEBUG nova.policy [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.840 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.840 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.841 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.845 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.845 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7022c257-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.845 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7022c257-2a, col_values=(('external_ids', {'iface-id': '7022c257-2ab5-436e-9757-387e9de66b18', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:8e:0e', 'vm-uuid': '478fa005-452c-4e37-a919-63bb734a3c5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.847 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:19 compute-0 NetworkManager[55691]: <info>  [1764936199.8488] manager: (tap7022c257-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.853 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.854 187212 INFO os_vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a')
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.869 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.869 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.870 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.882 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.907 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.946 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:19 compute-0 nova_compute[187208]: 2025-12-05 12:03:19.947 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.081 187212 DEBUG nova.compute.manager [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-unplugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.081 187212 DEBUG oslo_concurrency.lockutils [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.081 187212 DEBUG oslo_concurrency.lockutils [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.082 187212 DEBUG oslo_concurrency.lockutils [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.082 187212 DEBUG nova.compute.manager [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] No waiting events found dispatching network-vif-unplugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.082 187212 WARNING nova.compute.manager [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received unexpected event network-vif-unplugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff for instance with vm_state deleted and task_state None.
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.123 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.124 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.124 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No VIF found with MAC fa:16:3e:57:8e:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.125 187212 INFO nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Using config drive
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.327 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936185.3261204, 2f42f732-65c6-4c4a-9332-47098d7350b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.327 187212 INFO nova.compute.manager [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] VM Stopped (Lifecycle Event)
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.358 187212 DEBUG nova.compute.manager [None req-6288a850-328f-41de-bbcd-77e52ec5cbf5 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.393 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk 1073741824" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.393 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.394 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.461 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.462 187212 DEBUG nova.virt.disk.api [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.463 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.523 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.524 187212 DEBUG nova.virt.disk.api [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.524 187212 DEBUG nova.objects.instance [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.541 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.542 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Ensure instance console log exists: /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.542 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.543 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.543 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.871 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Successfully created port: ecec1a41-6f3e-4852-8cdb-9d461eded987 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.905 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Successfully updated port: a35b6b13-07bc-4c91-aaf5-231163a6ea44 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.928 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.928 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquired lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:20 compute-0 nova_compute[187208]: 2025-12-05 12:03:20.928 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:03:21 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.111 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:03:21 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.522 187212 INFO nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Creating config drive at /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.config
Dec 05 12:03:21 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.529 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj3w8vgx5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:21 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.655 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj3w8vgx5" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:21 compute-0 kernel: tap7022c257-2a: entered promiscuous mode
Dec 05 12:03:21 compute-0 NetworkManager[55691]: <info>  [1764936201.7208] manager: (tap7022c257-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Dec 05 12:03:21 compute-0 ovn_controller[95610]: 2025-12-05T12:03:21Z|00259|binding|INFO|Claiming lport 7022c257-2ab5-436e-9757-387e9de66b18 for this chassis.
Dec 05 12:03:21 compute-0 ovn_controller[95610]: 2025-12-05T12:03:21Z|00260|binding|INFO|7022c257-2ab5-436e-9757-387e9de66b18: Claiming fa:16:3e:57:8e:0e 10.100.0.4
Dec 05 12:03:21 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.721 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:21 compute-0 ovn_controller[95610]: 2025-12-05T12:03:21Z|00261|binding|INFO|Setting lport 7022c257-2ab5-436e-9757-387e9de66b18 ovn-installed in OVS
Dec 05 12:03:21 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.733 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:21 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.739 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:21 compute-0 systemd-udevd[220325]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:21 compute-0 systemd-machined[153543]: New machine qemu-38-instance-00000022.
Dec 05 12:03:21 compute-0 NetworkManager[55691]: <info>  [1764936201.7621] device (tap7022c257-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:03:21 compute-0 NetworkManager[55691]: <info>  [1764936201.7631] device (tap7022c257-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:03:21 compute-0 systemd[1]: Started Virtual Machine qemu-38-instance-00000022.
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.841 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:8e:0e 10.100.0.4'], port_security=['fa:16:3e:57:8e:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '478fa005-452c-4e37-a919-63bb734a3c5c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7022c257-2ab5-436e-9757-387e9de66b18) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:21 compute-0 ovn_controller[95610]: 2025-12-05T12:03:21Z|00262|binding|INFO|Setting lport 7022c257-2ab5-436e-9757-387e9de66b18 up in Southbound
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.842 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7022c257-2ab5-436e-9757-387e9de66b18 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 bound to our chassis
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.845 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.861 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c85c3c9-7167-4be7-ad85-ef605ad4b96b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.862 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7360f84-b1 in ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.864 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7360f84-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.865 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c96b232c-0a71-4984-be48-835fe33dbe94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.865 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4bfa64d7-7e1c-434f-aa07-2a0a49d1f2ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.879 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c6798d97-86b4-41fb-b94f-3da1b1610024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.905 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e13e184a-a6b9-48f2-9b1d-febc5ee87c6e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.937 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1b80c5-0424-412f-a988-157b9994151f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.942 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed84fbf4-1cb2-4651-bbc8-ab73046f622f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:21 compute-0 NetworkManager[55691]: <info>  [1764936201.9436] manager: (tapd7360f84-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Dec 05 12:03:21 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.977 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Successfully updated port: ecec1a41-6f3e-4852-8cdb-9d461eded987 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.978 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[74e1e546-f343-44cc-88ad-9c5320f70140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.981 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6d65b9ed-a376-4b5f-8383-25740cdb586e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:21 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.999 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:21.999 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.000 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:03:22 compute-0 NetworkManager[55691]: <info>  [1764936202.0042] device (tapd7360f84-b0): carrier: link connected
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.010 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[804e9388-f52b-48c7-ba1c-a5cae8ca0ffa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.024 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b6f9d6a-ba23-43b5-813f-6c4cbb49f98f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358256, 'reachable_time': 37299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220367, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.040 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4e49f2d4-eef3-40f3-b2c4-26ca087e3b1e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:2b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358256, 'tstamp': 358256}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220368, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.058 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4883f56e-a4ef-46f9-8fdc-f56d4ebf01c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358256, 'reachable_time': 37299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220369, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.085 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[99aa8a0d-392c-404f-a9e8-4776c5b70758]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.138 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Updating instance_info_cache with network_info: [{"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.139 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f68db8d4-4007-4aa9-90c8-a8a26ba98a8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.141 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.141 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.142 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7360f84-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.145 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:22 compute-0 NetworkManager[55691]: <info>  [1764936202.1464] manager: (tapd7360f84-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Dec 05 12:03:22 compute-0 kernel: tapd7360f84-b0: entered promiscuous mode
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.148 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7360f84-b0, col_values=(('external_ids', {'iface-id': 'd85bc323-c3ce-47e3-ac1f-d5f27467a4e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.149 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:22 compute-0 ovn_controller[95610]: 2025-12-05T12:03:22Z|00263|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.151 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.152 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.153 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6f535c-75a9-4d92-b84e-485dcaff9264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.154 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:03:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.156 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'env', 'PROCESS_TAG=haproxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.164 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936202.1635714, 478fa005-452c-4e37-a919-63bb734a3c5c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.164 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] VM Started (Lifecycle Event)
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.166 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Releasing lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.166 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance network_info: |[{"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.168 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start _get_guest_xml network_info=[{"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.171 187212 WARNING nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.177 187212 DEBUG nova.virt.libvirt.host [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.177 187212 DEBUG nova.virt.libvirt.host [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.180 187212 DEBUG nova.virt.libvirt.host [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.180 187212 DEBUG nova.virt.libvirt.host [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.181 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.181 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.182 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.182 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.182 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.182 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.183 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.183 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.183 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.183 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.184 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.184 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.188 187212 DEBUG nova.virt.libvirt.vif [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-378209780',display_name='tempest-InstanceActionsNegativeTestJSON-server-378209780',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-378209780',id=35,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70409a2f9710408cb377a61250853fbd',ramdisk_id='',reservation_id='r-1rqpjj9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1806311246',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1806311246-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:17Z,user_data=None,user_id='18569d5748e8448fbd1bcbf5d37ff5f6',uuid=05008cd8-8cac-482b-9ff8-68f2f0aaa6d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.189 187212 DEBUG nova.network.os_vif_util [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converting VIF {"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.190 187212 DEBUG nova.network.os_vif_util [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.191 187212 DEBUG nova.objects.instance [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lazy-loading 'pci_devices' on Instance uuid 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.194 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.197 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.202 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936202.1636398, 478fa005-452c-4e37-a919-63bb734a3c5c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.202 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] VM Paused (Lifecycle Event)
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.217 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <uuid>05008cd8-8cac-482b-9ff8-68f2f0aaa6d4</uuid>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <name>instance-00000023</name>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <nova:name>tempest-InstanceActionsNegativeTestJSON-server-378209780</nova:name>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:03:22</nova:creationTime>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:03:22 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:03:22 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:03:22 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:03:22 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:03:22 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:03:22 compute-0 nova_compute[187208]:         <nova:user uuid="18569d5748e8448fbd1bcbf5d37ff5f6">tempest-InstanceActionsNegativeTestJSON-1806311246-project-member</nova:user>
Dec 05 12:03:22 compute-0 nova_compute[187208]:         <nova:project uuid="70409a2f9710408cb377a61250853fbd">tempest-InstanceActionsNegativeTestJSON-1806311246</nova:project>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:03:22 compute-0 nova_compute[187208]:         <nova:port uuid="a35b6b13-07bc-4c91-aaf5-231163a6ea44">
Dec 05 12:03:22 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <system>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <entry name="serial">05008cd8-8cac-482b-9ff8-68f2f0aaa6d4</entry>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <entry name="uuid">05008cd8-8cac-482b-9ff8-68f2f0aaa6d4</entry>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     </system>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <os>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   </os>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <features>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   </features>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.config"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:91:a5:f2"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <target dev="tapa35b6b13-07"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/console.log" append="off"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <video>
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     </video>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:03:22 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:03:22 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:03:22 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:03:22 compute-0 nova_compute[187208]: </domain>
Dec 05 12:03:22 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.218 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Preparing to wait for external event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.218 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.219 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.219 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.220 187212 DEBUG nova.virt.libvirt.vif [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-378209780',display_name='tempest-InstanceActionsNegativeTestJSON-server-378209780',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-378209780',id=35,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70409a2f9710408cb377a61250853fbd',ramdisk_id='',reservation_id='r-1rqpjj9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1806311246',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1806311246-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:17Z,user_data=None,user_id='18569d5748e8448fbd1bcbf5d37ff5f6',uuid=05008cd8-8cac-482b-9ff8-68f2f0aaa6d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.220 187212 DEBUG nova.network.os_vif_util [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converting VIF {"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.221 187212 DEBUG nova.network.os_vif_util [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.221 187212 DEBUG os_vif [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.223 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.223 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.223 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.224 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.233 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa35b6b13-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.234 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa35b6b13-07, col_values=(('external_ids', {'iface-id': 'a35b6b13-07bc-4c91-aaf5-231163a6ea44', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:a5:f2', 'vm-uuid': '05008cd8-8cac-482b-9ff8-68f2f0aaa6d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:22 compute-0 NetworkManager[55691]: <info>  [1764936202.2372] manager: (tapa35b6b13-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.236 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.240 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: deleting, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.243 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.245 187212 INFO os_vif [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07')
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.271 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] During sync_power_state the instance has a pending task (deleting). Skip.
Dec 05 12:03:22 compute-0 podman[220382]: 2025-12-05 12:03:22.341674387 +0000 UTC m=+0.057192474 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Dec 05 12:03:22 compute-0 podman[220383]: 2025-12-05 12:03:22.379113117 +0000 UTC m=+0.089544559 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 05 12:03:22 compute-0 nova_compute[187208]: 2025-12-05 12:03:22.545 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:22 compute-0 podman[220444]: 2025-12-05 12:03:22.498721813 +0000 UTC m=+0.026285852 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.046 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.048 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.049 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.049 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.049 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Processing event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.050 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.050 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.051 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.051 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.051 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] No waiting events found dispatching network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.052 187212 WARNING nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received unexpected event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 for instance with vm_state building and task_state spawning.
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.052 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-changed-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.052 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Refreshing instance network info cache due to event network-changed-7022c257-2ab5-436e-9757-387e9de66b18. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.053 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.053 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.053 187212 DEBUG nova.network.neutron [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Refreshing network info cache for port 7022c257-2ab5-436e-9757-387e9de66b18 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.055 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.060 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936203.0599492, d2085dd9-2ebd-4804-99c1-3b15cbd216f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.060 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] VM Resumed (Lifecycle Event)
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.062 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.066 187212 INFO nova.virt.libvirt.driver [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance spawned successfully.
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.067 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.084 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.088 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.092 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.092 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.093 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.093 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.094 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.094 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.113 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Updating instance_info_cache with network_info: [{"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.124 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.146 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.146 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance network_info: |[{"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.149 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start _get_guest_xml network_info=[{"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.154 187212 WARNING nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.157 187212 INFO nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Took 17.53 seconds to spawn the instance on the hypervisor.
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.157 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.164 187212 DEBUG nova.virt.libvirt.host [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.165 187212 DEBUG nova.virt.libvirt.host [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.168 187212 DEBUG nova.virt.libvirt.host [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.169 187212 DEBUG nova.virt.libvirt.host [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.169 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.169 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.170 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.170 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.170 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.170 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.171 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.171 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.171 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.171 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.172 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.172 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.176 187212 DEBUG nova.virt.libvirt.vif [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1049650520',display_name='tempest-ImagesTestJSON-server-1049650520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1049650520',id=36,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-kquxoeat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:19Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=c1e2f189-1777-4f28-97ab-72cf0f60fbc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.176 187212 DEBUG nova.network.os_vif_util [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.177 187212 DEBUG nova.network.os_vif_util [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.178 187212 DEBUG nova.objects.instance [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.342 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <uuid>c1e2f189-1777-4f28-97ab-72cf0f60fbc0</uuid>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <name>instance-00000024</name>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesTestJSON-server-1049650520</nova:name>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:03:23</nova:creationTime>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:03:23 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:03:23 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:03:23 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:03:23 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:03:23 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:03:23 compute-0 nova_compute[187208]:         <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec 05 12:03:23 compute-0 nova_compute[187208]:         <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:03:23 compute-0 nova_compute[187208]:         <nova:port uuid="ecec1a41-6f3e-4852-8cdb-9d461eded987">
Dec 05 12:03:23 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <system>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <entry name="serial">c1e2f189-1777-4f28-97ab-72cf0f60fbc0</entry>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <entry name="uuid">c1e2f189-1777-4f28-97ab-72cf0f60fbc0</entry>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     </system>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <os>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   </os>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <features>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   </features>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.config"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:57:88:7f"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <target dev="tapecec1a41-6f"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/console.log" append="off"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <video>
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     </video>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:03:23 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:03:23 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:03:23 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:03:23 compute-0 nova_compute[187208]: </domain>
Dec 05 12:03:23 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.343 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Preparing to wait for external event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.343 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.344 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.344 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.345 187212 DEBUG nova.virt.libvirt.vif [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1049650520',display_name='tempest-ImagesTestJSON-server-1049650520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1049650520',id=36,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-kquxoeat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:19Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=c1e2f189-1777-4f28-97ab-72cf0f60fbc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.345 187212 DEBUG nova.network.os_vif_util [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.346 187212 DEBUG nova.network.os_vif_util [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.346 187212 DEBUG os_vif [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.347 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.347 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.347 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.350 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.350 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapecec1a41-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.351 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapecec1a41-6f, col_values=(('external_ids', {'iface-id': 'ecec1a41-6f3e-4852-8cdb-9d461eded987', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:88:7f', 'vm-uuid': 'c1e2f189-1777-4f28-97ab-72cf0f60fbc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.352 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:23 compute-0 NetworkManager[55691]: <info>  [1764936203.3539] manager: (tapecec1a41-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.354 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.362 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.364 187212 INFO os_vif [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f')
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.534 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.535 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.535 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] No VIF found with MAC fa:16:3e:91:a5:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.536 187212 INFO nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Using config drive
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.546 187212 INFO nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Took 18.41 seconds to build instance.
Dec 05 12:03:23 compute-0 podman[220444]: 2025-12-05 12:03:23.799870387 +0000 UTC m=+1.327434426 container create 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.816 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.823 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.824 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.824 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:57:88:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:03:23 compute-0 nova_compute[187208]: 2025-12-05 12:03:23.825 187212 INFO nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Using config drive
Dec 05 12:03:24 compute-0 systemd[1]: Started libpod-conmon-8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43.scope.
Dec 05 12:03:24 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:03:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ded9da6901c69416b75c147b588f572cf065df3fa0a5ca976e39b2a29f8769f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:03:24 compute-0 podman[220444]: 2025-12-05 12:03:24.230407185 +0000 UTC m=+1.757971234 container init 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:03:24 compute-0 podman[220444]: 2025-12-05 12:03:24.237105396 +0000 UTC m=+1.764669405 container start 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 12:03:24 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [NOTICE]   (220467) : New worker (220469) forked
Dec 05 12:03:24 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [NOTICE]   (220467) : Loading success.
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.422 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.423 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.423 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.424 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.424 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] No waiting events found dispatching network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.425 187212 WARNING nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received unexpected event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff for instance with vm_state deleted and task_state None.
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.425 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-deleted-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.426 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-changed-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.426 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Refreshing instance network info cache due to event network-changed-a35b6b13-07bc-4c91-aaf5-231163a6ea44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.427 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.427 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.428 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Refreshing network info cache for port a35b6b13-07bc-4c91-aaf5-231163a6ea44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.549 187212 INFO nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Creating config drive at /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.config
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.554 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpilfh4oqp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.591 187212 INFO nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Creating config drive at /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.config
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.598 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplqla7n8s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.704 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpilfh4oqp" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.727 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplqla7n8s" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:25 compute-0 kernel: tapa35b6b13-07: entered promiscuous mode
Dec 05 12:03:25 compute-0 kernel: tapecec1a41-6f: entered promiscuous mode
Dec 05 12:03:25 compute-0 NetworkManager[55691]: <info>  [1764936205.8103] manager: (tapa35b6b13-07): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Dec 05 12:03:25 compute-0 NetworkManager[55691]: <info>  [1764936205.8127] manager: (tapecec1a41-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.814 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:25 compute-0 ovn_controller[95610]: 2025-12-05T12:03:25Z|00264|binding|INFO|Claiming lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 for this chassis.
Dec 05 12:03:25 compute-0 ovn_controller[95610]: 2025-12-05T12:03:25Z|00265|binding|INFO|a35b6b13-07bc-4c91-aaf5-231163a6ea44: Claiming fa:16:3e:91:a5:f2 10.100.0.10
Dec 05 12:03:25 compute-0 ovn_controller[95610]: 2025-12-05T12:03:25Z|00266|binding|INFO|Claiming lport ecec1a41-6f3e-4852-8cdb-9d461eded987 for this chassis.
Dec 05 12:03:25 compute-0 ovn_controller[95610]: 2025-12-05T12:03:25Z|00267|binding|INFO|ecec1a41-6f3e-4852-8cdb-9d461eded987: Claiming fa:16:3e:57:88:7f 10.100.0.5
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.832 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:88:7f 10.100.0.5'], port_security=['fa:16:3e:57:88:7f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c1e2f189-1777-4f28-97ab-72cf0f60fbc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ecec1a41-6f3e-4852-8cdb-9d461eded987) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.834 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:a5:f2 10.100.0.10'], port_security=['fa:16:3e:91:a5:f2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '05008cd8-8cac-482b-9ff8-68f2f0aaa6d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5223579-477c-4fbe-a58c-2e56f428541c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70409a2f9710408cb377a61250853fbd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cfbac9bb-a6fa-4e30-b1e0-c07877ef21de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85bd0838-1b85-4e8e-bf67-d21df8aa9251, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a35b6b13-07bc-4c91-aaf5-231163a6ea44) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.836 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ecec1a41-6f3e-4852-8cdb-9d461eded987 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.838 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:03:25 compute-0 systemd-udevd[220508]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:25 compute-0 systemd-udevd[220507]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:25 compute-0 ovn_controller[95610]: 2025-12-05T12:03:25Z|00268|binding|INFO|Setting lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 ovn-installed in OVS
Dec 05 12:03:25 compute-0 ovn_controller[95610]: 2025-12-05T12:03:25Z|00269|binding|INFO|Setting lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 up in Southbound
Dec 05 12:03:25 compute-0 ovn_controller[95610]: 2025-12-05T12:03:25Z|00270|binding|INFO|Setting lport ecec1a41-6f3e-4852-8cdb-9d461eded987 ovn-installed in OVS
Dec 05 12:03:25 compute-0 ovn_controller[95610]: 2025-12-05T12:03:25Z|00271|binding|INFO|Setting lport ecec1a41-6f3e-4852-8cdb-9d461eded987 up in Southbound
Dec 05 12:03:25 compute-0 nova_compute[187208]: 2025-12-05 12:03:25.847 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:25 compute-0 NetworkManager[55691]: <info>  [1764936205.8570] device (tapa35b6b13-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.853 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9351a3c9-6c59-40fa-90a2-521b22e167c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.854 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:03:25 compute-0 NetworkManager[55691]: <info>  [1764936205.8579] device (tapa35b6b13-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:03:25 compute-0 NetworkManager[55691]: <info>  [1764936205.8584] device (tapecec1a41-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:03:25 compute-0 NetworkManager[55691]: <info>  [1764936205.8589] device (tapecec1a41-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.860 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.860 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35a871f8-e2fc-46ea-adc6-abf251ad251e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.861 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a6b21d-cd64-4104-b60d-41cb279b4af1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.877 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[44b7b886-d65d-454f-8888-5459cb6596b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:25 compute-0 systemd-machined[153543]: New machine qemu-40-instance-00000024.
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.902 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f82b5f6-7be3-451f-8d25-7019cf5c9b66]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:25 compute-0 systemd[1]: Started Virtual Machine qemu-40-instance-00000024.
Dec 05 12:03:25 compute-0 systemd-machined[153543]: New machine qemu-39-instance-00000023.
Dec 05 12:03:25 compute-0 systemd[1]: Started Virtual Machine qemu-39-instance-00000023.
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.931 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[547d3c23-693c-4dc8-8d48-28a30e046455]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:25 compute-0 NetworkManager[55691]: <info>  [1764936205.9386] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.942 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a57b26b7-884e-4b12-b33c-4db2b75e67e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.978 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6d682f75-0e00-4995-b155-37231bf6cd02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.981 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6d591a16-2366-4af4-b9ad-296fa95c2873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:26 compute-0 NetworkManager[55691]: <info>  [1764936206.0049] device (tap41b3b495-c0): carrier: link connected
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.010 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[922d7a46-9880-4190-90e4-f72185796573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.024 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f4cf4ddb-c994-4104-9820-0469f81c539f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358656, 'reachable_time': 39414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220553, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.039 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef11c15-c88f-4373-a93b-dd2342caf725]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358656, 'tstamp': 358656}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220554, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.053 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bc738682-3f58-412e-8029-845fa210aff1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358656, 'reachable_time': 39414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220555, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.083 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d75b1964-8dff-4a7a-9194-ade635535dc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.152 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e112b7-df07-443a-93f8-83830ec3468a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.154 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.154 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.155 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:26 compute-0 NetworkManager[55691]: <info>  [1764936206.1579] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.157 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:26 compute-0 kernel: tap41b3b495-c0: entered promiscuous mode
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.163 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:26 compute-0 ovn_controller[95610]: 2025-12-05T12:03:26Z|00272|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.167 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.180 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eec68ef8-50f8-4101-ae66-75c469515f85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.182 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.183 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:03:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.185 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.211 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936206.2104788, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.211 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Started (Lifecycle Event)
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.236 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.243 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936206.2115746, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.243 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Paused (Lifecycle Event)
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.265 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.268 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.288 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.289 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936206.287661, 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.289 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] VM Started (Lifecycle Event)
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.310 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.313 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936206.2877815, 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.314 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] VM Paused (Lifecycle Event)
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.335 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.338 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:26 compute-0 nova_compute[187208]: 2025-12-05 12:03:26.360 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:26 compute-0 podman[220598]: 2025-12-05 12:03:26.567046288 +0000 UTC m=+0.028315900 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:03:26 compute-0 podman[220598]: 2025-12-05 12:03:26.896298282 +0000 UTC m=+0.357567864 container create f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:03:26 compute-0 systemd[1]: Started libpod-conmon-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7.scope.
Dec 05 12:03:26 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:03:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1e943a749671a1b77c4cdf6fa31b99e20bf9ab4dc014c673f02b47bd056cb36/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:03:27 compute-0 podman[220598]: 2025-12-05 12:03:27.033079368 +0000 UTC m=+0.494348970 container init f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:03:27 compute-0 podman[220598]: 2025-12-05 12:03:27.046223294 +0000 UTC m=+0.507492886 container start f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 05 12:03:27 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [NOTICE]   (220617) : New worker (220619) forked
Dec 05 12:03:27 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [NOTICE]   (220617) : Loading success.
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.144 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a35b6b13-07bc-4c91-aaf5-231163a6ea44 in datapath f5223579-477c-4fbe-a58c-2e56f428541c unbound from our chassis
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.146 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5223579-477c-4fbe-a58c-2e56f428541c
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.155 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ff6879cb-e397-4590-b291-ba3154ae5f0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.156 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf5223579-41 in ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.158 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf5223579-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.158 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[08c02eee-3fec-4255-a2a0-4121ecb22b9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.159 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e5aa68f7-6701-4b63-abea-2c25ef7bd19b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.171 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[09731575-037f-4513-b992-a404974e3418]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.187 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f2f2fd3-0841-472b-9516-2f1bead1970b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.225 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d4443f-e937-4e91-99c6-e924c698ac5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.233 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2af2aae0-32c6-4723-a359-20f97e07dec3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 NetworkManager[55691]: <info>  [1764936207.2342] manager: (tapf5223579-40): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.264 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[034b702b-91ea-4c5e-af34-e1558701446c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.266 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2fcad417-5231-4118-817e-badc1fb9b809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 NetworkManager[55691]: <info>  [1764936207.2869] device (tapf5223579-40): carrier: link connected
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.290 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3f6e47-0246-44a1-aa0b-b58c840a268a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.308 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c88e22e-b453-4486-a59d-2806c09b78fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5223579-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:8e:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358785, 'reachable_time': 37517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220638, 'error': None, 'target': 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.320 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[538657de-66bc-417b-8228-2e7b666fd1a4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:8e7d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358785, 'tstamp': 358785}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220639, 'error': None, 'target': 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a8856616-704e-4b47-b961-ae01151f8661]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5223579-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:8e:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358785, 'reachable_time': 37517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220640, 'error': None, 'target': 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.367 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[064b3a3b-6f50-4810-b04b-9a053d8d33c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.417 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4b321ccb-e8db-4589-a17f-a874dfad19ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.418 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5223579-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.419 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.419 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5223579-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:27 compute-0 kernel: tapf5223579-40: entered promiscuous mode
Dec 05 12:03:27 compute-0 nova_compute[187208]: 2025-12-05 12:03:27.450 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:27 compute-0 NetworkManager[55691]: <info>  [1764936207.4533] manager: (tapf5223579-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Dec 05 12:03:27 compute-0 nova_compute[187208]: 2025-12-05 12:03:27.454 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.455 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5223579-40, col_values=(('external_ids', {'iface-id': '41d0fbd3-22b2-4ee9-8c84-9f176e5ee865'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:27 compute-0 nova_compute[187208]: 2025-12-05 12:03:27.456 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:27 compute-0 ovn_controller[95610]: 2025-12-05T12:03:27Z|00273|binding|INFO|Releasing lport 41d0fbd3-22b2-4ee9-8c84-9f176e5ee865 from this chassis (sb_readonly=0)
Dec 05 12:03:27 compute-0 nova_compute[187208]: 2025-12-05 12:03:27.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.472 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f5223579-477c-4fbe-a58c-2e56f428541c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f5223579-477c-4fbe-a58c-2e56f428541c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.473 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ec826654-6804-4387-85ce-e05977c26caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.474 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-f5223579-477c-4fbe-a58c-2e56f428541c
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/f5223579-477c-4fbe-a58c-2e56f428541c.pid.haproxy
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID f5223579-477c-4fbe-a58c-2e56f428541c
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:03:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.475 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'env', 'PROCESS_TAG=haproxy-f5223579-477c-4fbe-a58c-2e56f428541c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f5223579-477c-4fbe-a58c-2e56f428541c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:03:27 compute-0 nova_compute[187208]: 2025-12-05 12:03:27.548 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:27 compute-0 nova_compute[187208]: 2025-12-05 12:03:27.807 187212 DEBUG nova.network.neutron [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updated VIF entry in instance network info cache for port 7022c257-2ab5-436e-9757-387e9de66b18. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:03:27 compute-0 nova_compute[187208]: 2025-12-05 12:03:27.808 187212 DEBUG nova.network.neutron [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [{"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:27 compute-0 nova_compute[187208]: 2025-12-05 12:03:27.834 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:27 compute-0 podman[220673]: 2025-12-05 12:03:27.832792161 +0000 UTC m=+0.023153043 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:03:27 compute-0 podman[220673]: 2025-12-05 12:03:27.983871726 +0000 UTC m=+0.174232578 container create 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 05 12:03:28 compute-0 systemd[1]: Started libpod-conmon-80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a.scope.
Dec 05 12:03:28 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:03:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78756f8a0c787d9c7273ba58bfae472f50372e3fa267a0cd0bfee40b23f73738/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:03:28 compute-0 podman[220673]: 2025-12-05 12:03:28.153644155 +0000 UTC m=+0.344005017 container init 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.155 187212 DEBUG nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.155 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.156 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.156 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:28 compute-0 podman[220686]: 2025-12-05 12:03:28.156891998 +0000 UTC m=+0.136227952 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.156 187212 DEBUG nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Processing event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.158 187212 DEBUG nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.158 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.158 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.158 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.159 187212 DEBUG nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] No waiting events found dispatching network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.159 187212 WARNING nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received unexpected event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 for instance with vm_state building and task_state deleting.
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.160 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:03:28 compute-0 podman[220673]: 2025-12-05 12:03:28.161736857 +0000 UTC m=+0.352097709 container start 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.164 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936208.1644201, 478fa005-452c-4e37-a919-63bb734a3c5c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.164 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] VM Resumed (Lifecycle Event)
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.166 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.170 187212 INFO nova.virt.libvirt.driver [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance spawned successfully.
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.170 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:03:28 compute-0 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [NOTICE]   (220738) : New worker (220740) forked
Dec 05 12:03:28 compute-0 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [NOTICE]   (220738) : Loading success.
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.199 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.208 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: deleting, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.213 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.213 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.214 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.214 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.214 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.215 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.235 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] During sync_power_state the instance has a pending task (deleting). Skip.
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.353 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.435 187212 INFO nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Took 13.63 seconds to spawn the instance on the hypervisor.
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.436 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:28 compute-0 podman[220687]: 2025-12-05 12:03:28.44158973 +0000 UTC m=+0.416213669 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.483 187212 DEBUG nova.compute.utils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Conflict updating instance 478fa005-452c-4e37-a919-63bb734a3c5c. Expected: {'task_state': ['spawning']}. Actual: {'task_state': 'deleting'} notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.484 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance disappeared during build. _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2483
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.484 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.485 187212 DEBUG nova.virt.libvirt.vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1018127156',display_name='tempest-DeleteServersTestJSON-server-1018127156',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1018127156',id=34,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=2025-12-05T12:03:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-oru1ft1x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio
',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state=None,terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:17Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=478fa005-452c-4e37-a919-63bb734a3c5c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.485 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.486 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.486 187212 DEBUG os_vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.488 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.488 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7022c257-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.540 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:28 compute-0 kernel: tap7022c257-2a: left promiscuous mode
Dec 05 12:03:28 compute-0 NetworkManager[55691]: <info>  [1764936208.5414] device (tap7022c257-2a): state change: disconnected -> unmanaged (reason 'unmanaged-external-down', managed-type: 'external')
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.542 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:28 compute-0 ovn_controller[95610]: 2025-12-05T12:03:28Z|00274|binding|INFO|Releasing lport 7022c257-2ab5-436e-9757-387e9de66b18 from this chassis (sb_readonly=0)
Dec 05 12:03:28 compute-0 ovn_controller[95610]: 2025-12-05T12:03:28Z|00275|binding|INFO|Setting lport 7022c257-2ab5-436e-9757-387e9de66b18 down in Southbound
Dec 05 12:03:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.560 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:8e:0e 10.100.0.4'], port_security=['fa:16:3e:57:8e:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '478fa005-452c-4e37-a919-63bb734a3c5c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7022c257-2ab5-436e-9757-387e9de66b18) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.561 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7022c257-2ab5-436e-9757-387e9de66b18 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 unbound from our chassis
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.563 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.565 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.566 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3822956a-f178-469e-964b-c878e09d3965]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.567 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace which is not needed anymore
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.572 187212 INFO os_vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a')
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.573 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.573 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:28 compute-0 nova_compute[187208]: 2025-12-05 12:03:28.573 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:28 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [NOTICE]   (220467) : haproxy version is 2.8.14-c23fe91
Dec 05 12:03:28 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [NOTICE]   (220467) : path to executable is /usr/sbin/haproxy
Dec 05 12:03:28 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [WARNING]  (220467) : Exiting Master process...
Dec 05 12:03:28 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [ALERT]    (220467) : Current worker (220469) exited with code 143 (Terminated)
Dec 05 12:03:28 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [WARNING]  (220467) : All workers exited. Exiting... (0)
Dec 05 12:03:28 compute-0 systemd[1]: libpod-8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43.scope: Deactivated successfully.
Dec 05 12:03:28 compute-0 podman[220770]: 2025-12-05 12:03:28.751327037 +0000 UTC m=+0.100849131 container died 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:03:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43-userdata-shm.mount: Deactivated successfully.
Dec 05 12:03:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-1ded9da6901c69416b75c147b588f572cf065df3fa0a5ca976e39b2a29f8769f-merged.mount: Deactivated successfully.
Dec 05 12:03:29 compute-0 podman[220770]: 2025-12-05 12:03:29.095076996 +0000 UTC m=+0.444599080 container cleanup 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 12:03:29 compute-0 systemd[1]: libpod-conmon-8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43.scope: Deactivated successfully.
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.215 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Updated VIF entry in instance network info cache for port a35b6b13-07bc-4c91-aaf5-231163a6ea44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.216 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Updating instance_info_cache with network_info: [{"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.239 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.240 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-changed-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.240 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Refreshing instance network info cache due to event network-changed-ecec1a41-6f3e-4852-8cdb-9d461eded987. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.240 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.240 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.241 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Refreshing network info cache for port ecec1a41-6f3e-4852-8cdb-9d461eded987 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:03:29 compute-0 podman[220798]: 2025-12-05 12:03:29.27623156 +0000 UTC m=+0.159511067 container remove 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:03:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.281 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[00583953-3966-435e-aa11-965cc7e7ee23]: (4, ('Fri Dec  5 12:03:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43)\n8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43\nFri Dec  5 12:03:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43)\n8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.283 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7dfbcd26-62e0-4d4f-be68-513e59792677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.284 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:29 compute-0 kernel: tapd7360f84-b0: left promiscuous mode
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.286 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e35841d1-cfa7-4bda-9d48-9e4fbf52c36b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.311 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[596c5fd6-66c5-4868-8e51-540b8b38aaf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.312 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4753a636-ad36-484c-afed-31cb4e008607]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:29 compute-0 nova_compute[187208]: 2025-12-05 12:03:29.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c06cf931-eebe-4916-be15-15dfabc21fa8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358249, 'reachable_time': 28551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220811, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:29 compute-0 systemd[1]: run-netns-ovnmeta\x2dd7360f84\x2dbcd5\x2d4e64\x2dbf43\x2d1fdbd8215a70.mount: Deactivated successfully.
Dec 05 12:03:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.337 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:03:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.337 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b6497517-2e20-438f-9314-121115552c5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.263 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.286 187212 INFO nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Took 1.71 seconds to deallocate network for instance.
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.429 187212 INFO nova.scheduler.client.report [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Deleted allocations for instance 478fa005-452c-4e37-a919-63bb734a3c5c
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.430 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.430 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 13.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.430 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.430 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.431 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.432 187212 INFO nova.compute.manager [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Terminating instance
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.432 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.432 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.433 187212 DEBUG nova.network.neutron [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.612 187212 DEBUG nova.network.neutron [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.963 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Updated VIF entry in instance network info cache for port ecec1a41-6f3e-4852-8cdb-9d461eded987. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.964 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Updating instance_info_cache with network_info: [{"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:30 compute-0 nova_compute[187208]: 2025-12-05 12:03:30.984 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.046 187212 DEBUG nova.network.neutron [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.061 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.062 187212 DEBUG nova.compute.manager [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:31 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000022.scope: Deactivated successfully.
Dec 05 12:03:31 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000022.scope: Consumed 3.165s CPU time.
Dec 05 12:03:31 compute-0 systemd-machined[153543]: Machine qemu-38-instance-00000022 terminated.
Dec 05 12:03:31 compute-0 podman[220816]: 2025-12-05 12:03:31.232005043 +0000 UTC m=+0.083820915 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.338 187212 INFO nova.virt.libvirt.driver [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance destroyed successfully.
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.339 187212 DEBUG nova.objects.instance [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'resources' on Instance uuid 478fa005-452c-4e37-a919-63bb734a3c5c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.359 187212 INFO nova.virt.libvirt.driver [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Deleting instance files /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c_del
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.360 187212 INFO nova.virt.libvirt.driver [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Deletion of /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c_del complete
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.416 187212 INFO nova.compute.manager [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.417 187212 DEBUG oslo.service.loopingcall [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.417 187212 DEBUG nova.compute.manager [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.418 187212 DEBUG nova.network.neutron [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.757 187212 DEBUG nova.network.neutron [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.772 187212 DEBUG nova.network.neutron [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.787 187212 INFO nova.compute.manager [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Took 0.37 seconds to deallocate network for instance.
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.834 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:31 compute-0 nova_compute[187208]: 2025-12-05 12:03:31.836 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:32 compute-0 nova_compute[187208]: 2025-12-05 12:03:32.039 187212 DEBUG nova.compute.provider_tree [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:32 compute-0 nova_compute[187208]: 2025-12-05 12:03:32.055 187212 DEBUG nova.scheduler.client.report [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:32 compute-0 nova_compute[187208]: 2025-12-05 12:03:32.078 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:32 compute-0 nova_compute[187208]: 2025-12-05 12:03:32.119 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936197.1187773, bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:32 compute-0 nova_compute[187208]: 2025-12-05 12:03:32.120 187212 INFO nova.compute.manager [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] VM Stopped (Lifecycle Event)
Dec 05 12:03:32 compute-0 nova_compute[187208]: 2025-12-05 12:03:32.145 187212 DEBUG nova.compute.manager [None req-d0df733a-df5c-4792-bc1b-e2b262df4072 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:32 compute-0 nova_compute[187208]: 2025-12-05 12:03:32.173 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:32 compute-0 nova_compute[187208]: 2025-12-05 12:03:32.552 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.095 187212 DEBUG nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.095 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.096 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.096 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.096 187212 DEBUG nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Processing event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.096 187212 DEBUG nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.097 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.097 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.097 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.097 187212 DEBUG nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] No waiting events found dispatching network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.098 187212 WARNING nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received unexpected event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 for instance with vm_state building and task_state spawning.
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.099 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.100 187212 DEBUG nova.compute.manager [req-ce595d2d-9bc8-4b24-8274-0c4d85d58346 req-4f772a87-d60d-49f6-8f96-ca0a0cec236c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-vif-deleted-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.103 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936213.103502, 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.104 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] VM Resumed (Lifecycle Event)
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.105 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.112 187212 INFO nova.virt.libvirt.driver [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance spawned successfully.
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.112 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.160 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.161 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.161 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.162 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.162 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.163 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.168 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.172 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.372 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.437 187212 INFO nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Took 15.85 seconds to spawn the instance on the hypervisor.
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.438 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.507 187212 INFO nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Took 16.34 seconds to build instance.
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.532 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.540 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:33 compute-0 nova_compute[187208]: 2025-12-05 12:03:33.964 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:34 compute-0 nova_compute[187208]: 2025-12-05 12:03:34.654 187212 DEBUG nova.compute.manager [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:34 compute-0 nova_compute[187208]: 2025-12-05 12:03:34.724 187212 INFO nova.compute.manager [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] instance snapshotting
Dec 05 12:03:34 compute-0 nova_compute[187208]: 2025-12-05 12:03:34.947 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Beginning live snapshot process
Dec 05 12:03:35 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.151 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.213 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.216 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.278 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.291 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.345 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.347 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:35 compute-0 ovn_controller[95610]: 2025-12-05T12:03:35Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:70:f2 10.100.0.12
Dec 05 12:03:35 compute-0 ovn_controller[95610]: 2025-12-05T12:03:35Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:70:f2 10.100.0.12
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.796 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006.delta 1073741824" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.798 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:03:35 compute-0 nova_compute[187208]: 2025-12-05 12:03:35.854 187212 DEBUG nova.virt.libvirt.guest [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:03:36 compute-0 nova_compute[187208]: 2025-12-05 12:03:36.358 187212 DEBUG nova.virt.libvirt.guest [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:03:36 compute-0 nova_compute[187208]: 2025-12-05 12:03:36.364 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:03:36 compute-0 nova_compute[187208]: 2025-12-05 12:03:36.412 187212 DEBUG nova.privsep.utils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:03:36 compute-0 nova_compute[187208]: 2025-12-05 12:03:36.413 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006.delta /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:36 compute-0 nova_compute[187208]: 2025-12-05 12:03:36.977 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006.delta /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006" returned: 0 in 0.564s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:36 compute-0 nova_compute[187208]: 2025-12-05 12:03:36.985 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Snapshot extracted, beginning image upload
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.102 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.131 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.132 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.132 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.154 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.155 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.156 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.156 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.156 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d70544d6-04e3-4b2a-914a-72db3052216a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.632 187212 DEBUG nova.compute.manager [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.632 187212 DEBUG oslo_concurrency.lockutils [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.633 187212 DEBUG oslo_concurrency.lockutils [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.633 187212 DEBUG oslo_concurrency.lockutils [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.633 187212 DEBUG nova.compute.manager [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] No waiting events found dispatching network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.633 187212 WARNING nova.compute.manager [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received unexpected event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 for instance with vm_state deleted and task_state None.
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.699 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.699 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.734 187212 DEBUG nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.734 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.735 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.735 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.735 187212 DEBUG nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Processing event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.736 187212 DEBUG nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.736 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.736 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.736 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.737 187212 DEBUG nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] No waiting events found dispatching network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.737 187212 WARNING nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received unexpected event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 for instance with vm_state building and task_state spawning.
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.738 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance event wait completed in 11 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.739 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.743 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936217.7426016, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.745 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Resumed (Lifecycle Event)
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.757 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.761 187212 INFO nova.virt.libvirt.driver [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance spawned successfully.
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.761 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.774 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.778 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.793 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.794 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.795 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.795 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.796 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.797 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.807 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.849 187212 INFO nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 18.11 seconds to spawn the instance on the hypervisor.
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.850 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.920 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.920 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.924 187212 INFO nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 18.96 seconds to build instance.
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.928 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.929 187212 INFO nova.compute.claims [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:03:37 compute-0 nova_compute[187208]: 2025-12-05 12:03:37.948 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.286 187212 DEBUG nova.compute.provider_tree [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.400 187212 DEBUG nova.scheduler.client.report [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.543 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.551 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.552 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.635 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.636 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.669 187212 INFO nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.688 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.788 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.789 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.790 187212 INFO nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Creating image(s)
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.791 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.791 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.792 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.810 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.892 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.893 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.894 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.907 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:38.911 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:38.912 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.954 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.965 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.966 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.986 187212 DEBUG nova.policy [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4671f6c82ea049fab3a314ecf45b7656', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:03:38 compute-0 nova_compute[187208]: 2025-12-05 12:03:38.999 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.000 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.000 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.052 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.053 187212 DEBUG nova.virt.disk.api [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Checking if we can resize image /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.053 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.110 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00276|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00277|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00278|binding|INFO|Releasing lport 41d0fbd3-22b2-4ee9-8c84-9f176e5ee865 from this chassis (sb_readonly=0)
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00279|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.112 187212 DEBUG nova.virt.disk.api [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Cannot resize image /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.114 187212 DEBUG nova.objects.instance [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'migration_context' on Instance uuid 30a55909-059f-4a0c-9598-14cc506d42a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.146 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.146 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Ensure instance console log exists: /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.147 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.147 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.147 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:39 compute-0 podman[220911]: 2025-12-05 12:03:39.199927553 +0000 UTC m=+0.052326936 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.271 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updating instance_info_cache with network_info: [{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.311 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.311 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.343 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.344 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.344 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.344 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00280|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00281|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00282|binding|INFO|Releasing lport 41d0fbd3-22b2-4ee9-8c84-9f176e5ee865 from this chassis (sb_readonly=0)
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00283|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.421 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.422 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.431 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.450 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.470 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.471 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.500 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.507 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.508 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.552 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.553 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.561 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.561 187212 INFO nova.compute.claims [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.573 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.579 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.650 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.651 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.723 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.730 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.798 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.800 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.829 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.873 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.874 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.874 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.875 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.875 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.877 187212 INFO nova.compute.manager [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Terminating instance
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.878 187212 DEBUG nova.compute.manager [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.879 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.886 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:39 compute-0 kernel: tapa35b6b13-07 (unregistering): left promiscuous mode
Dec 05 12:03:39 compute-0 NetworkManager[55691]: <info>  [1764936219.9002] device (tapa35b6b13-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00284|binding|INFO|Releasing lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 from this chassis (sb_readonly=0)
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00285|binding|INFO|Setting lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 down in Southbound
Dec 05 12:03:39 compute-0 ovn_controller[95610]: 2025-12-05T12:03:39Z|00286|binding|INFO|Removing iface tapa35b6b13-07 ovn-installed in OVS
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.921 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:a5:f2 10.100.0.10'], port_security=['fa:16:3e:91:a5:f2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '05008cd8-8cac-482b-9ff8-68f2f0aaa6d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5223579-477c-4fbe-a58c-2e56f428541c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70409a2f9710408cb377a61250853fbd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cfbac9bb-a6fa-4e30-b1e0-c07877ef21de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85bd0838-1b85-4e8e-bf67-d21df8aa9251, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a35b6b13-07bc-4c91-aaf5-231163a6ea44) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.923 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a35b6b13-07bc-4c91-aaf5-231163a6ea44 in datapath f5223579-477c-4fbe-a58c-2e56f428541c unbound from our chassis
Dec 05 12:03:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.925 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5223579-477c-4fbe-a58c-2e56f428541c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.926 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0d259fab-54dc-43c4-86ec-2a1e2ed40565]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.927 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c namespace which is not needed anymore
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:39 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000023.scope: Deactivated successfully.
Dec 05 12:03:39 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000023.scope: Consumed 7.231s CPU time.
Dec 05 12:03:39 compute-0 systemd-machined[153543]: Machine qemu-39-instance-00000023 terminated.
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.966 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:39 compute-0 nova_compute[187208]: 2025-12-05 12:03:39.966 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.005 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Successfully created port: 9dc35efb-0aed-463b-860e-3b60dd65b6db _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.042 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.105 187212 DEBUG nova.compute.provider_tree [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.111 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.131 187212 DEBUG nova.scheduler.client.report [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.148 187212 INFO nova.virt.libvirt.driver [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance destroyed successfully.
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.149 187212 DEBUG nova.objects.instance [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lazy-loading 'resources' on Instance uuid 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.192 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.193 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.197 187212 DEBUG nova.virt.libvirt.vif [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-378209780',display_name='tempest-InstanceActionsNegativeTestJSON-server-378209780',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-378209780',id=35,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70409a2f9710408cb377a61250853fbd',ramdisk_id='',reservation_id='r-1rqpjj9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1806311246',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1806311246-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:33Z,user_data=None,user_id='18569d5748e8448fbd1bcbf5d37ff5f6',uuid=05008cd8-8cac-482b-9ff8-68f2f0aaa6d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.197 187212 DEBUG nova.network.os_vif_util [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converting VIF {"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.198 187212 DEBUG nova.network.os_vif_util [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.198 187212 DEBUG os_vif [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.200 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.200 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa35b6b13-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.201 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.202 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.205 187212 INFO os_vif [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07')
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.206 187212 INFO nova.virt.libvirt.driver [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Deleting instance files /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4_del
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.206 187212 INFO nova.virt.libvirt.driver [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Deletion of /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4_del complete
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.211 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.212 187212 INFO nova.compute.claims [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.258 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Snapshot image upload complete
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.258 187212 INFO nova.compute.manager [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 5.53 seconds to snapshot the instance on the hypervisor.
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.307 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.308 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5040MB free_disk=73.18467330932617GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.308 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.328 187212 INFO nova.compute.manager [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Took 0.45 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.329 187212 DEBUG oslo.service.loopingcall [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.329 187212 DEBUG nova.compute.manager [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.329 187212 DEBUG nova.network.neutron [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.337 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.338 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.366 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:03:40 compute-0 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [NOTICE]   (220738) : haproxy version is 2.8.14-c23fe91
Dec 05 12:03:40 compute-0 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [NOTICE]   (220738) : path to executable is /usr/sbin/haproxy
Dec 05 12:03:40 compute-0 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [ALERT]    (220738) : Current worker (220740) exited with code 143 (Terminated)
Dec 05 12:03:40 compute-0 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [WARNING]  (220738) : All workers exited. Exiting... (0)
Dec 05 12:03:40 compute-0 systemd[1]: libpod-80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a.scope: Deactivated successfully.
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.400 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:03:40 compute-0 podman[220980]: 2025-12-05 12:03:40.405936301 +0000 UTC m=+0.382790245 container died 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:03:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a-userdata-shm.mount: Deactivated successfully.
Dec 05 12:03:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-78756f8a0c787d9c7273ba58bfae472f50372e3fa267a0cd0bfee40b23f73738-merged.mount: Deactivated successfully.
Dec 05 12:03:40 compute-0 podman[220980]: 2025-12-05 12:03:40.513349889 +0000 UTC m=+0.490203813 container cleanup 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:03:40 compute-0 systemd[1]: libpod-conmon-80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a.scope: Deactivated successfully.
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.574 187212 DEBUG nova.compute.provider_tree [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.577 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.578 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.579 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Creating image(s)
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.579 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.579 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.580 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.594 187212 DEBUG nova.scheduler.client.report [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.597 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.619 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.620 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.626 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.656 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.657 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.658 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:40 compute-0 podman[221030]: 2025-12-05 12:03:40.66780683 +0000 UTC m=+0.128194042 container remove 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.671 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.672 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7308ce7d-08be-4944-bff3-e47d1d8ad6f6]: (4, ('Fri Dec  5 12:03:40 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c (80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a)\n80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a\nFri Dec  5 12:03:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c (80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a)\n80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.674 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c644b93-f774-4a26-bfbb-6aea6d819342]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.675 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5223579-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:40 compute-0 kernel: tapf5223579-40: left promiscuous mode
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.714 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.717 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:03:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.717 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[448c27ca-3ea1-4dfe-8a88-78bf2efac067]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.718 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:03:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.734 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f66561ae-cfde-4319-8f5b-8826b0c0a6f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.736 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43ddc131-0654-416a-82a9-c165e82307b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.749 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.753 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.754 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.758 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ea0d991f-f7c5-41cc-8b92-cc391bf30283]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358778, 'reachable_time': 43445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221051, 'error': None, 'target': 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.762 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:03:40 compute-0 systemd[1]: run-netns-ovnmeta\x2df5223579\x2d477c\x2d4fbe\x2da58c\x2d2e56f428541c.mount: Deactivated successfully.
Dec 05 12:03:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.762 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7f321cef-74d3-4c0e-b552-c84dcdcc94e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.782 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.787 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance d70544d6-04e3-4b2a-914a-72db3052216a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.788 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance d2085dd9-2ebd-4804-99c1-3b15cbd216f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.788 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.788 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance c1e2f189-1777-4f28-97ab-72cf0f60fbc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.788 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 30a55909-059f-4a0c-9598-14cc506d42a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.789 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.789 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 456f1972-6ed7-4fc2-b046-fa035704d434 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.789 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.790 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1408MB phys_disk=79GB used_disk=7GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.815 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk 1073741824" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.817 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.817 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.875 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.877 187212 DEBUG nova.virt.disk.api [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Checking if we can resize image /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.877 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.901 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.903 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.903 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Creating image(s)
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.904 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.904 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.905 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.920 187212 DEBUG nova.policy [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40620135b1ff4f8d9d80eb79f51fd593', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bebbbd9623064681bb9350747fba600e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.925 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.953 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.954 187212 DEBUG nova.virt.disk.api [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Cannot resize image /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.954 187212 DEBUG nova.objects.instance [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'migration_context' on Instance uuid 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.970 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.970 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Ensure instance console log exists: /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.971 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.971 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.971 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.978 187212 DEBUG nova.policy [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40620135b1ff4f8d9d80eb79f51fd593', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bebbbd9623064681bb9350747fba600e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.989 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.990 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:40 compute-0 nova_compute[187208]: 2025-12-05 12:03:40.990 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.002 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.062 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.063 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.093 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.119 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.163 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.164 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.165 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.187 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk 1073741824" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.188 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.189 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.253 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.254 187212 DEBUG nova.virt.disk.api [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Checking if we can resize image /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.254 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.271 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Successfully updated port: 9dc35efb-0aed-463b-860e-3b60dd65b6db _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.286 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.287 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.287 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.315 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.316 187212 DEBUG nova.virt.disk.api [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Cannot resize image /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.316 187212 DEBUG nova.objects.instance [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'migration_context' on Instance uuid 456f1972-6ed7-4fc2-b046-fa035704d434 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.335 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.335 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Ensure instance console log exists: /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.336 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.336 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.337 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:41.915 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.921 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.923 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.951 187212 WARNING nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] While synchronizing instance power states, found 7 instances in the database and 3 instances on the hypervisor.
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.951 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid d70544d6-04e3-4b2a-914a-72db3052216a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.952 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid d2085dd9-2ebd-4804-99c1-3b15cbd216f8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.952 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.952 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.953 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 30a55909-059f-4a0c-9598-14cc506d42a2 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.953 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.953 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 456f1972-6ed7-4fc2-b046-fa035704d434 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.953 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.954 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.954 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.954 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.955 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.956 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.956 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.957 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.957 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.957 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.958 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.958 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:03:41 compute-0 nova_compute[187208]: 2025-12-05 12:03:41.959 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.028 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.029 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.029 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.098 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.203 187212 DEBUG nova.compute.manager [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-unplugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.203 187212 DEBUG oslo_concurrency.lockutils [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.204 187212 DEBUG oslo_concurrency.lockutils [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.204 187212 DEBUG oslo_concurrency.lockutils [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.204 187212 DEBUG nova.compute.manager [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] No waiting events found dispatching network-vif-unplugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.205 187212 DEBUG nova.compute.manager [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-unplugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.352 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Successfully created port: 4f7ea95e-e59f-4941-83b6-5c482617a975 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.556 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.954 187212 DEBUG nova.network.neutron [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:42 compute-0 nova_compute[187208]: 2025-12-05 12:03:42.981 187212 INFO nova.compute.manager [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Took 2.65 seconds to deallocate network for instance.
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.040 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.041 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.092 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Successfully created port: 909107ba-c90a-4004-a47f-e5367cab8f82 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.173 187212 DEBUG nova.objects.instance [None req-b0b56b73-2f59-4f5c-8732-2fcb3b32ac53 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.193 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936223.1932034, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.193 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Paused (Lifecycle Event)
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.216 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.220 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.246 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] During sync_power_state the instance has a pending task (suspending). Skip.
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.259 187212 DEBUG nova.compute.provider_tree [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.275 187212 DEBUG nova.scheduler.client.report [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.308 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.349 187212 INFO nova.scheduler.client.report [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Deleted allocations for instance 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.633 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.634 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.635 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] During sync_power_state the instance has a pending task (deleting). Skip.
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.635 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:43 compute-0 kernel: tapecec1a41-6f (unregistering): left promiscuous mode
Dec 05 12:03:43 compute-0 NetworkManager[55691]: <info>  [1764936223.8128] device (tapecec1a41-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.827 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:43 compute-0 ovn_controller[95610]: 2025-12-05T12:03:43Z|00287|binding|INFO|Releasing lport ecec1a41-6f3e-4852-8cdb-9d461eded987 from this chassis (sb_readonly=0)
Dec 05 12:03:43 compute-0 ovn_controller[95610]: 2025-12-05T12:03:43Z|00288|binding|INFO|Setting lport ecec1a41-6f3e-4852-8cdb-9d461eded987 down in Southbound
Dec 05 12:03:43 compute-0 ovn_controller[95610]: 2025-12-05T12:03:43Z|00289|binding|INFO|Removing iface tapecec1a41-6f ovn-installed in OVS
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:43 compute-0 ovn_controller[95610]: 2025-12-05T12:03:43Z|00290|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:03:43 compute-0 ovn_controller[95610]: 2025-12-05T12:03:43Z|00291|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec 05 12:03:43 compute-0 ovn_controller[95610]: 2025-12-05T12:03:43Z|00292|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec 05 12:03:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.836 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:88:7f 10.100.0.5'], port_security=['fa:16:3e:57:88:7f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c1e2f189-1777-4f28-97ab-72cf0f60fbc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ecec1a41-6f3e-4852-8cdb-9d461eded987) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.839 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ecec1a41-6f3e-4852-8cdb-9d461eded987 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis
Dec 05 12:03:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.844 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.845 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4012423f-77e3-4df9-9d3e-78146d2512bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.846 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.846 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:43 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000024.scope: Deactivated successfully.
Dec 05 12:03:43 compute-0 nova_compute[187208]: 2025-12-05 12:03:43.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:43 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000024.scope: Consumed 5.842s CPU time.
Dec 05 12:03:43 compute-0 systemd-machined[153543]: Machine qemu-40-instance-00000024 terminated.
Dec 05 12:03:43 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [NOTICE]   (220617) : haproxy version is 2.8.14-c23fe91
Dec 05 12:03:43 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [NOTICE]   (220617) : path to executable is /usr/sbin/haproxy
Dec 05 12:03:43 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [WARNING]  (220617) : Exiting Master process...
Dec 05 12:03:43 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [ALERT]    (220617) : Current worker (220619) exited with code 143 (Terminated)
Dec 05 12:03:43 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [WARNING]  (220617) : All workers exited. Exiting... (0)
Dec 05 12:03:43 compute-0 systemd[1]: libpod-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7.scope: Deactivated successfully.
Dec 05 12:03:43 compute-0 conmon[220613]: conmon f442b1005a00d2fb0330 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7.scope/container/memory.events
Dec 05 12:03:43 compute-0 podman[221102]: 2025-12-05 12:03:43.997321343 +0000 UTC m=+0.054859928 container died f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:03:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7-userdata-shm.mount: Deactivated successfully.
Dec 05 12:03:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-d1e943a749671a1b77c4cdf6fa31b99e20bf9ab4dc014c673f02b47bd056cb36-merged.mount: Deactivated successfully.
Dec 05 12:03:44 compute-0 podman[221102]: 2025-12-05 12:03:44.03994368 +0000 UTC m=+0.097482265 container cleanup f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 05 12:03:44 compute-0 systemd[1]: libpod-conmon-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7.scope: Deactivated successfully.
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.050 187212 DEBUG nova.compute.manager [None req-b0b56b73-2f59-4f5c-8732-2fcb3b32ac53 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.093 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Updating instance_info_cache with network_info: [{"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:44 compute-0 podman[221145]: 2025-12-05 12:03:44.109078005 +0000 UTC m=+0.045977014 container remove f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:03:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.116 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d23ea96e-8102-409b-b839-6aef03ee9807]: (4, ('Fri Dec  5 12:03:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7)\nf442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7\nFri Dec  5 12:03:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7)\nf442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.118 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[20dac5c5-dc4a-4096-a0ac-f40387f31190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.120 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.151 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:44 compute-0 kernel: tap41b3b495-c0: left promiscuous mode
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.154 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.154 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance network_info: |[{"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.157 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start _get_guest_xml network_info=[{"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.161 187212 WARNING nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.166 187212 DEBUG nova.virt.libvirt.host [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.166 187212 DEBUG nova.virt.libvirt.host [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.168 187212 DEBUG nova.virt.libvirt.host [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.169 187212 DEBUG nova.virt.libvirt.host [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.169 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.169 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.170 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.170 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.170 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.170 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:03:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[827e9e7e-e0b3-4fe9-b310-c01a1fee7d30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.175 187212 DEBUG nova.virt.libvirt.vif [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1329126976',display_name='tempest-DeleteServersTestJSON-server-1329126976',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1329126976',id=37,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-4bgg3k4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-5
54028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:38Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=30a55909-059f-4a0c-9598-14cc506d42a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.177 187212 DEBUG nova.network.os_vif_util [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.178 187212 DEBUG nova.network.os_vif_util [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.179 187212 DEBUG nova.objects.instance [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid 30a55909-059f-4a0c-9598-14cc506d42a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.192 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eb888422-31ab-4f79-8b34-f3d2ddd89f34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.194 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[318cb6ed-80c2-493c-a3e6-2d6b7c2b16af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.207 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d5e2322b-4945-47f0-baa1-aecc03892359]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358648, 'reachable_time': 16458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221164, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.209 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <uuid>30a55909-059f-4a0c-9598-14cc506d42a2</uuid>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <name>instance-00000025</name>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <nova:name>tempest-DeleteServersTestJSON-server-1329126976</nova:name>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:03:44</nova:creationTime>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:03:44 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:03:44 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:03:44 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:03:44 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:03:44 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:03:44 compute-0 nova_compute[187208]:         <nova:user uuid="ff425b7b04144f93a2c15e3a347fc15c">tempest-DeleteServersTestJSON-554028480-project-member</nova:user>
Dec 05 12:03:44 compute-0 nova_compute[187208]:         <nova:project uuid="4671f6c82ea049fab3a314ecf45b7656">tempest-DeleteServersTestJSON-554028480</nova:project>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:03:44 compute-0 nova_compute[187208]:         <nova:port uuid="9dc35efb-0aed-463b-860e-3b60dd65b6db">
Dec 05 12:03:44 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <system>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <entry name="serial">30a55909-059f-4a0c-9598-14cc506d42a2</entry>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <entry name="uuid">30a55909-059f-4a0c-9598-14cc506d42a2</entry>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     </system>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <os>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   </os>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <features>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   </features>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.config"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:4b:04:08"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <target dev="tap9dc35efb-0a"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/console.log" append="off"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <video>
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     </video>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:03:44 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:03:44 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:03:44 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:03:44 compute-0 nova_compute[187208]: </domain>
Dec 05 12:03:44 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.210 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Preparing to wait for external event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:03:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.210 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.211 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.211 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.211 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:03:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.211 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0422cdce-9c89-44fa-a2ce-e4e5321b75a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.212 187212 DEBUG nova.virt.libvirt.vif [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1329126976',display_name='tempest-DeleteServersTestJSON-server-1329126976',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1329126976',id=37,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-4bgg3k4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServers
TestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:38Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=30a55909-059f-4a0c-9598-14cc506d42a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.212 187212 DEBUG nova.network.os_vif_util [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.213 187212 DEBUG nova.network.os_vif_util [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.213 187212 DEBUG os_vif [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.214 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.214 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.215 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.217 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.218 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9dc35efb-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.218 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9dc35efb-0a, col_values=(('external_ids', {'iface-id': '9dc35efb-0aed-463b-860e-3b60dd65b6db', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4b:04:08', 'vm-uuid': '30a55909-059f-4a0c-9598-14cc506d42a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.219 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:44 compute-0 NetworkManager[55691]: <info>  [1764936224.2205] manager: (tap9dc35efb-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.221 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.227 187212 INFO os_vif [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a')
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.285 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Successfully updated port: 4f7ea95e-e59f-4941-83b6-5c482617a975 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.293 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.294 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.294 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No VIF found with MAC fa:16:3e:4b:04:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.295 187212 INFO nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Using config drive
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.321 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.322 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquired lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.322 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.890 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.894 187212 INFO nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Creating config drive at /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.config
Dec 05 12:03:44 compute-0 nova_compute[187208]: 2025-12-05 12:03:44.898 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkhpjpk3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.027 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkhpjpk3" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:45 compute-0 kernel: tap9dc35efb-0a: entered promiscuous mode
Dec 05 12:03:45 compute-0 NetworkManager[55691]: <info>  [1764936225.0836] manager: (tap9dc35efb-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Dec 05 12:03:45 compute-0 systemd-udevd[221081]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:45 compute-0 ovn_controller[95610]: 2025-12-05T12:03:45Z|00293|binding|INFO|Claiming lport 9dc35efb-0aed-463b-860e-3b60dd65b6db for this chassis.
Dec 05 12:03:45 compute-0 ovn_controller[95610]: 2025-12-05T12:03:45Z|00294|binding|INFO|9dc35efb-0aed-463b-860e-3b60dd65b6db: Claiming fa:16:3e:4b:04:08 10.100.0.4
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:45 compute-0 NetworkManager[55691]: <info>  [1764936225.0957] device (tap9dc35efb-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:03:45 compute-0 NetworkManager[55691]: <info>  [1764936225.0964] device (tap9dc35efb-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.098 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:04:08 10.100.0.4'], port_security=['fa:16:3e:4b:04:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30a55909-059f-4a0c-9598-14cc506d42a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9dc35efb-0aed-463b-860e-3b60dd65b6db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.099 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9dc35efb-0aed-463b-860e-3b60dd65b6db in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 bound to our chassis
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.101 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.111 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[980db7ab-6b24-4be6-920e-a14d2d29a6e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.112 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7360f84-b1 in ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.114 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7360f84-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.115 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[64a1fe69-462b-4864-8719-523a5eec5567]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.115 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a25dfc-07dc-4139-8785-ce3352ed2f45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 systemd-machined[153543]: New machine qemu-41-instance-00000025.
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.125 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[31232b55-7860-4c87-b173-cfb48a7d7174]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.142 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2cdbd0-0e70-482b-beef-c530bfb84535]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 systemd[1]: Started Virtual Machine qemu-41-instance-00000025.
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.148 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:45 compute-0 ovn_controller[95610]: 2025-12-05T12:03:45Z|00295|binding|INFO|Setting lport 9dc35efb-0aed-463b-860e-3b60dd65b6db ovn-installed in OVS
Dec 05 12:03:45 compute-0 ovn_controller[95610]: 2025-12-05T12:03:45Z|00296|binding|INFO|Setting lport 9dc35efb-0aed-463b-860e-3b60dd65b6db up in Southbound
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.153 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.175 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[650d20d9-917a-4bdd-8206-65b295d802e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 NetworkManager[55691]: <info>  [1764936225.1814] manager: (tapd7360f84-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/125)
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.181 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0b158d46-b534-4650-a3d2-640ea9e9d210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.205 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[11e6e5e2-c431-48f6-bc33-fabff5ba4ca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.208 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1409eb-5815-4fbc-a398-92315902ce8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 NetworkManager[55691]: <info>  [1764936225.2256] device (tapd7360f84-b0): carrier: link connected
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.226 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Successfully updated port: 909107ba-c90a-4004-a47f-e5367cab8f82 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.231 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc22804-20a9-4b9e-bb0b-1ac01660406f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.249 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.249 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquired lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.249 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.249 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe5de8d-e5bd-4d94-89a4-ed89dbd62eb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360578, 'reachable_time': 17423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221222, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.267 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[664b1c2c-6f5f-466c-b0d6-69c1701bf117]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:2b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360578, 'tstamp': 360578}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221223, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.286 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5aacd4-dc9c-472d-bc50-4103b6e88ce2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360578, 'reachable_time': 17423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221224, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.311 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7187502a-e780-4b59-92fa-0491ee13ab89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.329 187212 DEBUG nova.compute.manager [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-unplugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 DEBUG oslo_concurrency.lockutils [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 DEBUG oslo_concurrency.lockutils [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 DEBUG oslo_concurrency.lockutils [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 DEBUG nova.compute.manager [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] No waiting events found dispatching network-vif-unplugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 WARNING nova.compute.manager [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received unexpected event network-vif-unplugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 for instance with vm_state suspended and task_state None.
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.376 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d401b818-c885-4508-b2aa-1f14e48a186e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.377 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.377 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.378 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7360f84-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.379 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:45 compute-0 NetworkManager[55691]: <info>  [1764936225.3803] manager: (tapd7360f84-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Dec 05 12:03:45 compute-0 kernel: tapd7360f84-b0: entered promiscuous mode
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.384 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.385 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7360f84-b0, col_values=(('external_ids', {'iface-id': 'd85bc323-c3ce-47e3-ac1f-d5f27467a4e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:45 compute-0 ovn_controller[95610]: 2025-12-05T12:03:45Z|00297|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.386 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.403 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.405 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.407 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.408 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb5c847-6237-4fa0-a483-1762fce664b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.409 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:03:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.409 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'env', 'PROCESS_TAG=haproxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.437 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936225.4372697, 30a55909-059f-4a0c-9598-14cc506d42a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.438 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Started (Lifecycle Event)
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.458 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.462 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936225.4398158, 30a55909-059f-4a0c-9598-14cc506d42a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.462 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Paused (Lifecycle Event)
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.480 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.483 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.502 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.591 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.592 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.592 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.592 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.592 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] No waiting events found dispatching network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 WARNING nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received unexpected event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 for instance with vm_state deleted and task_state None.
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-changed-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Refreshing instance network info cache due to event network-changed-9dc35efb-0aed-463b-860e-3b60dd65b6db. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG nova.network.neutron [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Refreshing network info cache for port 9dc35efb-0aed-463b-860e-3b60dd65b6db _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:03:45 compute-0 podman[221268]: 2025-12-05 12:03:45.773610769 +0000 UTC m=+0.052841131 container create a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:03:45 compute-0 systemd[1]: Started libpod-conmon-a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b.scope.
Dec 05 12:03:45 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:03:45 compute-0 podman[221268]: 2025-12-05 12:03:45.743286913 +0000 UTC m=+0.022517315 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:03:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aba16f9d308e86ed666c24a9528a5e9f58db6e3fb9b48c984c73f0766f63478d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:03:45 compute-0 podman[221268]: 2025-12-05 12:03:45.857138995 +0000 UTC m=+0.136369387 container init a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 05 12:03:45 compute-0 podman[221268]: 2025-12-05 12:03:45.862126837 +0000 UTC m=+0.141357199 container start a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 12:03:45 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [NOTICE]   (221287) : New worker (221289) forked
Dec 05 12:03:45 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [NOTICE]   (221287) : Loading success.
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.935 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.943 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Updating instance_info_cache with network_info: [{"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.975 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Releasing lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.975 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance network_info: |[{"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.977 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start _get_guest_xml network_info=[{"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.983 187212 WARNING nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.987 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.988 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.991 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.992 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.992 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.992 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.993 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.993 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.994 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.994 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.994 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.995 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.995 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.995 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.996 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:03:45 compute-0 nova_compute[187208]: 2025-12-05 12:03:45.996 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.001 187212 DEBUG nova.virt.libvirt.vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-2',id=39,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-Multiple
CreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=456f1972-6ed7-4fc2-b046-fa035704d434,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.002 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.002 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.004 187212 DEBUG nova.objects.instance [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'pci_devices' on Instance uuid 456f1972-6ed7-4fc2-b046-fa035704d434 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <uuid>456f1972-6ed7-4fc2-b046-fa035704d434</uuid>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <name>instance-00000027</name>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <nova:name>tempest-tempest.common.compute-instance-2007104146-2</nova:name>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:03:45</nova:creationTime>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:03:46 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:03:46 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:03:46 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:03:46 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:03:46 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:03:46 compute-0 nova_compute[187208]:         <nova:user uuid="40620135b1ff4f8d9d80eb79f51fd593">tempest-MultipleCreateTestJSON-1941206426-project-member</nova:user>
Dec 05 12:03:46 compute-0 nova_compute[187208]:         <nova:project uuid="bebbbd9623064681bb9350747fba600e">tempest-MultipleCreateTestJSON-1941206426</nova:project>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:03:46 compute-0 nova_compute[187208]:         <nova:port uuid="4f7ea95e-e59f-4941-83b6-5c482617a975">
Dec 05 12:03:46 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <system>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <entry name="serial">456f1972-6ed7-4fc2-b046-fa035704d434</entry>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <entry name="uuid">456f1972-6ed7-4fc2-b046-fa035704d434</entry>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     </system>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <os>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   </os>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <features>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   </features>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.config"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:4a:7b:36"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <target dev="tap4f7ea95e-e5"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/console.log" append="off"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <video>
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     </video>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:03:46 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:03:46 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:03:46 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:03:46 compute-0 nova_compute[187208]: </domain>
Dec 05 12:03:46 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Preparing to wait for external event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.021 187212 DEBUG nova.virt.libvirt.vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-2',id=39,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempes
t-MultipleCreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=456f1972-6ed7-4fc2-b046-fa035704d434,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.021 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.022 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.022 187212 DEBUG os_vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.023 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.023 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.026 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f7ea95e-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.027 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f7ea95e-e5, col_values=(('external_ids', {'iface-id': '4f7ea95e-e59f-4941-83b6-5c482617a975', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:7b:36', 'vm-uuid': '456f1972-6ed7-4fc2-b046-fa035704d434'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:46 compute-0 NetworkManager[55691]: <info>  [1764936226.0291] manager: (tap4f7ea95e-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.031 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.038 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.040 187212 INFO os_vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5')
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.096 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.097 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.097 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No VIF found with MAC fa:16:3e:4a:7b:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.097 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Using config drive
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.336 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936211.3357682, 478fa005-452c-4e37-a919-63bb734a3c5c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.337 187212 INFO nova.compute.manager [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] VM Stopped (Lifecycle Event)
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.370 187212 DEBUG nova.compute.manager [None req-10a4bc21-c7db-4067-b037-182ce6d7175d - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.604 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Creating config drive at /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.config
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.610 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcy3f46ov execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.738 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcy3f46ov" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:46 compute-0 kernel: tap4f7ea95e-e5: entered promiscuous mode
Dec 05 12:03:46 compute-0 NetworkManager[55691]: <info>  [1764936226.8018] manager: (tap4f7ea95e-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Dec 05 12:03:46 compute-0 systemd-udevd[221211]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:46 compute-0 ovn_controller[95610]: 2025-12-05T12:03:46Z|00298|binding|INFO|Claiming lport 4f7ea95e-e59f-4941-83b6-5c482617a975 for this chassis.
Dec 05 12:03:46 compute-0 ovn_controller[95610]: 2025-12-05T12:03:46Z|00299|binding|INFO|4f7ea95e-e59f-4941-83b6-5c482617a975: Claiming fa:16:3e:4a:7b:36 10.100.0.3
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:46 compute-0 NetworkManager[55691]: <info>  [1764936226.8196] device (tap4f7ea95e-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:03:46 compute-0 NetworkManager[55691]: <info>  [1764936226.8203] device (tap4f7ea95e-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.820 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:7b:36 10.100.0.3'], port_security=['fa:16:3e:4a:7b:36 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '456f1972-6ed7-4fc2-b046-fa035704d434', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4f7ea95e-e59f-4941-83b6-5c482617a975) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.822 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4f7ea95e-e59f-4941-83b6-5c482617a975 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 bound to our chassis
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.824 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.836 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ee69182f-39b2-4ec1-bcee-7071250cd57e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.837 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8ea1ed6-91 in ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.840 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8ea1ed6-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.841 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c190bdd2-1d4c-4804-bcb4-2a16ad31090a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.842 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[078c941d-d4fb-4d2b-b91d-ab0497c0f4f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:46 compute-0 systemd-machined[153543]: New machine qemu-42-instance-00000027.
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.852 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[2566efdc-a622-45d6-b447-7caaabb86445]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.869 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8cf0e3-c0a3-4376-a026-3cbb5f1338b4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:46 compute-0 systemd[1]: Started Virtual Machine qemu-42-instance-00000027.
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.930 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:46 compute-0 ovn_controller[95610]: 2025-12-05T12:03:46Z|00300|binding|INFO|Setting lport 4f7ea95e-e59f-4941-83b6-5c482617a975 ovn-installed in OVS
Dec 05 12:03:46 compute-0 ovn_controller[95610]: 2025-12-05T12:03:46Z|00301|binding|INFO|Setting lport 4f7ea95e-e59f-4941-83b6-5c482617a975 up in Southbound
Dec 05 12:03:46 compute-0 nova_compute[187208]: 2025-12-05 12:03:46.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.955 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[72a734b6-c880-4669-abcf-1b3f30e9ec12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:46 compute-0 systemd-udevd[221326]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:46 compute-0 NetworkManager[55691]: <info>  [1764936226.9614] manager: (tapb8ea1ed6-90): new Veth device (/org/freedesktop/NetworkManager/Devices/129)
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.960 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[51fb1acb-3369-4d1b-b673-9c98767b19c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.987 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[77604098-e122-4b75-b392-c046d0f8acd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.991 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[318203fb-1dca-4044-a5a8-d825a4a4ec90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:47 compute-0 NetworkManager[55691]: <info>  [1764936227.0109] device (tapb8ea1ed6-90): carrier: link connected
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.014 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[063906d6-b394-497c-842d-fe08ffa82372]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.029 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a89671aa-5a2c-43d6-8f91-79e436706594]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360757, 'reachable_time': 17404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221357, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.043 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5887dba9-d83f-48ff-a406-0b752e38bd14]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:fb51'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360757, 'tstamp': 360757}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221358, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.057 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e1db39cd-bc38-4baa-b5ba-7623132b0ec7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360757, 'reachable_time': 17404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221359, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.085 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dfec196d-e8f6-48bd-9555-ba49d1a0605f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.153 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4000348f-2c86-47f6-91de-89a57f5ef3f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.154 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.154 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.155 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ea1ed6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:47 compute-0 kernel: tapb8ea1ed6-90: entered promiscuous mode
Dec 05 12:03:47 compute-0 NetworkManager[55691]: <info>  [1764936227.1577] manager: (tapb8ea1ed6-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.156 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.165 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ea1ed6-90, col_values=(('external_ids', {'iface-id': '6f012c31-72e4-4df5-be68-787aa910fb9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:47 compute-0 ovn_controller[95610]: 2025-12-05T12:03:47Z|00302|binding|INFO|Releasing lport 6f012c31-72e4-4df5-be68-787aa910fb9c from this chassis (sb_readonly=0)
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.184 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.185 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.186 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7f7d8d-b341-455a-9437-61ce68a1486e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.187 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.pid.haproxy
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:03:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.188 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'env', 'PROCESS_TAG=haproxy-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:03:47 compute-0 podman[221365]: 2025-12-05 12:03:47.220518627 +0000 UTC m=+0.066127330 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.351 187212 DEBUG nova.network.neutron [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Updated VIF entry in instance network info cache for port 9dc35efb-0aed-463b-860e-3b60dd65b6db. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.352 187212 DEBUG nova.network.neutron [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Updating instance_info_cache with network_info: [{"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.382 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.383 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-deleted-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.400 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Updating instance_info_cache with network_info: [{"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.419 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Releasing lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.419 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance network_info: |[{"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.423 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start _get_guest_xml network_info=[{"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.425 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936227.4244573, 456f1972-6ed7-4fc2-b046-fa035704d434 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.425 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] VM Started (Lifecycle Event)
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.431 187212 WARNING nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.435 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.436 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.441 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.441 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.442 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.442 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.442 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.443 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.443 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.443 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.443 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.444 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.444 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.444 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.444 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.445 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.448 187212 DEBUG nova.virt.libvirt.vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-1',id=38,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=7df02f69-ecc9-424d-82ab-dc8ba279ffd5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.449 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.449 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.450 187212 DEBUG nova.objects.instance [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'pci_devices' on Instance uuid 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.452 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.456 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936227.4254956, 456f1972-6ed7-4fc2-b046-fa035704d434 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.457 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] VM Paused (Lifecycle Event)
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.497 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.499 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <uuid>7df02f69-ecc9-424d-82ab-dc8ba279ffd5</uuid>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <name>instance-00000026</name>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <nova:name>tempest-tempest.common.compute-instance-2007104146-1</nova:name>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:03:47</nova:creationTime>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:03:47 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:03:47 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:03:47 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:03:47 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:03:47 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:03:47 compute-0 nova_compute[187208]:         <nova:user uuid="40620135b1ff4f8d9d80eb79f51fd593">tempest-MultipleCreateTestJSON-1941206426-project-member</nova:user>
Dec 05 12:03:47 compute-0 nova_compute[187208]:         <nova:project uuid="bebbbd9623064681bb9350747fba600e">tempest-MultipleCreateTestJSON-1941206426</nova:project>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:03:47 compute-0 nova_compute[187208]:         <nova:port uuid="909107ba-c90a-4004-a47f-e5367cab8f82">
Dec 05 12:03:47 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <system>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <entry name="serial">7df02f69-ecc9-424d-82ab-dc8ba279ffd5</entry>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <entry name="uuid">7df02f69-ecc9-424d-82ab-dc8ba279ffd5</entry>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     </system>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <os>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   </os>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <features>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   </features>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.config"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:16:8e:a6"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <target dev="tap909107ba-c9"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/console.log" append="off"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <video>
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     </video>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:03:47 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:03:47 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:03:47 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:03:47 compute-0 nova_compute[187208]: </domain>
Dec 05 12:03:47 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.500 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Preparing to wait for external event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.501 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.501 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.501 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.502 187212 DEBUG nova.virt.libvirt.vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-1',id=38,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=7df02f69-ecc9-424d-82ab-dc8ba279ffd5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.502 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.502 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.503 187212 DEBUG os_vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.503 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.503 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.504 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.507 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.507 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap909107ba-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.508 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap909107ba-c9, col_values=(('external_ids', {'iface-id': '909107ba-c90a-4004-a47f-e5367cab8f82', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:16:8e:a6', 'vm-uuid': '7df02f69-ecc9-424d-82ab-dc8ba279ffd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:47 compute-0 NetworkManager[55691]: <info>  [1764936227.5104] manager: (tap909107ba-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.510 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.520 187212 INFO os_vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9')
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.533 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.556 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.575 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.575 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.575 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No VIF found with MAC fa:16:3e:16:8e:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:03:47 compute-0 nova_compute[187208]: 2025-12-05 12:03:47.575 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Using config drive
Dec 05 12:03:47 compute-0 podman[221423]: 2025-12-05 12:03:47.615432807 +0000 UTC m=+0.052985524 container create 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:03:47 compute-0 systemd[1]: Started libpod-conmon-9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1.scope.
Dec 05 12:03:47 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:03:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cea3cddfc7f419ead74ac9c1e8d910318a876e210e0687b74d3349159defed7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:03:47 compute-0 podman[221423]: 2025-12-05 12:03:47.589722213 +0000 UTC m=+0.027274910 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:03:47 compute-0 podman[221423]: 2025-12-05 12:03:47.694831635 +0000 UTC m=+0.132384322 container init 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 05 12:03:47 compute-0 podman[221423]: 2025-12-05 12:03:47.70023716 +0000 UTC m=+0.137789847 container start 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:03:47 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [NOTICE]   (221443) : New worker (221445) forked
Dec 05 12:03:47 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [NOTICE]   (221443) : Loading success.
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.186 187212 DEBUG nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-changed-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.186 187212 DEBUG nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Refreshing instance network info cache due to event network-changed-4f7ea95e-e59f-4941-83b6-5c482617a975. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.187 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.187 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.187 187212 DEBUG nova.network.neutron [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Refreshing network info cache for port 4f7ea95e-e59f-4941-83b6-5c482617a975 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.397 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Creating config drive at /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.config
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.402 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeyj7g2gt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.533 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeyj7g2gt" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:48 compute-0 kernel: tap909107ba-c9: entered promiscuous mode
Dec 05 12:03:48 compute-0 NetworkManager[55691]: <info>  [1764936228.6054] manager: (tap909107ba-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Dec 05 12:03:48 compute-0 systemd-udevd[221340]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:48 compute-0 ovn_controller[95610]: 2025-12-05T12:03:48Z|00303|binding|INFO|Claiming lport 909107ba-c90a-4004-a47f-e5367cab8f82 for this chassis.
Dec 05 12:03:48 compute-0 ovn_controller[95610]: 2025-12-05T12:03:48Z|00304|binding|INFO|909107ba-c90a-4004-a47f-e5367cab8f82: Claiming fa:16:3e:16:8e:a6 10.100.0.12
Dec 05 12:03:48 compute-0 NetworkManager[55691]: <info>  [1764936228.6251] device (tap909107ba-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:03:48 compute-0 NetworkManager[55691]: <info>  [1764936228.6260] device (tap909107ba-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:03:48 compute-0 ovn_controller[95610]: 2025-12-05T12:03:48Z|00305|binding|INFO|Setting lport 909107ba-c90a-4004-a47f-e5367cab8f82 ovn-installed in OVS
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.635 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:48 compute-0 nova_compute[187208]: 2025-12-05 12:03:48.640 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:48 compute-0 systemd-machined[153543]: New machine qemu-43-instance-00000026.
Dec 05 12:03:48 compute-0 systemd[1]: Started Virtual Machine qemu-43-instance-00000026.
Dec 05 12:03:48 compute-0 ovn_controller[95610]: 2025-12-05T12:03:48Z|00306|binding|INFO|Setting lport 909107ba-c90a-4004-a47f-e5367cab8f82 up in Southbound
Dec 05 12:03:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:48.990 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:8e:a6 10.100.0.12'], port_security=['fa:16:3e:16:8e:a6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7df02f69-ecc9-424d-82ab-dc8ba279ffd5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=909107ba-c90a-4004-a47f-e5367cab8f82) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:48.992 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 909107ba-c90a-4004-a47f-e5367cab8f82 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 bound to our chassis
Dec 05 12:03:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:48.994 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.012 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af5d40c8-bb7c-45fb-b27c-6515486fc366]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.050 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9a243c6b-73aa-4d3e-a6bb-cdd9bf3a31d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.054 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f2943ac3-f567-4a84-b1d8-951bad60a961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.091 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8c6570-3d4a-4e92-9ccb-6ff31beb7fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.109 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1cf392-e299-4f0f-9d91-28effeb36615]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360757, 'reachable_time': 17404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221487, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.133 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[96f0f335-0172-422e-b7aa-baf1c85cd8a2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360767, 'tstamp': 360767}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221488, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360771, 'tstamp': 360771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221488, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.135 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.137 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.148 187212 DEBUG nova.compute.manager [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.149 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.149 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ea1ed6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.150 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.151 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ea1ed6-90, col_values=(('external_ids', {'iface-id': '6f012c31-72e4-4df5-be68-787aa910fb9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.151 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.173 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-changed-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.174 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Refreshing instance network info cache due to event network-changed-909107ba-c90a-4004-a47f-e5367cab8f82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.174 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.174 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.175 187212 DEBUG nova.network.neutron [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Refreshing network info cache for port 909107ba-c90a-4004-a47f-e5367cab8f82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.262 187212 INFO nova.compute.manager [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] instance snapshotting
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.263 187212 WARNING nova.compute.manager [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] trying to snapshot a non-running instance: (state: 4 expected: 1)
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.607 187212 INFO nova.virt.libvirt.driver [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Beginning cold snapshot process
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.854 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936229.8540106, 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:49 compute-0 nova_compute[187208]: 2025-12-05 12:03:49.855 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] VM Started (Lifecycle Event)
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.157 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.162 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936229.8566923, 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.162 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] VM Paused (Lifecycle Event)
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.193 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.196 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.199 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.199 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.200 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.200 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.200 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.201 187212 INFO nova.compute.manager [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Terminating instance
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.202 187212 DEBUG nova.compute.manager [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.226 187212 DEBUG nova.privsep.utils [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:03:50 compute-0 kernel: tap99a1ab7f-bf (unregistering): left promiscuous mode
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.226 187212 DEBUG oslo_concurrency.processutils [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk /var/lib/nova/instances/snapshots/tmp_tvfw6sw/a90bdd0383f44c6e9298f49095fb64de execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:50 compute-0 NetworkManager[55691]: <info>  [1764936230.2357] device (tap99a1ab7f-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:03:50 compute-0 ovn_controller[95610]: 2025-12-05T12:03:50Z|00307|binding|INFO|Releasing lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc from this chassis (sb_readonly=0)
Dec 05 12:03:50 compute-0 ovn_controller[95610]: 2025-12-05T12:03:50Z|00308|binding|INFO|Setting lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc down in Southbound
Dec 05 12:03:50 compute-0 ovn_controller[95610]: 2025-12-05T12:03:50Z|00309|binding|INFO|Removing iface tap99a1ab7f-bf ovn-installed in OVS
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.253 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.256 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.264 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:8a:0c 10.100.0.12'], port_security=['fa:16:3e:a9:8a:0c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd70544d6-04e3-4b2a-914a-72db3052216a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39702279-01de-4f4b-bc33-58c8c6f673e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55d3be64e01442ca8f492d2f3e10d1cc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7da5af47-2519-44c3-bc78-6f5347e93e10', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94b6aed5-905a-43ff-81d8-6adfe368f476, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.267 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc in datapath 39702279-01de-4f4b-bc33-58c8c6f673e3 unbound from our chassis
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.272 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 39702279-01de-4f4b-bc33-58c8c6f673e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.273 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f61ffb8d-9303-4707-81a7-06b874aa8526]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.274 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 namespace which is not needed anymore
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.278 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.306 187212 DEBUG nova.compute.manager [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:50 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Dec 05 12:03:50 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000001f.scope: Consumed 13.786s CPU time.
Dec 05 12:03:50 compute-0 systemd-machined[153543]: Machine qemu-35-instance-0000001f terminated.
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.364 187212 INFO nova.compute.manager [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] instance snapshotting
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.400 187212 DEBUG oslo_concurrency.processutils [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk /var/lib/nova/instances/snapshots/tmp_tvfw6sw/a90bdd0383f44c6e9298f49095fb64de" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.400 187212 INFO nova.virt.libvirt.driver [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Snapshot extracted, beginning image upload
Dec 05 12:03:50 compute-0 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [NOTICE]   (219283) : haproxy version is 2.8.14-c23fe91
Dec 05 12:03:50 compute-0 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [NOTICE]   (219283) : path to executable is /usr/sbin/haproxy
Dec 05 12:03:50 compute-0 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [WARNING]  (219283) : Exiting Master process...
Dec 05 12:03:50 compute-0 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [ALERT]    (219283) : Current worker (219285) exited with code 143 (Terminated)
Dec 05 12:03:50 compute-0 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [WARNING]  (219283) : All workers exited. Exiting... (0)
Dec 05 12:03:50 compute-0 systemd[1]: libpod-004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff.scope: Deactivated successfully.
Dec 05 12:03:50 compute-0 podman[221529]: 2025-12-05 12:03:50.424656518 +0000 UTC m=+0.053280443 container died 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 12:03:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff-userdata-shm.mount: Deactivated successfully.
Dec 05 12:03:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-fc8314cf85594aa36784fe5dbf1012ba087261d9468b0e3431f9bb9c756a87d7-merged.mount: Deactivated successfully.
Dec 05 12:03:50 compute-0 podman[221529]: 2025-12-05 12:03:50.472252967 +0000 UTC m=+0.100876892 container cleanup 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:03:50 compute-0 systemd[1]: libpod-conmon-004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff.scope: Deactivated successfully.
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.485 187212 INFO nova.virt.libvirt.driver [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance destroyed successfully.
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.486 187212 DEBUG nova.objects.instance [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lazy-loading 'resources' on Instance uuid d70544d6-04e3-4b2a-914a-72db3052216a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.515 187212 DEBUG nova.virt.libvirt.vif [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1222437752',display_name='tempest-ImagesOneServerTestJSON-server-1222437752',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1222437752',id=31,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='55d3be64e01442ca8f492d2f3e10d1cc',ramdisk_id='',reservation_id='r-zbitw7u9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-1350277374',owner_user_name='tempest-ImagesOneServerTestJSON-1350277374-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='b5f1bf811e6c42d699922035de0b538c',uuid=d70544d6-04e3-4b2a-914a-72db3052216a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.515 187212 DEBUG nova.network.os_vif_util [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converting VIF {"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.516 187212 DEBUG nova.network.os_vif_util [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.516 187212 DEBUG os_vif [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.520 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.520 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99a1ab7f-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.522 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.525 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.527 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.531 187212 INFO os_vif [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf')
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.532 187212 INFO nova.virt.libvirt.driver [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Deleting instance files /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a_del
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.532 187212 INFO nova.virt.libvirt.driver [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Deletion of /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a_del complete
Dec 05 12:03:50 compute-0 podman[221578]: 2025-12-05 12:03:50.547697472 +0000 UTC m=+0.050956426 container remove 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.555 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf88cf73-6c22-413f-b415-254758d323f1]: (4, ('Fri Dec  5 12:03:50 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 (004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff)\n004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff\nFri Dec  5 12:03:50 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 (004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff)\n004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.557 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6f282c85-0421-4731-acb7-ccb02f4ede8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.558 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39702279-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:50 compute-0 kernel: tap39702279-00: left promiscuous mode
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.631 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.640 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.642 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f7239d9e-585f-4819-9478-f0a8e6624161]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.645 187212 INFO nova.compute.manager [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 0.44 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.646 187212 DEBUG oslo.service.loopingcall [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.646 187212 INFO nova.virt.libvirt.driver [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Beginning live snapshot process
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.649 187212 DEBUG nova.compute.manager [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.649 187212 DEBUG nova.network.neutron [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.655 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[24ceaffd-fda4-4577-9478-05ada5027d81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.656 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d141c7f1-b1c2-456e-97ba-d0d07bfe0845]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.665 187212 DEBUG nova.network.neutron [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Updated VIF entry in instance network info cache for port 4f7ea95e-e59f-4941-83b6-5c482617a975. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.665 187212 DEBUG nova.network.neutron [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Updating instance_info_cache with network_info: [{"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.677 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[26f6d425-7897-43a6-a983-42336b378b07]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355488, 'reachable_time': 18343, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221596, 'error': None, 'target': 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.679 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:03:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.680 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[fe677c4d-d778-4dc0-b128-c73e8aac5ed6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:50 compute-0 systemd[1]: run-netns-ovnmeta\x2d39702279\x2d01de\x2d4f4b\x2dbc33\x2d58c8c6f673e3.mount: Deactivated successfully.
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.712 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.712 187212 DEBUG nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.713 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.713 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.713 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.713 187212 DEBUG nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] No waiting events found dispatching network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.714 187212 WARNING nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received unexpected event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 for instance with vm_state suspended and task_state None.
Dec 05 12:03:50 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.795 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.891 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json -f qcow2" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.892 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.956 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:50 compute-0 nova_compute[187208]: 2025-12-05 12:03:50.969 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.046 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.048 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.085 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c.delta 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.087 187212 INFO nova.virt.libvirt.driver [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.138 187212 DEBUG nova.virt.libvirt.guest [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] COPY block job progress, current cursor: 0 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.157 187212 DEBUG nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.158 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.158 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.159 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.159 187212 DEBUG nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Processing event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.159 187212 DEBUG nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.160 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.160 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.160 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.161 187212 DEBUG nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] No waiting events found dispatching network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.161 187212 WARNING nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received unexpected event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 for instance with vm_state building and task_state spawning.
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.167 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.174 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936231.1735473, 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.174 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] VM Resumed (Lifecycle Event)
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.178 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.183 187212 INFO nova.virt.libvirt.driver [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance spawned successfully.
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.184 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.238 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.248 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.254 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.255 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.255 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.256 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.256 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.257 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.428 187212 DEBUG nova.network.neutron [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Updated VIF entry in instance network info cache for port 909107ba-c90a-4004-a47f-e5367cab8f82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.429 187212 DEBUG nova.network.neutron [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Updating instance_info_cache with network_info: [{"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.456 187212 DEBUG nova.compute.manager [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-unplugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.456 187212 DEBUG oslo_concurrency.lockutils [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.457 187212 DEBUG oslo_concurrency.lockutils [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.457 187212 DEBUG oslo_concurrency.lockutils [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.457 187212 DEBUG nova.compute.manager [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] No waiting events found dispatching network-vif-unplugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.457 187212 DEBUG nova.compute.manager [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-unplugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.474 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.643 187212 DEBUG nova.virt.libvirt.guest [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] COPY block job progress, current cursor: 75366400 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.645 187212 INFO nova.virt.libvirt.driver [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.683 187212 DEBUG nova.privsep.utils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.683 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c.delta /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.805 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.806 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.807 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.807 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.807 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.808 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Processing event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.808 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.809 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.809 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.809 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.810 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] No waiting events found dispatching network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.810 187212 WARNING nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received unexpected event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db for instance with vm_state building and task_state spawning.
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.810 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.811 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.811 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.812 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.812 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Processing event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.812 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.813 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.813 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.813 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.814 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] No waiting events found dispatching network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.814 187212 WARNING nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received unexpected event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 for instance with vm_state building and task_state spawning.
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.815 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.816 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.820 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936231.8198025, 30a55909-059f-4a0c-9598-14cc506d42a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.820 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Resumed (Lifecycle Event)
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.824 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.825 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.830 187212 INFO nova.virt.libvirt.driver [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance spawned successfully.
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.831 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.834 187212 INFO nova.virt.libvirt.driver [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance spawned successfully.
Dec 05 12:03:51 compute-0 nova_compute[187208]: 2025-12-05 12:03:51.836 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.052 187212 INFO nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Took 11.47 seconds to spawn the instance on the hypervisor.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.053 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.073 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.078 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.078 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.079 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.079 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.080 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.080 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.086 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.111 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.111 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.112 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.112 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.113 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.113 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.119 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c.delta /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.124 187212 INFO nova.virt.libvirt.driver [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Snapshot extracted, beginning image upload
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.207 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.208 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936231.8248255, 456f1972-6ed7-4fc2-b046-fa035704d434 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.209 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] VM Resumed (Lifecycle Event)
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.254 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.258 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.269 187212 INFO nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Took 13.48 seconds to spawn the instance on the hypervisor.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.270 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.273 187212 INFO nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Took 12.74 seconds to build instance.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.284 187212 INFO nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Took 11.38 seconds to spawn the instance on the hypervisor.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.285 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.306 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.346 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.925s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.347 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 10.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.347 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.348 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.385 187212 INFO nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Took 14.58 seconds to build instance.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.393 187212 INFO nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Took 12.61 seconds to build instance.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.561 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.682 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.683 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 10.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.683 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.683 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.689 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.990s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.690 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 10.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.692 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.693 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:52 compute-0 nova_compute[187208]: 2025-12-05 12:03:52.735 187212 WARNING nova.compute.manager [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Image not found during snapshot: nova.exception.ImageNotFound: Image f951ed45-f10d-4ac3-a0fc-5d19a12add95 could not be found.
Dec 05 12:03:53 compute-0 podman[221634]: 2025-12-05 12:03:53.219385234 +0000 UTC m=+0.057931496 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:03:53 compute-0 podman[221633]: 2025-12-05 12:03:53.260940771 +0000 UTC m=+0.095358025 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Dec 05 12:03:53 compute-0 nova_compute[187208]: 2025-12-05 12:03:53.410 187212 DEBUG nova.network.neutron [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:53 compute-0 nova_compute[187208]: 2025-12-05 12:03:53.433 187212 INFO nova.compute.manager [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 2.78 seconds to deallocate network for instance.
Dec 05 12:03:53 compute-0 nova_compute[187208]: 2025-12-05 12:03:53.498 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:53 compute-0 nova_compute[187208]: 2025-12-05 12:03:53.499 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:53 compute-0 nova_compute[187208]: 2025-12-05 12:03:53.539 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 12:03:53 compute-0 nova_compute[187208]: 2025-12-05 12:03:53.558 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 12:03:53 compute-0 nova_compute[187208]: 2025-12-05 12:03:53.559 187212 DEBUG nova.compute.provider_tree [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 12:03:53 compute-0 nova_compute[187208]: 2025-12-05 12:03:53.763 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 12:03:53 compute-0 nova_compute[187208]: 2025-12-05 12:03:53.788 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 12:03:54 compute-0 nova_compute[187208]: 2025-12-05 12:03:54.839 187212 DEBUG nova.compute.provider_tree [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:54 compute-0 nova_compute[187208]: 2025-12-05 12:03:54.858 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:54 compute-0 nova_compute[187208]: 2025-12-05 12:03:54.883 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:54 compute-0 nova_compute[187208]: 2025-12-05 12:03:54.928 187212 INFO nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Deleted allocations for instance d70544d6-04e3-4b2a-914a-72db3052216a
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.015 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.051 187212 INFO nova.virt.libvirt.driver [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Snapshot image upload complete
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.051 187212 INFO nova.compute.manager [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 5.79 seconds to snapshot the instance on the hypervisor.
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.133 187212 DEBUG nova.compute.manager [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.133 187212 DEBUG oslo_concurrency.lockutils [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.134 187212 DEBUG oslo_concurrency.lockutils [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.134 187212 DEBUG oslo_concurrency.lockutils [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.134 187212 DEBUG nova.compute.manager [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] No waiting events found dispatching network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.134 187212 WARNING nova.compute.manager [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received unexpected event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc for instance with vm_state deleted and task_state None.
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.135 187212 DEBUG nova.compute.manager [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-deleted-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.147 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936220.1460907, 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.147 187212 INFO nova.compute.manager [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] VM Stopped (Lifecycle Event)
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.177 187212 DEBUG nova.compute.manager [None req-188187e6-77dc-4f1c-8fe6-d6620a329879 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:55 compute-0 nova_compute[187208]: 2025-12-05 12:03:55.583 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.227 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.228 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.228 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.228 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.229 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.230 187212 INFO nova.compute.manager [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Terminating instance
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.230 187212 DEBUG nova.compute.manager [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.232 187212 INFO nova.compute.manager [None req-7a7b1fde-7875-4bb0-a8e8-c57375c20d5a ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Pausing
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.232 187212 DEBUG nova.objects.instance [None req-7a7b1fde-7875-4bb0-a8e8-c57375c20d5a ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'flavor' on Instance uuid 30a55909-059f-4a0c-9598-14cc506d42a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:56 compute-0 kernel: tap0a11e563-2b (unregistering): left promiscuous mode
Dec 05 12:03:56 compute-0 NetworkManager[55691]: <info>  [1764936236.2603] device (tap0a11e563-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.272 187212 DEBUG nova.compute.manager [None req-7a7b1fde-7875-4bb0-a8e8-c57375c20d5a ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.274 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936236.2743678, 30a55909-059f-4a0c-9598-14cc506d42a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.275 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Paused (Lifecycle Event)
Dec 05 12:03:56 compute-0 ovn_controller[95610]: 2025-12-05T12:03:56Z|00310|binding|INFO|Releasing lport 0a11e563-2be9-4ce9-af51-7d29b586e233 from this chassis (sb_readonly=0)
Dec 05 12:03:56 compute-0 ovn_controller[95610]: 2025-12-05T12:03:56Z|00311|binding|INFO|Setting lport 0a11e563-2be9-4ce9-af51-7d29b586e233 down in Southbound
Dec 05 12:03:56 compute-0 ovn_controller[95610]: 2025-12-05T12:03:56Z|00312|binding|INFO|Removing iface tap0a11e563-2b ovn-installed in OVS
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.279 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.284 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:70:f2 10.100.0.12'], port_security=['fa:16:3e:f2:70:f2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd2085dd9-2ebd-4804-99c1-3b15cbd216f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d064000-316c-46a7-a23c-1dc26318b6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79895287bd1d488c842f6013729a1f81', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e1ec2415-6840-4cf9-b5ac-efaf1a9c9a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3804b014-203a-4c47-b0bb-7634579c4ec4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0a11e563-2be9-4ce9-af51-7d29b586e233) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.285 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0a11e563-2be9-4ce9-af51-7d29b586e233 in datapath 5d064000-316c-46a7-a23c-1dc26318b6a4 unbound from our chassis
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.287 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d064000-316c-46a7-a23c-1dc26318b6a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.289 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[da63f076-97d5-42ad-a980-db288b5f6b20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.289 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 namespace which is not needed anymore
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.291 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.301 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.307 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:56 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000021.scope: Deactivated successfully.
Dec 05 12:03:56 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000021.scope: Consumed 14.127s CPU time.
Dec 05 12:03:56 compute-0 systemd-machined[153543]: Machine qemu-37-instance-00000021 terminated.
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.331 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] During sync_power_state the instance has a pending task (pausing). Skip.
Dec 05 12:03:56 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [NOTICE]   (220123) : haproxy version is 2.8.14-c23fe91
Dec 05 12:03:56 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [NOTICE]   (220123) : path to executable is /usr/sbin/haproxy
Dec 05 12:03:56 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [WARNING]  (220123) : Exiting Master process...
Dec 05 12:03:56 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [ALERT]    (220123) : Current worker (220125) exited with code 143 (Terminated)
Dec 05 12:03:56 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [WARNING]  (220123) : All workers exited. Exiting... (0)
Dec 05 12:03:56 compute-0 systemd[1]: libpod-592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c.scope: Deactivated successfully.
Dec 05 12:03:56 compute-0 podman[221693]: 2025-12-05 12:03:56.429380602 +0000 UTC m=+0.057639538 container died 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:03:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c-userdata-shm.mount: Deactivated successfully.
Dec 05 12:03:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-f72b34854b6787c4cd40b26e6f22d36e2f45382e4691a696a9fc490f51c1bb73-merged.mount: Deactivated successfully.
Dec 05 12:03:56 compute-0 podman[221693]: 2025-12-05 12:03:56.47063025 +0000 UTC m=+0.098889186 container cleanup 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:03:56 compute-0 systemd[1]: libpod-conmon-592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c.scope: Deactivated successfully.
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.498 187212 INFO nova.virt.libvirt.driver [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance destroyed successfully.
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.498 187212 DEBUG nova.objects.instance [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'resources' on Instance uuid d2085dd9-2ebd-4804-99c1-3b15cbd216f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.515 187212 DEBUG nova.virt.libvirt.vif [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-835443144',display_name='tempest-ImagesOneServerNegativeTestJSON-server-835443144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-835443144',id=33,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-ijey2289',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:52Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=d2085dd9-2ebd-4804-99c1-3b15cbd216f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.515 187212 DEBUG nova.network.os_vif_util [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.517 187212 DEBUG nova.network.os_vif_util [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.517 187212 DEBUG os_vif [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.519 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a11e563-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.520 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.522 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.525 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.528 187212 INFO os_vif [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b')
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.529 187212 INFO nova.virt.libvirt.driver [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Deleting instance files /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8_del
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.529 187212 INFO nova.virt.libvirt.driver [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Deletion of /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8_del complete
Dec 05 12:03:56 compute-0 podman[221737]: 2025-12-05 12:03:56.54730073 +0000 UTC m=+0.052039048 container remove 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.554 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c067939f-cbca-4325-8d52-8616a98d9e93]: (4, ('Fri Dec  5 12:03:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 (592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c)\n592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c\nFri Dec  5 12:03:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 (592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c)\n592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.556 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4318c6-8faa-49e1-8b0a-c9dd9c81f282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.557 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d064000-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.559 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:56 compute-0 kernel: tap5d064000-30: left promiscuous mode
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.579 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05ad57f4-88b0-43d9-aa48-f1a46e063d4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.581 187212 INFO nova.compute.manager [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.582 187212 DEBUG oslo.service.loopingcall [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.582 187212 DEBUG nova.compute.manager [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:56 compute-0 nova_compute[187208]: 2025-12-05 12:03:56.583 187212 DEBUG nova.network.neutron [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.595 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fb9df3-7feb-4d57-8396-617cc9364865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.598 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff18bb6-e111-4193-80e2-861143be0326]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.614 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7caceb30-bef4-4611-b152-c5182ab541f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357459, 'reachable_time': 17903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221757, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.617 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:03:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.617 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[2cbf1065-3a9e-4209-a7bd-40200c316540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d5d064000\x2d316c\x2d46a7\x2da23c\x2d1dc26318b6a4.mount: Deactivated successfully.
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.560 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.623 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.624 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.624 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.625 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.625 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.626 187212 INFO nova.compute.manager [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Terminating instance
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.627 187212 DEBUG nova.compute.manager [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:57 compute-0 kernel: tap909107ba-c9 (unregistering): left promiscuous mode
Dec 05 12:03:57 compute-0 NetworkManager[55691]: <info>  [1764936237.6467] device (tap909107ba-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.652 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:57 compute-0 ovn_controller[95610]: 2025-12-05T12:03:57Z|00313|binding|INFO|Releasing lport 909107ba-c90a-4004-a47f-e5367cab8f82 from this chassis (sb_readonly=0)
Dec 05 12:03:57 compute-0 ovn_controller[95610]: 2025-12-05T12:03:57Z|00314|binding|INFO|Setting lport 909107ba-c90a-4004-a47f-e5367cab8f82 down in Southbound
Dec 05 12:03:57 compute-0 ovn_controller[95610]: 2025-12-05T12:03:57Z|00315|binding|INFO|Removing iface tap909107ba-c9 ovn-installed in OVS
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.655 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.660 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:8e:a6 10.100.0.12'], port_security=['fa:16:3e:16:8e:a6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7df02f69-ecc9-424d-82ab-dc8ba279ffd5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=909107ba-c90a-4004-a47f-e5367cab8f82) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.661 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 909107ba-c90a-4004-a47f-e5367cab8f82 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.663 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.673 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.678 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a167f52a-33a3-448d-b939-aa21f4ee2f14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.704 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e460f771-7073-47e5-b169-e9b5f135ad0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.709 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[dec63153-74c0-4a5d-95b0-b23dd94a8845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:57 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000026.scope: Deactivated successfully.
Dec 05 12:03:57 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000026.scope: Consumed 7.758s CPU time.
Dec 05 12:03:57 compute-0 systemd-machined[153543]: Machine qemu-43-instance-00000026 terminated.
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.738 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d33766db-a46f-43cf-8bdd-bd8b2c900093]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.757 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0facf9-d60b-4bec-84fb-a70f86778f83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360757, 'reachable_time': 17404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221769, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.775 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[28b9a832-3d4f-46db-82a7-a1a8a6b1d70b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360767, 'tstamp': 360767}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221770, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360771, 'tstamp': 360771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221770, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.777 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.791 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.800 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ea1ed6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.800 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.801 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ea1ed6-90, col_values=(('external_ids', {'iface-id': '6f012c31-72e4-4df5-be68-787aa910fb9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.801 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.883 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-unplugged-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.884 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.884 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.884 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.884 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] No waiting events found dispatching network-vif-unplugged-0a11e563-2be9-4ce9-af51-7d29b586e233 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.885 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-unplugged-0a11e563-2be9-4ce9-af51-7d29b586e233 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.885 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.885 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.886 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.886 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.886 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] No waiting events found dispatching network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.887 187212 WARNING nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received unexpected event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 for instance with vm_state active and task_state deleting.
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.911 187212 INFO nova.virt.libvirt.driver [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance destroyed successfully.
Dec 05 12:03:57 compute-0 nova_compute[187208]: 2025-12-05 12:03:57.912 187212 DEBUG nova.objects.instance [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'resources' on Instance uuid 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.284 187212 DEBUG nova.virt.libvirt.vif [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-1',id=38,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:52Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=7df02f69-ecc9-424d-82ab-dc8ba279ffd5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.285 187212 DEBUG nova.network.os_vif_util [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.286 187212 DEBUG nova.network.os_vif_util [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.286 187212 DEBUG os_vif [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.288 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.288 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap909107ba-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.290 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.292 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.295 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.298 187212 INFO os_vif [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9')
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.298 187212 INFO nova.virt.libvirt.driver [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Deleting instance files /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5_del
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.299 187212 INFO nova.virt.libvirt.driver [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Deletion of /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5_del complete
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.342 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.343 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.343 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.344 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.344 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.345 187212 INFO nova.compute.manager [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Terminating instance
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.346 187212 DEBUG nova.compute.manager [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.353 187212 INFO nova.compute.manager [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Took 0.73 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.353 187212 DEBUG oslo.service.loopingcall [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.353 187212 DEBUG nova.compute.manager [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.354 187212 DEBUG nova.network.neutron [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:58 compute-0 kernel: tap4f7ea95e-e5 (unregistering): left promiscuous mode
Dec 05 12:03:58 compute-0 NetworkManager[55691]: <info>  [1764936238.3718] device (tap4f7ea95e-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.382 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00316|binding|INFO|Releasing lport 4f7ea95e-e59f-4941-83b6-5c482617a975 from this chassis (sb_readonly=0)
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00317|binding|INFO|Setting lport 4f7ea95e-e59f-4941-83b6-5c482617a975 down in Southbound
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00318|binding|INFO|Removing iface tap4f7ea95e-e5 ovn-installed in OVS
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.385 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.390 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:7b:36 10.100.0.3'], port_security=['fa:16:3e:4a:7b:36 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '456f1972-6ed7-4fc2-b046-fa035704d434', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4f7ea95e-e59f-4941-83b6-5c482617a975) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.391 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4f7ea95e-e59f-4941-83b6-5c482617a975 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.392 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.393 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1153941c-4190-450d-9f5b-a6c2b86c03d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.393 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 namespace which is not needed anymore
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.397 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000027.scope: Deactivated successfully.
Dec 05 12:03:58 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000027.scope: Consumed 7.189s CPU time.
Dec 05 12:03:58 compute-0 systemd-machined[153543]: Machine qemu-42-instance-00000027 terminated.
Dec 05 12:03:58 compute-0 podman[221793]: 2025-12-05 12:03:58.476089273 +0000 UTC m=+0.061173208 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [NOTICE]   (221443) : haproxy version is 2.8.14-c23fe91
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [NOTICE]   (221443) : path to executable is /usr/sbin/haproxy
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [WARNING]  (221443) : Exiting Master process...
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [WARNING]  (221443) : Exiting Master process...
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [ALERT]    (221443) : Current worker (221445) exited with code 143 (Terminated)
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [WARNING]  (221443) : All workers exited. Exiting... (0)
Dec 05 12:03:58 compute-0 systemd[1]: libpod-9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1.scope: Deactivated successfully.
Dec 05 12:03:58 compute-0 podman[221835]: 2025-12-05 12:03:58.536444177 +0000 UTC m=+0.050351559 container died 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 12:03:58 compute-0 kernel: tap4f7ea95e-e5: entered promiscuous mode
Dec 05 12:03:58 compute-0 systemd-udevd[221676]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:03:58 compute-0 NetworkManager[55691]: <info>  [1764936238.5677] manager: (tap4f7ea95e-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Dec 05 12:03:58 compute-0 kernel: tap4f7ea95e-e5 (unregistering): left promiscuous mode
Dec 05 12:03:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1-userdata-shm.mount: Deactivated successfully.
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.573 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-cea3cddfc7f419ead74ac9c1e8d910318a876e210e0687b74d3349159defed7c-merged.mount: Deactivated successfully.
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00319|binding|INFO|Claiming lport 4f7ea95e-e59f-4941-83b6-5c482617a975 for this chassis.
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00320|binding|INFO|4f7ea95e-e59f-4941-83b6-5c482617a975: Claiming fa:16:3e:4a:7b:36 10.100.0.3
Dec 05 12:03:58 compute-0 podman[221835]: 2025-12-05 12:03:58.584336515 +0000 UTC m=+0.098243887 container cleanup 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 12:03:58 compute-0 systemd[1]: libpod-conmon-9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1.scope: Deactivated successfully.
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00321|if_status|INFO|Dropped 8 log messages in last 62 seconds (most recently, 62 seconds ago) due to excessive rate
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.595 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00322|if_status|INFO|Not setting lport 4f7ea95e-e59f-4941-83b6-5c482617a975 down as sb is readonly
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.598 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 podman[221836]: 2025-12-05 12:03:58.625282454 +0000 UTC m=+0.120113991 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.625 187212 INFO nova.virt.libvirt.driver [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance destroyed successfully.
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.625 187212 DEBUG nova.objects.instance [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'resources' on Instance uuid 456f1972-6ed7-4fc2-b046-fa035704d434 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00323|binding|INFO|Releasing lport 4f7ea95e-e59f-4941-83b6-5c482617a975 from this chassis (sb_readonly=0)
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.657 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:7b:36 10.100.0.3'], port_security=['fa:16:3e:4a:7b:36 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '456f1972-6ed7-4fc2-b046-fa035704d434', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4f7ea95e-e59f-4941-83b6-5c482617a975) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:58 compute-0 podman[221894]: 2025-12-05 12:03:58.659139512 +0000 UTC m=+0.051673667 container remove 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.664 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:7b:36 10.100.0.3'], port_security=['fa:16:3e:4a:7b:36 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '456f1972-6ed7-4fc2-b046-fa035704d434', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4f7ea95e-e59f-4941-83b6-5c482617a975) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.665 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d60773fd-f54d-419e-8aae-5f6cf367f618]: (4, ('Fri Dec  5 12:03:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 (9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1)\n9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1\nFri Dec  5 12:03:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 (9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1)\n9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.666 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7e63a4be-a65f-46c6-b627-1e3d642e3307]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.667 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.668 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.676 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.676 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.677 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.677 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.677 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.678 187212 INFO nova.compute.manager [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Terminating instance
Dec 05 12:03:58 compute-0 kernel: tapb8ea1ed6-90: left promiscuous mode
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.679 187212 DEBUG nova.compute.manager [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.679 187212 DEBUG nova.virt.libvirt.vif [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-2',id=39,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-12-05T12:03:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:52Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=456f1972-6ed7-4fc2-b046-fa035704d434,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.680 187212 DEBUG nova.network.os_vif_util [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.680 187212 DEBUG nova.network.os_vif_util [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.680 187212 DEBUG os_vif [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.682 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.682 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f7ea95e-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.683 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.688 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.688 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[261ec7eb-7478-45fc-8cf9-b1b5e5dca27f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.690 187212 INFO os_vif [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5')
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.690 187212 INFO nova.virt.libvirt.driver [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Deleting instance files /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434_del
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.691 187212 INFO nova.virt.libvirt.driver [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Deletion of /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434_del complete
Dec 05 12:03:58 compute-0 kernel: tap9dc35efb-0a (unregistering): left promiscuous mode
Dec 05 12:03:58 compute-0 NetworkManager[55691]: <info>  [1764936238.7023] device (tap9dc35efb-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.704 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[818f7e74-d8ae-438b-bcb6-76413c14d9d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.705 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[521872f7-516a-4960-b048-97410301b28d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00324|binding|INFO|Releasing lport 9dc35efb-0aed-463b-860e-3b60dd65b6db from this chassis (sb_readonly=0)
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.709 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00325|binding|INFO|Setting lport 9dc35efb-0aed-463b-860e-3b60dd65b6db down in Southbound
Dec 05 12:03:58 compute-0 ovn_controller[95610]: 2025-12-05T12:03:58Z|00326|binding|INFO|Removing iface tap9dc35efb-0a ovn-installed in OVS
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.712 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.716 187212 DEBUG nova.network.neutron [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.720 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fc05912f-1ed9-4845-9175-ab9bc3ebd64e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360751, 'reachable_time': 37106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221922, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 systemd[1]: run-netns-ovnmeta\x2db8ea1ed6\x2d9eec\x2d4cb3\x2da2b6\x2d6146b7b65c36.mount: Deactivated successfully.
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.723 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:04:08 10.100.0.4'], port_security=['fa:16:3e:4b:04:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30a55909-059f-4a0c-9598-14cc506d42a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9dc35efb-0aed-463b-860e-3b60dd65b6db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.726 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.726 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[89274fc9-3de1-4ddd-8ae5-115954886ee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.727 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4f7ea95e-e59f-4941-83b6-5c482617a975 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.728 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.729 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.729 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[57ffca8b-0180-4e98-b46f-2e5b15e9091f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.730 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4f7ea95e-e59f-4941-83b6-5c482617a975 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.732 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.732 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6885eb8d-2a35-42fb-a1c6-0a0515399090]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.733 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9dc35efb-0aed-463b-860e-3b60dd65b6db in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 unbound from our chassis
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.734 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.735 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[079cabab-c962-47d5-895f-eb4343444743]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.735 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace which is not needed anymore
Dec 05 12:03:58 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000025.scope: Deactivated successfully.
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.753 187212 INFO nova.compute.manager [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Took 2.17 seconds to deallocate network for instance.
Dec 05 12:03:58 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000025.scope: Consumed 4.769s CPU time.
Dec 05 12:03:58 compute-0 systemd-machined[153543]: Machine qemu-41-instance-00000025 terminated.
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.762 187212 INFO nova.compute.manager [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Took 0.42 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.762 187212 DEBUG oslo.service.loopingcall [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.762 187212 DEBUG nova.compute.manager [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.763 187212 DEBUG nova.network.neutron [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG nova.compute.manager [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-unplugged-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG oslo_concurrency.lockutils [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG oslo_concurrency.lockutils [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG oslo_concurrency.lockutils [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG nova.compute.manager [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] No waiting events found dispatching network-vif-unplugged-909107ba-c90a-4004-a47f-e5367cab8f82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG nova.compute.manager [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-unplugged-909107ba-c90a-4004-a47f-e5367cab8f82 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [NOTICE]   (221287) : haproxy version is 2.8.14-c23fe91
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [NOTICE]   (221287) : path to executable is /usr/sbin/haproxy
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [WARNING]  (221287) : Exiting Master process...
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [WARNING]  (221287) : Exiting Master process...
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [ALERT]    (221287) : Current worker (221289) exited with code 143 (Terminated)
Dec 05 12:03:58 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [WARNING]  (221287) : All workers exited. Exiting... (0)
Dec 05 12:03:58 compute-0 systemd[1]: libpod-a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b.scope: Deactivated successfully.
Dec 05 12:03:58 compute-0 podman[221943]: 2025-12-05 12:03:58.870588721 +0000 UTC m=+0.048597859 container died a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:03:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b-userdata-shm.mount: Deactivated successfully.
Dec 05 12:03:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-aba16f9d308e86ed666c24a9528a5e9f58db6e3fb9b48c984c73f0766f63478d-merged.mount: Deactivated successfully.
Dec 05 12:03:58 compute-0 NetworkManager[55691]: <info>  [1764936238.9099] manager: (tap9dc35efb-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/134)
Dec 05 12:03:58 compute-0 podman[221943]: 2025-12-05 12:03:58.917073199 +0000 UTC m=+0.095082347 container cleanup a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.939 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:58 compute-0 systemd[1]: libpod-conmon-a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b.scope: Deactivated successfully.
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.977 187212 DEBUG nova.compute.provider_tree [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.983 187212 INFO nova.virt.libvirt.driver [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance destroyed successfully.
Dec 05 12:03:58 compute-0 nova_compute[187208]: 2025-12-05 12:03:58.984 187212 DEBUG nova.objects.instance [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'resources' on Instance uuid 30a55909-059f-4a0c-9598-14cc506d42a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:03:59 compute-0 podman[221985]: 2025-12-05 12:03:59.005228157 +0000 UTC m=+0.045404088 container remove a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.009 187212 DEBUG nova.virt.libvirt.vif [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1329126976',display_name='tempest-DeleteServersTestJSON-server-1329126976',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1329126976',id=37,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-4bgg3k4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:56Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=30a55909-059f-4a0c-9598-14cc506d42a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.009 187212 DEBUG nova.network.os_vif_util [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.010 187212 DEBUG nova.network.os_vif_util [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.011 187212 DEBUG os_vif [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:03:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.011 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0f3038f1-2edc-47e0-9004-a8b6ffe8e291]: (4, ('Fri Dec  5 12:03:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b)\na17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b\nFri Dec  5 12:03:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b)\na17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.013 187212 DEBUG nova.scheduler.client.report [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:03:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.013 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed75716b-4cbd-4fc3-ac8f-8ee88dcd5e45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.014 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.017 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:59 compute-0 kernel: tapd7360f84-b0: left promiscuous mode
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.017 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dc35efb-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.018 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.037 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:03:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.037 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f73898d2-ea14-436d-a708-470e78749a64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.039 187212 INFO os_vif [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a')
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.040 187212 INFO nova.virt.libvirt.driver [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Deleting instance files /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2_del
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.041 187212 INFO nova.virt.libvirt.driver [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Deletion of /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2_del complete
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.045 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.051 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936224.049811, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.051 187212 INFO nova.compute.manager [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Stopped (Lifecycle Event)
Dec 05 12:03:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.052 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[46cadb41-3009-4fd9-8fd6-b88aa4e15537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.055 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[56e5fa8a-0e57-4e96-8a38-caf4100aeeea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.072 187212 DEBUG nova.compute.manager [None req-9f45b4e0-f64a-4d0f-88b2-6c1b1eb5fd2c - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.076 187212 DEBUG nova.compute.manager [None req-9f45b4e0-f64a-4d0f-88b2-6c1b1eb5fd2c - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: suspended, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:03:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.078 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[37eb8854-c0d3-4dca-a361-767f0be1d15c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360573, 'reachable_time': 16088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222012, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.078 187212 INFO nova.scheduler.client.report [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Deleted allocations for instance d2085dd9-2ebd-4804-99c1-3b15cbd216f8
Dec 05 12:03:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.080 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:03:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.080 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[347ad698-250e-41d2-97c5-10c8d43ab7f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.107 187212 INFO nova.compute.manager [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Took 0.43 seconds to destroy the instance on the hypervisor.
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.107 187212 DEBUG oslo.service.loopingcall [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.109 187212 DEBUG nova.compute.manager [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.109 187212 DEBUG nova.network.neutron [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:03:59 compute-0 nova_compute[187208]: 2025-12-05 12:03:59.191 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:03:59 compute-0 systemd[1]: run-netns-ovnmeta\x2dd7360f84\x2dbcd5\x2d4e64\x2dbf43\x2d1fdbd8215a70.mount: Deactivated successfully.
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.063 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.064 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.064 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.064 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.065 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.066 187212 INFO nova.compute.manager [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Terminating instance
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.067 187212 DEBUG nova.compute.manager [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.074 187212 INFO nova.virt.libvirt.driver [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance destroyed successfully.
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.074 187212 DEBUG nova.objects.instance [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.087 187212 DEBUG nova.virt.libvirt.vif [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1049650520',display_name='tempest-ImagesTestJSON-server-1049650520',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1049650520',id=36,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-kquxoeat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_r
am='0',old_vm_state='active',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:55Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=c1e2f189-1777-4f28-97ab-72cf0f60fbc0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.087 187212 DEBUG nova.network.os_vif_util [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.088 187212 DEBUG nova.network.os_vif_util [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.088 187212 DEBUG os_vif [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.090 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.090 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapecec1a41-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.091 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.094 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.096 187212 INFO os_vif [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f')
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.097 187212 INFO nova.virt.libvirt.driver [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Deleting instance files /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0_del
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.098 187212 INFO nova.virt.libvirt.driver [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Deletion of /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0_del complete
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.137 187212 DEBUG nova.compute.manager [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-deleted-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.137 187212 DEBUG nova.compute.manager [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-unplugged-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.137 187212 DEBUG oslo_concurrency.lockutils [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.138 187212 DEBUG oslo_concurrency.lockutils [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.138 187212 DEBUG oslo_concurrency.lockutils [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.138 187212 DEBUG nova.compute.manager [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] No waiting events found dispatching network-vif-unplugged-4f7ea95e-e59f-4941-83b6-5c482617a975 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.139 187212 DEBUG nova.compute.manager [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-unplugged-4f7ea95e-e59f-4941-83b6-5c482617a975 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.178 187212 INFO nova.compute.manager [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 0.11 seconds to destroy the instance on the hypervisor.
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.179 187212 DEBUG oslo.service.loopingcall [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.179 187212 DEBUG nova.compute.manager [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.180 187212 DEBUG nova.network.neutron [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.891 187212 DEBUG nova.compute.manager [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.891 187212 DEBUG oslo_concurrency.lockutils [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.892 187212 DEBUG oslo_concurrency.lockutils [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.892 187212 DEBUG oslo_concurrency.lockutils [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.892 187212 DEBUG nova.compute.manager [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] No waiting events found dispatching network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.893 187212 WARNING nova.compute.manager [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received unexpected event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 for instance with vm_state active and task_state deleting.
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.979 187212 DEBUG nova.network.neutron [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:00 compute-0 nova_compute[187208]: 2025-12-05 12:04:00.998 187212 INFO nova.compute.manager [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 0.82 seconds to deallocate network for instance.
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.064 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.065 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.094 187212 DEBUG nova.network.neutron [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.123 187212 INFO nova.compute.manager [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Took 2.77 seconds to deallocate network for instance.
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.188 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.192 187212 DEBUG nova.compute.provider_tree [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.212 187212 DEBUG nova.scheduler.client.report [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.235 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.237 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.258 187212 INFO nova.scheduler.client.report [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance c1e2f189-1777-4f28-97ab-72cf0f60fbc0
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.322 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.344 187212 DEBUG nova.compute.provider_tree [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.362 187212 DEBUG nova.scheduler.client.report [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.382 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.409 187212 INFO nova.scheduler.client.report [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Deleted allocations for instance 7df02f69-ecc9-424d-82ab-dc8ba279ffd5
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.457 187212 DEBUG nova.network.neutron [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.487 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.489 187212 INFO nova.compute.manager [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Took 2.73 seconds to deallocate network for instance.
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.537 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.537 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.604 187212 DEBUG nova.compute.provider_tree [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.618 187212 DEBUG nova.scheduler.client.report [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.638 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.664 187212 INFO nova.scheduler.client.report [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Deleted allocations for instance 456f1972-6ed7-4fc2-b046-fa035704d434
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.730 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.812 187212 DEBUG nova.network.neutron [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.832 187212 INFO nova.compute.manager [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Took 2.72 seconds to deallocate network for instance.
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.884 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.884 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.941 187212 DEBUG nova.compute.provider_tree [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.956 187212 DEBUG nova.scheduler.client.report [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:01 compute-0 nova_compute[187208]: 2025-12-05 12:04:01.977 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.020 187212 INFO nova.scheduler.client.report [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Deleted allocations for instance 30a55909-059f-4a0c-9598-14cc506d42a2
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.076 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:02 compute-0 podman[222013]: 2025-12-05 12:04:02.224045196 +0000 UTC m=+0.070430243 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.290 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.291 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] No waiting events found dispatching network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 WARNING nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received unexpected event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 for instance with vm_state deleted and task_state None.
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-unplugged-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] No waiting events found dispatching network-vif-unplugged-9dc35efb-0aed-463b-860e-3b60dd65b6db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 WARNING nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received unexpected event network-vif-unplugged-9dc35efb-0aed-463b-860e-3b60dd65b6db for instance with vm_state deleted and task_state None.
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-deleted-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-deleted-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.295 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.295 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] No waiting events found dispatching network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.295 187212 WARNING nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received unexpected event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db for instance with vm_state deleted and task_state None.
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.295 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-deleted-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:02 compute-0 nova_compute[187208]: 2025-12-05 12:04:02.562 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:03.011 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:03.011 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:03.012 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:03 compute-0 nova_compute[187208]: 2025-12-05 12:04:03.217 187212 DEBUG nova.compute.manager [req-215b72bb-4ffe-4c6d-a0a3-d3e74120ce59 req-e42ff978-79bf-4066-b40e-cadb44e32c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-deleted-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:04 compute-0 nova_compute[187208]: 2025-12-05 12:04:04.890 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:04 compute-0 nova_compute[187208]: 2025-12-05 12:04:04.890 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:04 compute-0 nova_compute[187208]: 2025-12-05 12:04:04.912 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:04 compute-0 nova_compute[187208]: 2025-12-05 12:04:04.971 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:04 compute-0 nova_compute[187208]: 2025-12-05 12:04:04.972 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:04 compute-0 nova_compute[187208]: 2025-12-05 12:04:04.978 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:04 compute-0 nova_compute[187208]: 2025-12-05 12:04:04.978 187212 INFO nova.compute.claims [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.087 187212 DEBUG nova.compute.provider_tree [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.105 187212 DEBUG nova.scheduler.client.report [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.125 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.126 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.174 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.174 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.196 187212 INFO nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.216 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.289 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.290 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.290 187212 INFO nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Creating image(s)
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.291 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.291 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.291 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.302 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.398 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.399 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.400 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.423 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.484 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936230.4838316, d70544d6-04e3-4b2a-914a-72db3052216a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.486 187212 INFO nova.compute.manager [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] VM Stopped (Lifecycle Event)
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.490 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.491 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.578 187212 DEBUG nova.compute.manager [None req-97c0c5d6-0a43-4dbe-a0cf-cec4a6ad6c1e - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.919 187212 DEBUG nova.policy [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.963 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk 1073741824" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.964 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:05 compute-0 nova_compute[187208]: 2025-12-05 12:04:05.965 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.027 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.029 187212 DEBUG nova.virt.disk.api [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.029 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.103 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.104 187212 DEBUG nova.virt.disk.api [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.104 187212 DEBUG nova.objects.instance [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.121 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.122 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Ensure instance console log exists: /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.122 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.122 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.123 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.510 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.510 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.532 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.600 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.601 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.608 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.608 187212 INFO nova.compute.claims [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.719 187212 DEBUG nova.compute.provider_tree [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.734 187212 DEBUG nova.scheduler.client.report [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.769 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.770 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.839 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.840 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.881 187212 INFO nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:06 compute-0 nova_compute[187208]: 2025-12-05 12:04:06.908 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.008 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.009 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.010 187212 INFO nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Creating image(s)
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.010 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.010 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.011 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.026 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.068 187212 DEBUG nova.policy [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4671f6c82ea049fab3a314ecf45b7656', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.087 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.088 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.088 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.101 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.165 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.166 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.201 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.202 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.202 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.265 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.266 187212 DEBUG nova.virt.disk.api [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Checking if we can resize image /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.267 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.338 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.339 187212 DEBUG nova.virt.disk.api [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Cannot resize image /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.339 187212 DEBUG nova.objects.instance [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'migration_context' on Instance uuid c2e63727-b45b-4249-a94f-85b0d6314ba0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.354 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.355 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Ensure instance console log exists: /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.355 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.356 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.356 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.564 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:07 compute-0 nova_compute[187208]: 2025-12-05 12:04:07.674 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Successfully created port: 656f63d2-77f9-46f7-9338-81bc5a056ad4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:08 compute-0 nova_compute[187208]: 2025-12-05 12:04:08.557 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Successfully created port: ea8794b1-8d29-4839-af08-e1675802ea0a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.422 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "082d2145-1505-4170-9a11-4e46bf86fed2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.422 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.454 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.480 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "58c3288f-57bf-4c62-8d69-9842a22e43d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.480 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.492 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Successfully updated port: 656f63d2-77f9-46f7-9338-81bc5a056ad4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.516 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.516 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.516 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.518 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.537 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.537 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.553 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.553 187212 INFO nova.compute.claims [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.637 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.805 187212 DEBUG nova.compute.provider_tree [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.823 187212 DEBUG nova.scheduler.client.report [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.842 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.843 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.845 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.852 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.852 187212 INFO nova.compute.claims [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.878 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.934 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.935 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.967 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:09 compute-0 nova_compute[187208]: 2025-12-05 12:04:09.984 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.060 187212 DEBUG nova.compute.provider_tree [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.078 187212 DEBUG nova.scheduler.client.report [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.084 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.085 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.085 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Creating image(s)
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.086 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "/var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.086 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.087 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.105 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.105 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.108 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.174 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.175 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.175 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.196 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.219 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.220 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:10 compute-0 podman[222065]: 2025-12-05 12:04:10.236112817 +0000 UTC m=+0.088588222 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.245 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.258 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.259 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.278 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.293 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.294 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.295 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.355 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.356 187212 DEBUG nova.virt.disk.api [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Checking if we can resize image /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.357 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.390 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.391 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.392 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Creating image(s)
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.393 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "/var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.393 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.394 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.407 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.428 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.429 187212 DEBUG nova.virt.disk.api [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Cannot resize image /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.429 187212 DEBUG nova.objects.instance [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'migration_context' on Instance uuid 082d2145-1505-4170-9a11-4e46bf86fed2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.453 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.454 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Ensure instance console log exists: /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.454 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.455 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.455 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.470 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.470 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.471 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.484 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.540 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.541 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.582 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.583 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.584 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.654 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.655 187212 DEBUG nova.virt.disk.api [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Checking if we can resize image /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.656 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.724 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.725 187212 DEBUG nova.virt.disk.api [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Cannot resize image /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.725 187212 DEBUG nova.objects.instance [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'migration_context' on Instance uuid 58c3288f-57bf-4c62-8d69-9842a22e43d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.741 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.742 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Ensure instance console log exists: /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.742 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.743 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.743 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.842 187212 DEBUG nova.policy [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40620135b1ff4f8d9d80eb79f51fd593', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bebbbd9623064681bb9350747fba600e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:10 compute-0 nova_compute[187208]: 2025-12-05 12:04:10.987 187212 DEBUG nova.policy [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40620135b1ff4f8d9d80eb79f51fd593', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bebbbd9623064681bb9350747fba600e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.122 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Updating instance_info_cache with network_info: [{"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.146 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.147 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance network_info: |[{"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.150 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Start _get_guest_xml network_info=[{"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.156 187212 WARNING nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.163 187212 DEBUG nova.virt.libvirt.host [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.164 187212 DEBUG nova.virt.libvirt.host [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.168 187212 DEBUG nova.virt.libvirt.host [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.169 187212 DEBUG nova.virt.libvirt.host [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.170 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.170 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.171 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.172 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.172 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.173 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.173 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.173 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.174 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.174 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.175 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.175 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.183 187212 DEBUG nova.virt.libvirt.vif [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-64428055',display_name='tempest-DeleteServersTestJSON-server-64428055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-64428055',id=41,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-rpqd5t4m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:06Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=c2e63727-b45b-4249-a94f-85b0d6314ba0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.183 187212 DEBUG nova.network.os_vif_util [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.185 187212 DEBUG nova.network.os_vif_util [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.186 187212 DEBUG nova.objects.instance [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid c2e63727-b45b-4249-a94f-85b0d6314ba0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.231 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <uuid>c2e63727-b45b-4249-a94f-85b0d6314ba0</uuid>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <name>instance-00000029</name>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <nova:name>tempest-DeleteServersTestJSON-server-64428055</nova:name>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:11</nova:creationTime>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:11 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:11 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:11 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:11 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:11 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:11 compute-0 nova_compute[187208]:         <nova:user uuid="ff425b7b04144f93a2c15e3a347fc15c">tempest-DeleteServersTestJSON-554028480-project-member</nova:user>
Dec 05 12:04:11 compute-0 nova_compute[187208]:         <nova:project uuid="4671f6c82ea049fab3a314ecf45b7656">tempest-DeleteServersTestJSON-554028480</nova:project>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:11 compute-0 nova_compute[187208]:         <nova:port uuid="656f63d2-77f9-46f7-9338-81bc5a056ad4">
Dec 05 12:04:11 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <entry name="serial">c2e63727-b45b-4249-a94f-85b0d6314ba0</entry>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <entry name="uuid">c2e63727-b45b-4249-a94f-85b0d6314ba0</entry>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.config"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:64:8d:59"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <target dev="tap656f63d2-77"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/console.log" append="off"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:11 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:11 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:11 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:11 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:11 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.233 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Preparing to wait for external event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.234 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.234 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.234 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.235 187212 DEBUG nova.virt.libvirt.vif [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-64428055',display_name='tempest-DeleteServersTestJSON-server-64428055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-64428055',id=41,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-rpqd5t4m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:06Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=c2e63727-b45b-4249-a94f-85b0d6314ba0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.235 187212 DEBUG nova.network.os_vif_util [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.236 187212 DEBUG nova.network.os_vif_util [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.236 187212 DEBUG os_vif [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.237 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.237 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.238 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.241 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.241 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap656f63d2-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.242 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap656f63d2-77, col_values=(('external_ids', {'iface-id': '656f63d2-77f9-46f7-9338-81bc5a056ad4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:8d:59', 'vm-uuid': 'c2e63727-b45b-4249-a94f-85b0d6314ba0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.244 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:11 compute-0 NetworkManager[55691]: <info>  [1764936251.2457] manager: (tap656f63d2-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.246 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.250 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.251 187212 INFO os_vif [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77')
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.316 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.317 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.317 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No VIF found with MAC fa:16:3e:64:8d:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.318 187212 INFO nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Using config drive
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.426 187212 DEBUG nova.compute.manager [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received event network-changed-656f63d2-77f9-46f7-9338-81bc5a056ad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.426 187212 DEBUG nova.compute.manager [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Refreshing instance network info cache due to event network-changed-656f63d2-77f9-46f7-9338-81bc5a056ad4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.427 187212 DEBUG oslo_concurrency.lockutils [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.427 187212 DEBUG oslo_concurrency.lockutils [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.427 187212 DEBUG nova.network.neutron [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Refreshing network info cache for port 656f63d2-77f9-46f7-9338-81bc5a056ad4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.497 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936236.4959307, d2085dd9-2ebd-4804-99c1-3b15cbd216f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.498 187212 INFO nova.compute.manager [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] VM Stopped (Lifecycle Event)
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.523 187212 DEBUG nova.compute.manager [None req-998c5f52-b9c3-4d11-90cc-5fb77ebc3a65 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.659 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.659 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.683 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.758 187212 INFO nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Creating config drive at /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.config
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.763 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiod01ckn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.805 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.806 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.811 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.812 187212 INFO nova.compute.claims [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.895 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiod01ckn" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:11 compute-0 kernel: tap656f63d2-77: entered promiscuous mode
Dec 05 12:04:11 compute-0 NetworkManager[55691]: <info>  [1764936251.9793] manager: (tap656f63d2-77): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Dec 05 12:04:11 compute-0 ovn_controller[95610]: 2025-12-05T12:04:11Z|00327|binding|INFO|Claiming lport 656f63d2-77f9-46f7-9338-81bc5a056ad4 for this chassis.
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.979 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:11 compute-0 ovn_controller[95610]: 2025-12-05T12:04:11Z|00328|binding|INFO|656f63d2-77f9-46f7-9338-81bc5a056ad4: Claiming fa:16:3e:64:8d:59 10.100.0.4
Dec 05 12:04:11 compute-0 nova_compute[187208]: 2025-12-05 12:04:11.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:11.996 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:8d:59 10.100.0.4'], port_security=['fa:16:3e:64:8d:59 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c2e63727-b45b-4249-a94f-85b0d6314ba0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=656f63d2-77f9-46f7-9338-81bc5a056ad4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:11.998 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 656f63d2-77f9-46f7-9338-81bc5a056ad4 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 bound to our chassis
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.001 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.006 187212 DEBUG nova.compute.provider_tree [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:12 compute-0 systemd-udevd[222137]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.012 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[827d641f-1218-4e0f-b404-40489ceaabe8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.013 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7360f84-b1 in ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.015 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7360f84-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.015 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4053045a-d20d-496b-89dc-1ee2a6998965]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.016 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f6bacca0-4a7f-48dd-95f7-d161724c7952]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.022 187212 DEBUG nova.scheduler.client.report [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:12 compute-0 NetworkManager[55691]: <info>  [1764936252.0261] device (tap656f63d2-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:12 compute-0 NetworkManager[55691]: <info>  [1764936252.0269] device (tap656f63d2-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.027 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[ee696fed-b54a-44c0-aad1-6543dd43b8ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.039 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.040 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:12 compute-0 systemd-machined[153543]: New machine qemu-44-instance-00000029.
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.053 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[97081fb8-d8a4-4638-8bc8-769d6698ffae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.054 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:12 compute-0 ovn_controller[95610]: 2025-12-05T12:04:12Z|00329|binding|INFO|Setting lport 656f63d2-77f9-46f7-9338-81bc5a056ad4 ovn-installed in OVS
Dec 05 12:04:12 compute-0 ovn_controller[95610]: 2025-12-05T12:04:12Z|00330|binding|INFO|Setting lport 656f63d2-77f9-46f7-9338-81bc5a056ad4 up in Southbound
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.062 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:12 compute-0 systemd[1]: Started Virtual Machine qemu-44-instance-00000029.
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.087 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.087 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.090 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4adc01fc-87ee-4510-b515-476ff7504326]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.096 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f41ed1a-8ea6-4828-9c46-ac5c7ee8123d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 NetworkManager[55691]: <info>  [1764936252.0977] manager: (tapd7360f84-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/137)
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.104 187212 INFO nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.125 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.135 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1e09710e-3366-4258-b298-b87110ece5ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.138 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[39643e0e-5076-4a77-9961-0fa6c17ae863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 NetworkManager[55691]: <info>  [1764936252.1639] device (tapd7360f84-b0): carrier: link connected
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.170 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[20455a17-2bc9-4a55-bd24-760c9017ba9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.190 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b93f67-56b2-465d-b466-03319837492e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363272, 'reachable_time': 36535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222172, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.209 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db14cc24-2f69-4a6f-8ff7-4a63b2fcebd1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:2b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363272, 'tstamp': 363272}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222173, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.225 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.226 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.227 187212 INFO nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Creating image(s)
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.227 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.228 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.228 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.229 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4b5eaf-0dc4-4b00-8305-82ca0236c4c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363272, 'reachable_time': 36535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222174, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.240 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.260 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[46d8c3d2-9bb8-4b66-8b76-f7377c978460]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.303 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.304 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.305 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.316 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.328 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4571e9-1059-4be1-8571-555340e4ef0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.330 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.330 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.330 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7360f84-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:12 compute-0 kernel: tapd7360f84-b0: entered promiscuous mode
Dec 05 12:04:12 compute-0 NetworkManager[55691]: <info>  [1764936252.3330] manager: (tapd7360f84-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.335 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7360f84-b0, col_values=(('external_ids', {'iface-id': 'd85bc323-c3ce-47e3-ac1f-d5f27467a4e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:12 compute-0 ovn_controller[95610]: 2025-12-05T12:04:12Z|00331|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.337 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.345 187212 DEBUG nova.policy [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.346 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[81ce1368-e548-4394-944c-4234f2e569b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.347 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:04:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.348 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'env', 'PROCESS_TAG=haproxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.349 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.392 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.393 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.418 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936252.4066186, c2e63727-b45b-4249-a94f-85b0d6314ba0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.419 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] VM Started (Lifecycle Event)
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.437 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.438 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.439 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.466 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.473 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936252.4067945, c2e63727-b45b-4249-a94f-85b0d6314ba0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.474 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] VM Paused (Lifecycle Event)
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.491 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.496 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.507 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.508 187212 DEBUG nova.virt.disk.api [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Checking if we can resize image /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.508 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.531 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.568 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.575 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.577 187212 DEBUG nova.virt.disk.api [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Cannot resize image /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.577 187212 DEBUG nova.objects.instance [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'migration_context' on Instance uuid a7616662-639b-4642-b507-614773f4748f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.591 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.592 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Ensure instance console log exists: /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.593 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.593 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.594 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:12 compute-0 podman[222228]: 2025-12-05 12:04:12.717815053 +0000 UTC m=+0.023616216 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:04:12 compute-0 podman[222228]: 2025-12-05 12:04:12.819766194 +0000 UTC m=+0.125567337 container create 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:04:12 compute-0 systemd[1]: Started libpod-conmon-720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3.scope.
Dec 05 12:04:12 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:04:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1e3820f746877bfac04dbdf959f91cdf0863ffd037e2db606c73901d85cc260/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:04:12 compute-0 podman[222228]: 2025-12-05 12:04:12.911437064 +0000 UTC m=+0.217238227 container init 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.910 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936237.9096906, 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.913 187212 INFO nova.compute.manager [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] VM Stopped (Lifecycle Event)
Dec 05 12:04:12 compute-0 podman[222228]: 2025-12-05 12:04:12.918008951 +0000 UTC m=+0.223810094 container start 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 12:04:12 compute-0 nova_compute[187208]: 2025-12-05 12:04:12.937 187212 DEBUG nova.compute.manager [None req-72c91a9b-f3cc-4d63-b202-414439fa12b2 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:12 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [NOTICE]   (222247) : New worker (222249) forked
Dec 05 12:04:12 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [NOTICE]   (222247) : Loading success.
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.114 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Successfully updated port: ea8794b1-8d29-4839-af08-e1675802ea0a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.150 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.151 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.152 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.624 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936238.6229408, 456f1972-6ed7-4fc2-b046-fa035704d434 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.625 187212 INFO nova.compute.manager [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] VM Stopped (Lifecycle Event)
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.667 187212 DEBUG nova.compute.manager [None req-c087fbc8-c64a-49ba-8fc4-a32eb330d7aa - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.699 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Successfully created port: 539a9707-ef82-4c64-aec4-3759222680f0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.793 187212 DEBUG nova.compute.manager [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received event network-changed-ea8794b1-8d29-4839-af08-e1675802ea0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.793 187212 DEBUG nova.compute.manager [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Refreshing instance network info cache due to event network-changed-ea8794b1-8d29-4839-af08-e1675802ea0a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.794 187212 DEBUG oslo_concurrency.lockutils [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.981 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936238.9804034, 30a55909-059f-4a0c-9598-14cc506d42a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.983 187212 INFO nova.compute.manager [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Stopped (Lifecycle Event)
Dec 05 12:04:13 compute-0 nova_compute[187208]: 2025-12-05 12:04:13.986 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.018 187212 DEBUG nova.compute.manager [None req-ef65769c-7c86-41df-b01b-81729892b7d7 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.230 187212 DEBUG nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.231 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.231 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.231 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.232 187212 DEBUG nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Processing event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.232 187212 DEBUG nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.232 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.233 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.233 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.233 187212 DEBUG nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] No waiting events found dispatching network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.233 187212 WARNING nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received unexpected event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 for instance with vm_state building and task_state spawning.
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.234 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.239 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936254.238292, c2e63727-b45b-4249-a94f-85b0d6314ba0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.239 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] VM Resumed (Lifecycle Event)
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.240 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.251 187212 INFO nova.virt.libvirt.driver [-] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance spawned successfully.
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.251 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.268 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.272 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.279 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.280 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.280 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.280 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.281 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.281 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.294 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.348 187212 INFO nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Took 7.34 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.349 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.420 187212 INFO nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Took 7.84 seconds to build instance.
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.424 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Successfully created port: d7b765ff-93e1-4594-9e3c-e177dee2e07b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.436 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.926s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.619 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Successfully created port: eabadaa6-16c4-434c-83ea-96dfa62d7f79 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.682 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Successfully updated port: 539a9707-ef82-4c64-aec4-3759222680f0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.712 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.712 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquired lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:14 compute-0 nova_compute[187208]: 2025-12-05 12:04:14.713 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:15 compute-0 nova_compute[187208]: 2025-12-05 12:04:15.198 187212 DEBUG nova.network.neutron [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Updated VIF entry in instance network info cache for port 656f63d2-77f9-46f7-9338-81bc5a056ad4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:15 compute-0 nova_compute[187208]: 2025-12-05 12:04:15.198 187212 DEBUG nova.network.neutron [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Updating instance_info_cache with network_info: [{"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:15 compute-0 nova_compute[187208]: 2025-12-05 12:04:15.215 187212 DEBUG oslo_concurrency.lockutils [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:15 compute-0 nova_compute[187208]: 2025-12-05 12:04:15.336 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.173 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Updating instance_info_cache with network_info: [{"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.195 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Releasing lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.196 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Instance network_info: |[{"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.200 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Start _get_guest_xml network_info=[{"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.208 187212 WARNING nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.217 187212 DEBUG nova.virt.libvirt.host [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.218 187212 DEBUG nova.virt.libvirt.host [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.223 187212 DEBUG nova.virt.libvirt.host [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.224 187212 DEBUG nova.virt.libvirt.host [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.225 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.225 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.226 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.226 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.227 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.227 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.228 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.228 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.229 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.229 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.229 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.230 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.236 187212 DEBUG nova.virt.libvirt.vif [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1115906111',id=44,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-jo4hn4lg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:12Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=a7616662-639b-4642-b507-614773f4748f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.236 187212 DEBUG nova.network.os_vif_util [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.238 187212 DEBUG nova.network.os_vif_util [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.240 187212 DEBUG nova.objects.instance [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7616662-639b-4642-b507-614773f4748f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.293 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <uuid>a7616662-639b-4642-b507-614773f4748f</uuid>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <name>instance-0000002c</name>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1115906111</nova:name>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:16</nova:creationTime>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:16 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:16 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:16 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:16 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:16 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:16 compute-0 nova_compute[187208]:         <nova:user uuid="3ee170bdfdd343189ee1da01bdb80be6">tempest-ImagesOneServerNegativeTestJSON-661137252-project-member</nova:user>
Dec 05 12:04:16 compute-0 nova_compute[187208]:         <nova:project uuid="79895287bd1d488c842f6013729a1f81">tempest-ImagesOneServerNegativeTestJSON-661137252</nova:project>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:16 compute-0 nova_compute[187208]:         <nova:port uuid="539a9707-ef82-4c64-aec4-3759222680f0">
Dec 05 12:04:16 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <entry name="serial">a7616662-639b-4642-b507-614773f4748f</entry>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <entry name="uuid">a7616662-639b-4642-b507-614773f4748f</entry>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.config"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:d2:c8:06"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <target dev="tap539a9707-ef"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/console.log" append="off"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:16 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:16 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:16 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:16 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:16 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.295 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Preparing to wait for external event network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.295 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.295 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.296 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.296 187212 DEBUG nova.virt.libvirt.vif [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1115906111',id=44,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-jo4hn4lg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:12Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=a7616662-639b-4642-b507-614773f4748f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.297 187212 DEBUG nova.network.os_vif_util [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.298 187212 DEBUG nova.network.os_vif_util [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.298 187212 DEBUG os_vif [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.299 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.299 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.300 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.300 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.304 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.304 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap539a9707-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.304 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap539a9707-ef, col_values=(('external_ids', {'iface-id': '539a9707-ef82-4c64-aec4-3759222680f0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:c8:06', 'vm-uuid': 'a7616662-639b-4642-b507-614773f4748f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.306 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:16 compute-0 NetworkManager[55691]: <info>  [1764936256.3087] manager: (tap539a9707-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.310 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.320 187212 INFO os_vif [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef')
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.391 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.393 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.393 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No VIF found with MAC fa:16:3e:d2:c8:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.394 187212 INFO nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Using config drive
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.662 187212 INFO nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Creating config drive at /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.config
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.668 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpicvz7qwy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.699 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Updating instance_info_cache with network_info: [{"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.729 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.729 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Instance network_info: |[{"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.730 187212 DEBUG oslo_concurrency.lockutils [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.731 187212 DEBUG nova.network.neutron [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Refreshing network info cache for port ea8794b1-8d29-4839-af08-e1675802ea0a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.734 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Start _get_guest_xml network_info=[{"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:16 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.740 187212 WARNING nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:16.744 187212 DEBUG nova.virt.libvirt.host [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.027 187212 DEBUG nova.virt.libvirt.host [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.031 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpicvz7qwy" returned: 0 in 0.362s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.032 187212 DEBUG nova.compute.manager [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received event network-changed-539a9707-ef82-4c64-aec4-3759222680f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.033 187212 DEBUG nova.compute.manager [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Refreshing instance network info cache due to event network-changed-539a9707-ef82-4c64-aec4-3759222680f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.033 187212 DEBUG oslo_concurrency.lockutils [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.033 187212 DEBUG oslo_concurrency.lockutils [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.033 187212 DEBUG nova.network.neutron [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Refreshing network info cache for port 539a9707-ef82-4c64-aec4-3759222680f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.053 187212 DEBUG nova.virt.libvirt.host [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.054 187212 DEBUG nova.virt.libvirt.host [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.054 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.054 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.055 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.055 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.055 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.055 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.057 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.060 187212 DEBUG nova.virt.libvirt.vif [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-404632133',display_name='tempest-ImagesTestJSON-server-404632133',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-404632133',id=40,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-d69nje92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList
,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:05Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=e5212ff3-c6ed-4f02-99c4-becad0e5f2a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.060 187212 DEBUG nova.network.os_vif_util [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.061 187212 DEBUG nova.network.os_vif_util [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.062 187212 DEBUG nova.objects.instance [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.075 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <uuid>e5212ff3-c6ed-4f02-99c4-becad0e5f2a5</uuid>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <name>instance-00000028</name>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesTestJSON-server-404632133</nova:name>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:16</nova:creationTime>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:17 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:17 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:17 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:17 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:17 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:17 compute-0 nova_compute[187208]:         <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec 05 12:04:17 compute-0 nova_compute[187208]:         <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:17 compute-0 nova_compute[187208]:         <nova:port uuid="ea8794b1-8d29-4839-af08-e1675802ea0a">
Dec 05 12:04:17 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <entry name="serial">e5212ff3-c6ed-4f02-99c4-becad0e5f2a5</entry>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <entry name="uuid">e5212ff3-c6ed-4f02-99c4-becad0e5f2a5</entry>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.config"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:58:21:a9"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <target dev="tapea8794b1-8d"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/console.log" append="off"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:17 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:17 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:17 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:17 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:17 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.076 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Preparing to wait for external event network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.077 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.077 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.077 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.078 187212 DEBUG nova.virt.libvirt.vif [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-404632133',display_name='tempest-ImagesTestJSON-server-404632133',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-404632133',id=40,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-d69nje92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:05Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=e5212ff3-c6ed-4f02-99c4-becad0e5f2a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.078 187212 DEBUG nova.network.os_vif_util [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.078 187212 DEBUG nova.network.os_vif_util [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.079 187212 DEBUG os_vif [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.079 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.079 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.080 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.082 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.083 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea8794b1-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.083 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea8794b1-8d, col_values=(('external_ids', {'iface-id': 'ea8794b1-8d29-4839-af08-e1675802ea0a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:21:a9', 'vm-uuid': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.084 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 NetworkManager[55691]: <info>  [1764936257.0862] manager: (tapea8794b1-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.094 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.096 187212 INFO os_vif [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d')
Dec 05 12:04:17 compute-0 kernel: tap539a9707-ef: entered promiscuous mode
Dec 05 12:04:17 compute-0 NetworkManager[55691]: <info>  [1764936257.1284] manager: (tap539a9707-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Dec 05 12:04:17 compute-0 ovn_controller[95610]: 2025-12-05T12:04:17Z|00332|binding|INFO|Claiming lport 539a9707-ef82-4c64-aec4-3759222680f0 for this chassis.
Dec 05 12:04:17 compute-0 ovn_controller[95610]: 2025-12-05T12:04:17Z|00333|binding|INFO|539a9707-ef82-4c64-aec4-3759222680f0: Claiming fa:16:3e:d2:c8:06 10.100.0.4
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.139 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.148 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:c8:06 10.100.0.4'], port_security=['fa:16:3e:d2:c8:06 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a7616662-639b-4642-b507-614773f4748f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d064000-316c-46a7-a23c-1dc26318b6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79895287bd1d488c842f6013729a1f81', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e1ec2415-6840-4cf9-b5ac-efaf1a9c9a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3804b014-203a-4c47-b0bb-7634579c4ec4, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=539a9707-ef82-4c64-aec4-3759222680f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.149 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 539a9707-ef82-4c64-aec4-3759222680f0 in datapath 5d064000-316c-46a7-a23c-1dc26318b6a4 bound to our chassis
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.151 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5d064000-316c-46a7-a23c-1dc26318b6a4
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.154 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.155 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.155 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:58:21:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.156 187212 INFO nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Using config drive
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.165 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1045f622-c6e5-4686-8e90-4ca1b10a6a45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.166 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5d064000-31 in ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:04:17 compute-0 systemd-machined[153543]: New machine qemu-45-instance-0000002c.
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.169 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5d064000-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.169 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[13f822b2-1798-4604-97f8-68b575def2b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.171 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f9429717-693f-4baf-8373-94f9464c46e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.186 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8819262e-8cd9-402a-ad33-33294532ed67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 systemd[1]: Started Virtual Machine qemu-45-instance-0000002c.
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 ovn_controller[95610]: 2025-12-05T12:04:17Z|00334|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.214 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[803881da-3756-455a-97d6-7e94dc46761b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_controller[95610]: 2025-12-05T12:04:17Z|00335|binding|INFO|Setting lport 539a9707-ef82-4c64-aec4-3759222680f0 ovn-installed in OVS
Dec 05 12:04:17 compute-0 ovn_controller[95610]: 2025-12-05T12:04:17Z|00336|binding|INFO|Setting lport 539a9707-ef82-4c64-aec4-3759222680f0 up in Southbound
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.218 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 systemd-udevd[222287]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:17 compute-0 NetworkManager[55691]: <info>  [1764936257.2454] device (tap539a9707-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:17 compute-0 NetworkManager[55691]: <info>  [1764936257.2472] device (tap539a9707-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.259 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5c0fcf6a-d5d8-4344-a3c2-6a106e53fa4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.267 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4cdce7-d314-4f7e-825c-6b0a35dd9a6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 NetworkManager[55691]: <info>  [1764936257.2683] manager: (tap5d064000-30): new Veth device (/org/freedesktop/NetworkManager/Devices/142)
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.303 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[caa868d5-a762-42a6-ab02-46622d9447ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.307 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3d09cb8e-71bb-43e1-961d-d26e6e9c18d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 NetworkManager[55691]: <info>  [1764936257.3343] device (tap5d064000-30): carrier: link connected
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.339 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[57d6974c-24f3-489b-ad9a-9bc73ccf709a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.379 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[843fd080-89d6-4c83-8b19-d28c6f276f8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363789, 'reachable_time': 18329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222336, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 podman[222293]: 2025-12-05 12:04:17.389575062 +0000 UTC m=+0.112654979 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.403 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ee0d5421-3665-4f41-b79b-57db63c20782]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:6d24'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363789, 'tstamp': 363789}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222340, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.420 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e005e595-87b3-48b5-ba9c-c79674090791]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363789, 'reachable_time': 18329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222341, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.460 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[411ace59-c02e-4476-8ad2-d5725fb2a839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.527 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b90da461-ea69-40d3-9a64-63e042ece49f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.528 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d064000-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.529 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.529 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d064000-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 NetworkManager[55691]: <info>  [1764936257.5319] manager: (tap5d064000-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Dec 05 12:04:17 compute-0 kernel: tap5d064000-30: entered promiscuous mode
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.536 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.537 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5d064000-30, col_values=(('external_ids', {'iface-id': '1b49f23e-d835-4ef5-82b9-a339d97fd4cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:17 compute-0 ovn_controller[95610]: 2025-12-05T12:04:17Z|00337|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.554 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.555 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05cdc05b-8e25-40f6-814c-0194ff809ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.556 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-5d064000-316c-46a7-a23c-1dc26318b6a4
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 5d064000-316c-46a7-a23c-1dc26318b6a4
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:04:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.556 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'env', 'PROCESS_TAG=haproxy-5d064000-316c-46a7-a23c-1dc26318b6a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5d064000-316c-46a7-a23c-1dc26318b6a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.570 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.843 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936257.8427467, a7616662-639b-4642-b507-614773f4748f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.843 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] VM Started (Lifecycle Event)
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.862 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.868 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936257.8429296, a7616662-639b-4642-b507-614773f4748f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.868 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] VM Paused (Lifecycle Event)
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.890 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.893 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.923 187212 DEBUG oslo_concurrency.lockutils [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.923 187212 DEBUG oslo_concurrency.lockutils [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.924 187212 DEBUG nova.compute.manager [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.925 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.928 187212 DEBUG nova.compute.manager [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.929 187212 DEBUG nova.objects.instance [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'flavor' on Instance uuid c2e63727-b45b-4249-a94f-85b0d6314ba0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:17 compute-0 nova_compute[187208]: 2025-12-05 12:04:17.954 187212 DEBUG nova.virt.libvirt.driver [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:04:17 compute-0 podman[222383]: 2025-12-05 12:04:17.960629723 +0000 UTC m=+0.054991612 container create 4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:04:18 compute-0 systemd[1]: Started libpod-conmon-4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc.scope.
Dec 05 12:04:18 compute-0 podman[222383]: 2025-12-05 12:04:17.93218175 +0000 UTC m=+0.026543659 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:04:18 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:04:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95fa845c032e9a9b1ecabac8ab36f6e4863c974067db1dae8ab549ec5cbc0437/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:04:18 compute-0 podman[222383]: 2025-12-05 12:04:18.052703683 +0000 UTC m=+0.147065602 container init 4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 12:04:18 compute-0 podman[222383]: 2025-12-05 12:04:18.058011994 +0000 UTC m=+0.152373883 container start 4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:04:18 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[222398]: [NOTICE]   (222402) : New worker (222404) forked
Dec 05 12:04:18 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[222398]: [NOTICE]   (222402) : Loading success.
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.087 187212 INFO nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Creating config drive at /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.config
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.094 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3smp6vbu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.227 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3smp6vbu" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:18 compute-0 NetworkManager[55691]: <info>  [1764936258.3162] manager: (tapea8794b1-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Dec 05 12:04:18 compute-0 systemd-udevd[222315]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:18 compute-0 kernel: tapea8794b1-8d: entered promiscuous mode
Dec 05 12:04:18 compute-0 ovn_controller[95610]: 2025-12-05T12:04:18Z|00338|binding|INFO|Claiming lport ea8794b1-8d29-4839-af08-e1675802ea0a for this chassis.
Dec 05 12:04:18 compute-0 ovn_controller[95610]: 2025-12-05T12:04:18Z|00339|binding|INFO|ea8794b1-8d29-4839-af08-e1675802ea0a: Claiming fa:16:3e:58:21:a9 10.100.0.3
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.325 187212 DEBUG nova.network.neutron [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Updated VIF entry in instance network info cache for port 539a9707-ef82-4c64-aec4-3759222680f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.326 187212 DEBUG nova.network.neutron [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Updating instance_info_cache with network_info: [{"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.328 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.338 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:21:a9 10.100.0.3'], port_security=['fa:16:3e:58:21:a9 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ea8794b1-8d29-4839-af08-e1675802ea0a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.339 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ea8794b1-8d29-4839-af08-e1675802ea0a in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.341 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:04:18 compute-0 NetworkManager[55691]: <info>  [1764936258.3517] device (tapea8794b1-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.352 187212 DEBUG oslo_concurrency.lockutils [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:18 compute-0 NetworkManager[55691]: <info>  [1764936258.3528] device (tapea8794b1-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.353 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78b903e9-677f-45ed-9c70-4eea929bae0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.355 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.357 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.357 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3a0a84ab-915e-472f-b546-aa2334176199]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.358 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3ddf8e-2d06-4a26-8d85-06e0862e4443]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.370 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[07afddd6-74b9-45c4-83bf-f3ccbb12afc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 systemd-machined[153543]: New machine qemu-46-instance-00000028.
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.384 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:18 compute-0 ovn_controller[95610]: 2025-12-05T12:04:18Z|00340|binding|INFO|Setting lport ea8794b1-8d29-4839-af08-e1675802ea0a ovn-installed in OVS
Dec 05 12:04:18 compute-0 ovn_controller[95610]: 2025-12-05T12:04:18Z|00341|binding|INFO|Setting lport ea8794b1-8d29-4839-af08-e1675802ea0a up in Southbound
Dec 05 12:04:18 compute-0 systemd[1]: Started Virtual Machine qemu-46-instance-00000028.
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.391 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.395 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[492af5ad-a1f4-41ae-95ef-895f916a1f9d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.424 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f80a38-baa4-419a-9024-dcdc03449369]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 NetworkManager[55691]: <info>  [1764936258.4415] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/145)
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.438 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca439e8-46a0-4767-a22a-a6a4750c3e76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.482 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[199fb589-1e01-4848-8a3d-117e3a6d53b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.486 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa200c9-f563-4f7f-9f4b-0c17355f15fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 NetworkManager[55691]: <info>  [1764936258.5147] device (tap41b3b495-c0): carrier: link connected
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.521 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[02f46216-d081-4f39-b282-4c0066a0bf0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.538 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c5dde2a3-ead2-4c10-bf95-218496486dfe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363907, 'reachable_time': 21906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222448, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.552 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d222c94-9013-47f6-857c-7d361542acb0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363907, 'tstamp': 363907}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222449, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.573 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[87bb2706-19be-4844-8276-47c15252cffe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363907, 'reachable_time': 21906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222450, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.610 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[194482ee-da4e-4c75-9a82-4aa3a5136f59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.685 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf57426-e8ac-4f5c-a7d5-28ae7e9c26ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.687 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.687 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.688 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.690 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:18 compute-0 NetworkManager[55691]: <info>  [1764936258.6908] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Dec 05 12:04:18 compute-0 kernel: tap41b3b495-c0: entered promiscuous mode
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.692 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.693 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:18 compute-0 ovn_controller[95610]: 2025-12-05T12:04:18Z|00342|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.706 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.709 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.710 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d667d97-356b-4262-83cb-f85e9cf5fd7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.711 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:04:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.711 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.880 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936258.8794317, e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.880 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] VM Started (Lifecycle Event)
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.903 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.907 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936258.8799453, e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.907 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] VM Paused (Lifecycle Event)
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.925 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.929 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:18 compute-0 nova_compute[187208]: 2025-12-05 12:04:18.947 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:19 compute-0 podman[222489]: 2025-12-05 12:04:19.110138817 +0000 UTC m=+0.062321171 container create c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:04:19 compute-0 systemd[1]: Started libpod-conmon-c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589.scope.
Dec 05 12:04:19 compute-0 podman[222489]: 2025-12-05 12:04:19.07280381 +0000 UTC m=+0.024986154 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:04:19 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:04:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b7c90b66f75d352f77a65a6e2c60491d5ded526586a774fd4cd500b1acf38ae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:04:19 compute-0 podman[222489]: 2025-12-05 12:04:19.200466087 +0000 UTC m=+0.152648431 container init c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:04:19 compute-0 podman[222489]: 2025-12-05 12:04:19.205705967 +0000 UTC m=+0.157888291 container start c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 05 12:04:19 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[222505]: [NOTICE]   (222509) : New worker (222511) forked
Dec 05 12:04:19 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[222505]: [NOTICE]   (222509) : Loading success.
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.153 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Successfully updated port: d7b765ff-93e1-4594-9e3c-e177dee2e07b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.172 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "refresh_cache-082d2145-1505-4170-9a11-4e46bf86fed2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.172 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquired lock "refresh_cache-082d2145-1505-4170-9a11-4e46bf86fed2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.173 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.178 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Successfully updated port: eabadaa6-16c4-434c-83ea-96dfa62d7f79 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.195 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "refresh_cache-58c3288f-57bf-4c62-8d69-9842a22e43d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.195 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquired lock "refresh_cache-58c3288f-57bf-4c62-8d69-9842a22e43d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.196 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.440 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "004672c5-70e2-4940-bc9c-8971d94cc037" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.441 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.464 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.572 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.573 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.579 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.579 187212 INFO nova.compute.claims [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.623 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.627 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.794 187212 DEBUG nova.compute.provider_tree [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.810 187212 DEBUG nova.scheduler.client.report [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.832 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.832 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.876 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.876 187212 DEBUG nova.network.neutron [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.892 187212 INFO nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:20 compute-0 nova_compute[187208]: 2025-12-05 12:04:20.907 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.009 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.010 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.010 187212 INFO nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Creating image(s)
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.010 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.011 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.011 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.024 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.082 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.084 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.085 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.095 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.157 187212 DEBUG nova.compute.manager [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received event network-changed-d7b765ff-93e1-4594-9e3c-e177dee2e07b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.158 187212 DEBUG nova.compute.manager [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Refreshing instance network info cache due to event network-changed-d7b765ff-93e1-4594-9e3c-e177dee2e07b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.158 187212 DEBUG oslo_concurrency.lockutils [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-082d2145-1505-4170-9a11-4e46bf86fed2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.160 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.160 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.210 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.211 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.212 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.231 187212 DEBUG nova.network.neutron [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Updated VIF entry in instance network info cache for port ea8794b1-8d29-4839-af08-e1675802ea0a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.232 187212 DEBUG nova.network.neutron [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Updating instance_info_cache with network_info: [{"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.249 187212 DEBUG oslo_concurrency.lockutils [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.268 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.269 187212 DEBUG nova.virt.disk.api [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Checking if we can resize image /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.269 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.287 187212 DEBUG nova.network.neutron [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.287 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.324 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.324 187212 DEBUG nova.virt.disk.api [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Cannot resize image /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.325 187212 DEBUG nova.objects.instance [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'migration_context' on Instance uuid 004672c5-70e2-4940-bc9c-8971d94cc037 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.336 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.337 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Ensure instance console log exists: /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.337 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.338 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.338 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.339 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.345 187212 WARNING nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.350 187212 DEBUG nova.virt.libvirt.host [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.351 187212 DEBUG nova.virt.libvirt.host [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.354 187212 DEBUG nova.virt.libvirt.host [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.354 187212 DEBUG nova.virt.libvirt.host [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.355 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.355 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.355 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.356 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.356 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.356 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.356 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.357 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.357 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.357 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.358 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.358 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.362 187212 DEBUG nova.objects.instance [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'pci_devices' on Instance uuid 004672c5-70e2-4940-bc9c-8971d94cc037 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.375 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <uuid>004672c5-70e2-4940-bc9c-8971d94cc037</uuid>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <name>instance-0000002d</name>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <nova:name>tempest-ListImageFiltersTestJSON-server-469388429</nova:name>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:21</nova:creationTime>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:21 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:21 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:21 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:21 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:21 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:21 compute-0 nova_compute[187208]:         <nova:user uuid="8456efa356654e5c990efa4aef688e8a">tempest-ListImageFiltersTestJSON-277323355-project-member</nova:user>
Dec 05 12:04:21 compute-0 nova_compute[187208]:         <nova:project uuid="42d9566206cb469ebd803d0600019533">tempest-ListImageFiltersTestJSON-277323355</nova:project>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <entry name="serial">004672c5-70e2-4940-bc9c-8971d94cc037</entry>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <entry name="uuid">004672c5-70e2-4940-bc9c-8971d94cc037</entry>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.config"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/console.log" append="off"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:21 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:21 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:21 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:21 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:21 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.429 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.431 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.432 187212 INFO nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Using config drive
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.656 187212 INFO nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Creating config drive at /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.config
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.661 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzsr4fpwc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:21 compute-0 nova_compute[187208]: 2025-12-05 12:04:21.791 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzsr4fpwc" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:21 compute-0 systemd-machined[153543]: New machine qemu-47-instance-0000002d.
Dec 05 12:04:21 compute-0 systemd[1]: Started Virtual Machine qemu-47-instance-0000002d.
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.086 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.212 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936262.2122447, 004672c5-70e2-4940-bc9c-8971d94cc037 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.213 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] VM Resumed (Lifecycle Event)
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.217 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.217 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.222 187212 INFO nova.virt.libvirt.driver [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance spawned successfully.
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.222 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.246 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.253 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.257 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.258 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.258 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.259 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.259 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.260 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.290 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.291 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936262.2148862, 004672c5-70e2-4940-bc9c-8971d94cc037 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.292 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] VM Started (Lifecycle Event)
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.318 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.322 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.337 187212 INFO nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Took 1.33 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.338 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:22 compute-0 rsyslogd[1004]: imjournal from <np0005546909:nova_compute>: begin to drop messages due to rate-limiting
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.347 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.396 187212 INFO nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Took 1.85 seconds to build instance.
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.415 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 1.974s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.574 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.582 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "f50947f2-f8d0-4d6b-bca4-b5412a206503" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.582 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.601 187212 DEBUG nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.693 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.694 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.702 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.703 187212 INFO nova.compute.claims [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.724 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Updating instance_info_cache with network_info: [{"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.748 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Updating instance_info_cache with network_info: [{"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.772 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Releasing lock "refresh_cache-082d2145-1505-4170-9a11-4e46bf86fed2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.773 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Instance network_info: |[{"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.774 187212 DEBUG oslo_concurrency.lockutils [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-082d2145-1505-4170-9a11-4e46bf86fed2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.775 187212 DEBUG nova.network.neutron [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Refreshing network info cache for port d7b765ff-93e1-4594-9e3c-e177dee2e07b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.778 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Start _get_guest_xml network_info=[{"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.782 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Releasing lock "refresh_cache-58c3288f-57bf-4c62-8d69-9842a22e43d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.782 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Instance network_info: |[{"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.786 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Start _get_guest_xml network_info=[{"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.792 187212 WARNING nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.795 187212 WARNING nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.808 187212 DEBUG nova.virt.libvirt.host [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.809 187212 DEBUG nova.virt.libvirt.host [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.810 187212 DEBUG nova.virt.libvirt.host [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.810 187212 DEBUG nova.virt.libvirt.host [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.818 187212 DEBUG nova.virt.libvirt.host [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.820 187212 DEBUG nova.virt.libvirt.host [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.820 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.821 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.821 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.822 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.822 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.822 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.823 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.823 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.823 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.823 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.824 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.824 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.828 187212 DEBUG nova.virt.libvirt.vif [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-330967889',display_name='tempest-MultipleCreateTestJSON-server-330967889-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-330967889-1',id=42,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-epte65hh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTes
tJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:10Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=082d2145-1505-4170-9a11-4e46bf86fed2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.829 187212 DEBUG nova.network.os_vif_util [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.830 187212 DEBUG nova.network.os_vif_util [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7b765ff-93e1-4594-9e3c-e177dee2e07b,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7b765ff-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.832 187212 DEBUG nova.objects.instance [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'pci_devices' on Instance uuid 082d2145-1505-4170-9a11-4e46bf86fed2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.833 187212 DEBUG nova.virt.libvirt.host [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.834 187212 DEBUG nova.virt.libvirt.host [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.834 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.834 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.835 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.835 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.835 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.836 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.836 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.836 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.837 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.837 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.837 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.838 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.842 187212 DEBUG nova.virt.libvirt.vif [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-330967889',display_name='tempest-MultipleCreateTestJSON-server-330967889-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-330967889-2',id=43,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-epte65hh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTes
tJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:10Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=58c3288f-57bf-4c62-8d69-9842a22e43d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.843 187212 DEBUG nova.network.os_vif_util [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.844 187212 DEBUG nova.network.os_vif_util [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c8:f9,bridge_name='br-int',has_traffic_filtering=True,id=eabadaa6-16c4-434c-83ea-96dfa62d7f79,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeabadaa6-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.845 187212 DEBUG nova.objects.instance [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'pci_devices' on Instance uuid 58c3288f-57bf-4c62-8d69-9842a22e43d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.864 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <uuid>082d2145-1505-4170-9a11-4e46bf86fed2</uuid>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <name>instance-0000002a</name>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:name>tempest-MultipleCreateTestJSON-server-330967889-1</nova:name>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:22</nova:creationTime>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:user uuid="40620135b1ff4f8d9d80eb79f51fd593">tempest-MultipleCreateTestJSON-1941206426-project-member</nova:user>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:project uuid="bebbbd9623064681bb9350747fba600e">tempest-MultipleCreateTestJSON-1941206426</nova:project>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:port uuid="d7b765ff-93e1-4594-9e3c-e177dee2e07b">
Dec 05 12:04:22 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="serial">082d2145-1505-4170-9a11-4e46bf86fed2</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="uuid">082d2145-1505-4170-9a11-4e46bf86fed2</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.config"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:92:a3:7b"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <target dev="tapd7b765ff-93"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/console.log" append="off"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:22 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:22 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.871 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Preparing to wait for external event network-vif-plugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.872 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.872 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.872 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.873 187212 DEBUG nova.virt.libvirt.vif [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-330967889',display_name='tempest-MultipleCreateTestJSON-server-330967889-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-330967889-1',id=42,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-epte65hh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-Multipl
eCreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:10Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=082d2145-1505-4170-9a11-4e46bf86fed2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.874 187212 DEBUG nova.network.os_vif_util [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.875 187212 DEBUG nova.network.os_vif_util [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7b765ff-93e1-4594-9e3c-e177dee2e07b,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7b765ff-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.876 187212 DEBUG os_vif [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7b765ff-93e1-4594-9e3c-e177dee2e07b,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7b765ff-93') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.877 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.877 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.878 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.885 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <uuid>58c3288f-57bf-4c62-8d69-9842a22e43d6</uuid>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <name>instance-0000002b</name>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:name>tempest-MultipleCreateTestJSON-server-330967889-2</nova:name>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:22</nova:creationTime>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:user uuid="40620135b1ff4f8d9d80eb79f51fd593">tempest-MultipleCreateTestJSON-1941206426-project-member</nova:user>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:project uuid="bebbbd9623064681bb9350747fba600e">tempest-MultipleCreateTestJSON-1941206426</nova:project>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         <nova:port uuid="eabadaa6-16c4-434c-83ea-96dfa62d7f79">
Dec 05 12:04:22 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="serial">58c3288f-57bf-4c62-8d69-9842a22e43d6</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="uuid">58c3288f-57bf-4c62-8d69-9842a22e43d6</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.config"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:1b:c8:f9"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <target dev="tapeabadaa6-16"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/console.log" append="off"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:22 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:22 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:22 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:22 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:22 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.891 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Preparing to wait for external event network-vif-plugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.892 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.892 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.893 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.894 187212 DEBUG nova.virt.libvirt.vif [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-330967889',display_name='tempest-MultipleCreateTestJSON-server-330967889-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-330967889-2',id=43,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-epte65hh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-Multipl
eCreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:10Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=58c3288f-57bf-4c62-8d69-9842a22e43d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.895 187212 DEBUG nova.network.os_vif_util [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.896 187212 DEBUG nova.network.os_vif_util [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c8:f9,bridge_name='br-int',has_traffic_filtering=True,id=eabadaa6-16c4-434c-83ea-96dfa62d7f79,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeabadaa6-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.897 187212 DEBUG os_vif [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c8:f9,bridge_name='br-int',has_traffic_filtering=True,id=eabadaa6-16c4-434c-83ea-96dfa62d7f79,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeabadaa6-16') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.899 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.899 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.900 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.904 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7b765ff-93, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.905 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd7b765ff-93, col_values=(('external_ids', {'iface-id': 'd7b765ff-93e1-4594-9e3c-e177dee2e07b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:a3:7b', 'vm-uuid': '082d2145-1505-4170-9a11-4e46bf86fed2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:22 compute-0 NetworkManager[55691]: <info>  [1764936262.9076] manager: (tapd7b765ff-93): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.906 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.910 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.922 187212 INFO os_vif [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7b765ff-93e1-4594-9e3c-e177dee2e07b,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7b765ff-93')
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.923 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeabadaa6-16, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.924 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeabadaa6-16, col_values=(('external_ids', {'iface-id': 'eabadaa6-16c4-434c-83ea-96dfa62d7f79', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:c8:f9', 'vm-uuid': '58c3288f-57bf-4c62-8d69-9842a22e43d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:22 compute-0 NetworkManager[55691]: <info>  [1764936262.9264] manager: (tapeabadaa6-16): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.925 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.928 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.933 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:22 compute-0 nova_compute[187208]: 2025-12-05 12:04:22.934 187212 INFO os_vif [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c8:f9,bridge_name='br-int',has_traffic_filtering=True,id=eabadaa6-16c4-434c-83ea-96dfa62d7f79,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeabadaa6-16')
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.017 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.017 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.018 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No VIF found with MAC fa:16:3e:92:a3:7b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.018 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Using config drive
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.022 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.022 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.022 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No VIF found with MAC fa:16:3e:1b:c8:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.023 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Using config drive
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.029 187212 DEBUG nova.compute.provider_tree [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.061 187212 DEBUG nova.scheduler.client.report [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.086 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.087 187212 DEBUG nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.134 187212 DEBUG nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.134 187212 DEBUG nova.network.neutron [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.160 187212 INFO nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.181 187212 DEBUG nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.281 187212 DEBUG nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.282 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.283 187212 INFO nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Creating image(s)
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.283 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "/var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.284 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "/var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.284 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "/var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.296 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.359 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.360 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.361 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.372 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.472 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.473 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.503 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Creating config drive at /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.config
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.509 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnozteyvi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.534 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk 1073741824" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.536 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.536 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.611 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.613 187212 DEBUG nova.virt.disk.api [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Checking if we can resize image /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.613 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.640 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnozteyvi" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.646 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Creating config drive at /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.config
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.650 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjk4rxkqs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.684 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.685 187212 DEBUG nova.virt.disk.api [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Cannot resize image /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.686 187212 DEBUG nova.objects.instance [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'migration_context' on Instance uuid f50947f2-f8d0-4d6b-bca4-b5412a206503 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.698 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.698 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Ensure instance console log exists: /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.699 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.699 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.699 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:23 compute-0 kernel: tapd7b765ff-93: entered promiscuous mode
Dec 05 12:04:23 compute-0 NetworkManager[55691]: <info>  [1764936263.7177] manager: (tapd7b765ff-93): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Dec 05 12:04:23 compute-0 systemd-udevd[222561]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.724 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00343|binding|INFO|Claiming lport d7b765ff-93e1-4594-9e3c-e177dee2e07b for this chassis.
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00344|binding|INFO|d7b765ff-93e1-4594-9e3c-e177dee2e07b: Claiming fa:16:3e:92:a3:7b 10.100.0.12
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.731 187212 DEBUG nova.network.neutron [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.731 187212 DEBUG nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:23 compute-0 NetworkManager[55691]: <info>  [1764936263.7329] device (tapd7b765ff-93): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.732 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:23 compute-0 NetworkManager[55691]: <info>  [1764936263.7338] device (tapd7b765ff-93): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.735 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.737 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:a3:7b 10.100.0.12'], port_security=['fa:16:3e:92:a3:7b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '082d2145-1505-4170-9a11-4e46bf86fed2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d7b765ff-93e1-4594-9e3c-e177dee2e07b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.739 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d7b765ff-93e1-4594-9e3c-e177dee2e07b in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 bound to our chassis
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.741 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.756 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bd70a5c7-46be-483d-b75f-366b8e159c8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.757 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8ea1ed6-91 in ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.762 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8ea1ed6-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.762 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[56e47e8c-f4bc-4206-a51f-23754600c01d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.763 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[205e1f38-20c7-499c-bce7-129cb06c9871]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.763 187212 WARNING nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.769 187212 DEBUG nova.virt.libvirt.host [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.770 187212 DEBUG nova.virt.libvirt.host [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:23 compute-0 systemd-machined[153543]: New machine qemu-48-instance-0000002a.
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.777 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjk4rxkqs" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:23 compute-0 systemd[1]: Started Virtual Machine qemu-48-instance-0000002a.
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.780 187212 DEBUG nova.virt.libvirt.host [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.780 187212 DEBUG nova.virt.libvirt.host [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.781 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.781 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.781 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.781 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.782 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.782 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.782 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.782 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.782 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.782 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.783 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a2fdd026-f7b8-4977-ba67-aeece266acbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.787 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.788 187212 DEBUG nova.virt.hardware [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.792 187212 DEBUG nova.objects.instance [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'pci_devices' on Instance uuid f50947f2-f8d0-4d6b-bca4-b5412a206503 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.797 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2b18a07c-f9f2-4c97-ae0b-2c7b47d7a6d2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00345|binding|INFO|Setting lport d7b765ff-93e1-4594-9e3c-e177dee2e07b ovn-installed in OVS
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00346|binding|INFO|Setting lport d7b765ff-93e1-4594-9e3c-e177dee2e07b up in Southbound
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.802 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.810 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <uuid>f50947f2-f8d0-4d6b-bca4-b5412a206503</uuid>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <name>instance-0000002e</name>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <nova:name>tempest-ListImageFiltersTestJSON-server-1806870616</nova:name>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:23</nova:creationTime>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:23 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:23 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:23 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:23 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:23 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:23 compute-0 podman[222596]: 2025-12-05 12:04:23.810745627 +0000 UTC m=+0.110185828 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 12:04:23 compute-0 nova_compute[187208]:         <nova:user uuid="8456efa356654e5c990efa4aef688e8a">tempest-ListImageFiltersTestJSON-277323355-project-member</nova:user>
Dec 05 12:04:23 compute-0 nova_compute[187208]:         <nova:project uuid="42d9566206cb469ebd803d0600019533">tempest-ListImageFiltersTestJSON-277323355</nova:project>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <entry name="serial">f50947f2-f8d0-4d6b-bca4-b5412a206503</entry>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <entry name="uuid">f50947f2-f8d0-4d6b-bca4-b5412a206503</entry>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.config"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/console.log" append="off"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:23 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:23 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:23 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:23 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:23 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:23 compute-0 podman[222597]: 2025-12-05 12:04:23.81466616 +0000 UTC m=+0.113959417 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.843 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[48ecb425-a92a-4dcf-90b4-ccc8fbc701e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.849 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8a9afe81-fbb3-46d9-82bf-3f0f079fbb20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 NetworkManager[55691]: <info>  [1764936263.8505] manager: (tapb8ea1ed6-90): new Veth device (/org/freedesktop/NetworkManager/Devices/150)
Dec 05 12:04:23 compute-0 systemd-udevd[222622]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:23 compute-0 kernel: tapeabadaa6-16: entered promiscuous mode
Dec 05 12:04:23 compute-0 NetworkManager[55691]: <info>  [1764936263.8699] manager: (tapeabadaa6-16): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00347|binding|INFO|Claiming lport eabadaa6-16c4-434c-83ea-96dfa62d7f79 for this chassis.
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00348|binding|INFO|eabadaa6-16c4-434c-83ea-96dfa62d7f79: Claiming fa:16:3e:1b:c8:f9 10.100.0.13
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.875 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:23 compute-0 NetworkManager[55691]: <info>  [1764936263.8835] device (tapeabadaa6-16): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:23 compute-0 NetworkManager[55691]: <info>  [1764936263.8845] device (tapeabadaa6-16): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.888 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c8:f9 10.100.0.13'], port_security=['fa:16:3e:1b:c8:f9 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '58c3288f-57bf-4c62-8d69-9842a22e43d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=eabadaa6-16c4-434c-83ea-96dfa62d7f79) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00349|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00350|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00351|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.899 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.899 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.900 187212 INFO nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Using config drive
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00352|binding|INFO|Setting lport eabadaa6-16c4-434c-83ea-96dfa62d7f79 ovn-installed in OVS
Dec 05 12:04:23 compute-0 ovn_controller[95610]: 2025-12-05T12:04:23Z|00353|binding|INFO|Setting lport eabadaa6-16c4-434c-83ea-96dfa62d7f79 up in Southbound
Dec 05 12:04:23 compute-0 nova_compute[187208]: 2025-12-05 12:04:23.905 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.912 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6a2163-44ed-402f-b1d5-f65a27433b4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.919 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b681ebc0-9e20-4021-b86b-aa58808879f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 NetworkManager[55691]: <info>  [1764936263.9447] device (tapb8ea1ed6-90): carrier: link connected
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.952 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf169c1-7b54-4576-9064-d7bcf1b92cb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 systemd-machined[153543]: New machine qemu-49-instance-0000002b.
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.969 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa6f7cd-9b3c-4fe4-849f-9d95f30b486a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364450, 'reachable_time': 17301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222690, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:23 compute-0 systemd[1]: Started Virtual Machine qemu-49-instance-0000002b.
Dec 05 12:04:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:23.991 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5e4e12-45b8-46dd-ba20-af5e5e8a78d2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:fb51'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364450, 'tstamp': 364450}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222691, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.021 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a43bf42e-8580-45c9-8ef2-27bfb25866fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364450, 'reachable_time': 17301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222693, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.054 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cf45f121-2b8b-4bb3-8c30-63eeb41c1f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.096 187212 DEBUG nova.compute.manager [req-613ee0d4-ec0a-4ad0-96ec-d85fc227da11 req-f07521d2-ce14-4ede-8d03-94a418b485b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received event network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.096 187212 DEBUG oslo_concurrency.lockutils [req-613ee0d4-ec0a-4ad0-96ec-d85fc227da11 req-f07521d2-ce14-4ede-8d03-94a418b485b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.097 187212 DEBUG oslo_concurrency.lockutils [req-613ee0d4-ec0a-4ad0-96ec-d85fc227da11 req-f07521d2-ce14-4ede-8d03-94a418b485b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.097 187212 DEBUG oslo_concurrency.lockutils [req-613ee0d4-ec0a-4ad0-96ec-d85fc227da11 req-f07521d2-ce14-4ede-8d03-94a418b485b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.097 187212 DEBUG nova.compute.manager [req-613ee0d4-ec0a-4ad0-96ec-d85fc227da11 req-f07521d2-ce14-4ede-8d03-94a418b485b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Processing event network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.098 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.102 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936264.1023188, e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.102 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] VM Resumed (Lifecycle Event)
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.105 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.119 187212 INFO nova.virt.libvirt.driver [-] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Instance spawned successfully.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.120 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.124 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.127 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4f5630-96cf-4e54-9f6a-7bbca61e237e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.129 187212 INFO nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Creating config drive at /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.config
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.132 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.133 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.133 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ea1ed6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.134 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5stg35h7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:24 compute-0 NetworkManager[55691]: <info>  [1764936264.1446] manager: (tapb8ea1ed6-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Dec 05 12:04:24 compute-0 kernel: tapb8ea1ed6-90: entered promiscuous mode
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.163 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ea1ed6-90, col_values=(('external_ids', {'iface-id': '6f012c31-72e4-4df5-be68-787aa910fb9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:24 compute-0 ovn_controller[95610]: 2025-12-05T12:04:24Z|00354|binding|INFO|Releasing lport 6f012c31-72e4-4df5-be68-787aa910fb9c from this chassis (sb_readonly=0)
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.177 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.174 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.181 187212 DEBUG nova.compute.manager [req-e184fbbf-767f-4801-93f0-ce000ec045fe req-9c3a9b17-235c-444d-8e38-2198d705c055 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Received event network-changed-eabadaa6-16c4-434c-83ea-96dfa62d7f79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.181 187212 DEBUG nova.compute.manager [req-e184fbbf-767f-4801-93f0-ce000ec045fe req-9c3a9b17-235c-444d-8e38-2198d705c055 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Refreshing instance network info cache due to event network-changed-eabadaa6-16c4-434c-83ea-96dfa62d7f79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.182 187212 DEBUG oslo_concurrency.lockutils [req-e184fbbf-767f-4801-93f0-ce000ec045fe req-9c3a9b17-235c-444d-8e38-2198d705c055 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-58c3288f-57bf-4c62-8d69-9842a22e43d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.182 187212 DEBUG oslo_concurrency.lockutils [req-e184fbbf-767f-4801-93f0-ce000ec045fe req-9c3a9b17-235c-444d-8e38-2198d705c055 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-58c3288f-57bf-4c62-8d69-9842a22e43d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.182 187212 DEBUG nova.network.neutron [req-e184fbbf-767f-4801-93f0-ce000ec045fe req-9c3a9b17-235c-444d-8e38-2198d705c055 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Refreshing network info cache for port eabadaa6-16c4-434c-83ea-96dfa62d7f79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.183 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.181 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ae57c84d-3b7a-4bc2-9416-8cd7f6b53ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.184 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.pid.haproxy
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:04:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:24.186 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'env', 'PROCESS_TAG=haproxy-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.192 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.193 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.193 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.194 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.194 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.194 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.225 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.270 187212 INFO nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Took 18.98 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.270 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.280 187212 DEBUG oslo_concurrency.processutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5stg35h7" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.359 187212 INFO nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Took 19.41 seconds to build instance.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.372 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936264.3722613, 58c3288f-57bf-4c62-8d69-9842a22e43d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.373 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] VM Started (Lifecycle Event)
Dec 05 12:04:24 compute-0 systemd-machined[153543]: New machine qemu-50-instance-0000002e.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.376 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:24 compute-0 systemd[1]: Started Virtual Machine qemu-50-instance-0000002e.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.388 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.394 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936264.373241, 58c3288f-57bf-4c62-8d69-9842a22e43d6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.394 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] VM Paused (Lifecycle Event)
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.418 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.441 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.482 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.568 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936264.5685732, 082d2145-1505-4170-9a11-4e46bf86fed2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.569 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] VM Started (Lifecycle Event)
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.599 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.604 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936264.5686603, 082d2145-1505-4170-9a11-4e46bf86fed2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.604 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] VM Paused (Lifecycle Event)
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.622 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.625 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.643 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:24 compute-0 podman[222763]: 2025-12-05 12:04:24.587791275 +0000 UTC m=+0.026223935 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.816 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936264.8087368, f50947f2-f8d0-4d6b-bca4-b5412a206503 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.816 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] VM Resumed (Lifecycle Event)
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.818 187212 DEBUG nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.818 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.821 187212 INFO nova.virt.libvirt.driver [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Instance spawned successfully.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.821 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.842 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.846 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.846 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.846 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.847 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.847 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.848 187212 DEBUG nova.virt.libvirt.driver [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.852 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.875 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.875 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936264.8094037, f50947f2-f8d0-4d6b-bca4-b5412a206503 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.875 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] VM Started (Lifecycle Event)
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.903 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.906 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.929 187212 INFO nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Took 1.65 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.929 187212 DEBUG nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.930 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:24 compute-0 podman[222763]: 2025-12-05 12:04:24.979348182 +0000 UTC m=+0.417780822 container create c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.983 187212 INFO nova.compute.manager [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Took 2.32 seconds to build instance.
Dec 05 12:04:24 compute-0 nova_compute[187208]: 2025-12-05 12:04:24.996 187212 DEBUG oslo_concurrency.lockutils [None req-6f7dd785-4a06-46cd-9776-7f875cd7dcfa 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.414s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:25 compute-0 systemd[1]: Started libpod-conmon-c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec.scope.
Dec 05 12:04:25 compute-0 nova_compute[187208]: 2025-12-05 12:04:25.025 187212 DEBUG nova.network.neutron [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Updated VIF entry in instance network info cache for port d7b765ff-93e1-4594-9e3c-e177dee2e07b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:25 compute-0 nova_compute[187208]: 2025-12-05 12:04:25.025 187212 DEBUG nova.network.neutron [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Updating instance_info_cache with network_info: [{"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:25 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:04:25 compute-0 nova_compute[187208]: 2025-12-05 12:04:25.039 187212 DEBUG oslo_concurrency.lockutils [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-082d2145-1505-4170-9a11-4e46bf86fed2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f5a032348eaf428a0c9b64652c19c912ddd20a0b5d38a19d66425a40143449/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:04:25 compute-0 podman[222763]: 2025-12-05 12:04:25.074886068 +0000 UTC m=+0.513318738 container init c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:04:25 compute-0 podman[222763]: 2025-12-05 12:04:25.082725164 +0000 UTC m=+0.521157804 container start c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 12:04:25 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[222784]: [NOTICE]   (222788) : New worker (222790) forked
Dec 05 12:04:25 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[222784]: [NOTICE]   (222788) : Loading success.
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.147 104471 INFO neutron.agent.ovn.metadata.agent [-] Port eabadaa6-16c4-434c-83ea-96dfa62d7f79 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.150 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.168 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3e8a80-0e62-4574-ba37-1ae330a1838a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.220 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[36d0364c-ebfd-4b99-b9b8-2f1a294a0980]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.224 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4555bb-6c2e-4290-9b84-912ea0d92884]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.277 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[da7361bb-61e5-47e0-9d92-edf7938d1a59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.296 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[814633f9-7ca9-4d11-bceb-1805da4f6b15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364450, 'reachable_time': 17301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222804, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.313 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[46ddd545-32de-4466-9c3f-ff1885c89426]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364464, 'tstamp': 364464}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222806, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364467, 'tstamp': 364467}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222806, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.317 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:25 compute-0 nova_compute[187208]: 2025-12-05 12:04:25.320 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.322 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ea1ed6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.323 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.323 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ea1ed6-90, col_values=(('external_ids', {'iface-id': '6f012c31-72e4-4df5-be68-787aa910fb9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:25.323 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:25 compute-0 nova_compute[187208]: 2025-12-05 12:04:25.321 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:26 compute-0 nova_compute[187208]: 2025-12-05 12:04:26.194 187212 DEBUG nova.network.neutron [req-e184fbbf-767f-4801-93f0-ce000ec045fe req-9c3a9b17-235c-444d-8e38-2198d705c055 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Updated VIF entry in instance network info cache for port eabadaa6-16c4-434c-83ea-96dfa62d7f79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:26 compute-0 nova_compute[187208]: 2025-12-05 12:04:26.194 187212 DEBUG nova.network.neutron [req-e184fbbf-767f-4801-93f0-ce000ec045fe req-9c3a9b17-235c-444d-8e38-2198d705c055 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Updating instance_info_cache with network_info: [{"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:26 compute-0 nova_compute[187208]: 2025-12-05 12:04:26.213 187212 DEBUG oslo_concurrency.lockutils [req-e184fbbf-767f-4801-93f0-ce000ec045fe req-9c3a9b17-235c-444d-8e38-2198d705c055 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-58c3288f-57bf-4c62-8d69-9842a22e43d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:26 compute-0 ovn_controller[95610]: 2025-12-05T12:04:26Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:64:8d:59 10.100.0.4
Dec 05 12:04:26 compute-0 ovn_controller[95610]: 2025-12-05T12:04:26Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:64:8d:59 10.100.0.4
Dec 05 12:04:27 compute-0 nova_compute[187208]: 2025-12-05 12:04:27.575 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:27 compute-0 nova_compute[187208]: 2025-12-05 12:04:27.926 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.012 187212 DEBUG nova.virt.libvirt.driver [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.168 187212 DEBUG nova.compute.manager [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received event network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.168 187212 DEBUG oslo_concurrency.lockutils [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.168 187212 DEBUG oslo_concurrency.lockutils [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.169 187212 DEBUG oslo_concurrency.lockutils [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.169 187212 DEBUG nova.compute.manager [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] No waiting events found dispatching network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.169 187212 WARNING nova.compute.manager [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received unexpected event network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a for instance with vm_state active and task_state None.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.169 187212 DEBUG nova.compute.manager [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Received event network-vif-plugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.170 187212 DEBUG oslo_concurrency.lockutils [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.170 187212 DEBUG oslo_concurrency.lockutils [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.170 187212 DEBUG oslo_concurrency.lockutils [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.170 187212 DEBUG nova.compute.manager [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Processing event network-vif-plugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.171 187212 DEBUG nova.compute.manager [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Received event network-vif-plugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.171 187212 DEBUG oslo_concurrency.lockutils [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.171 187212 DEBUG oslo_concurrency.lockutils [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.171 187212 DEBUG oslo_concurrency.lockutils [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.172 187212 DEBUG nova.compute.manager [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] No waiting events found dispatching network-vif-plugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.172 187212 WARNING nova.compute.manager [req-3bc6b4c2-cc52-4fb3-a6a0-4b1fbe966c49 req-045ebcba-c05e-4651-933e-ae3f93de55ff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Received unexpected event network-vif-plugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 for instance with vm_state building and task_state spawning.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.173 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.176 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.177 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936268.1763675, 58c3288f-57bf-4c62-8d69-9842a22e43d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.177 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] VM Resumed (Lifecycle Event)
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.184 187212 INFO nova.virt.libvirt.driver [-] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Instance spawned successfully.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.185 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.223 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.227 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.227 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.227 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.228 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.228 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.228 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.232 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.250 187212 DEBUG nova.compute.manager [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received event network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.250 187212 DEBUG oslo_concurrency.lockutils [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.250 187212 DEBUG oslo_concurrency.lockutils [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.251 187212 DEBUG oslo_concurrency.lockutils [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.251 187212 DEBUG nova.compute.manager [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Processing event network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.251 187212 DEBUG nova.compute.manager [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received event network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.251 187212 DEBUG oslo_concurrency.lockutils [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.252 187212 DEBUG oslo_concurrency.lockutils [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.252 187212 DEBUG oslo_concurrency.lockutils [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.252 187212 DEBUG nova.compute.manager [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] No waiting events found dispatching network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.252 187212 WARNING nova.compute.manager [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received unexpected event network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 for instance with vm_state building and task_state spawning.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.253 187212 DEBUG nova.compute.manager [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received event network-vif-plugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.253 187212 DEBUG oslo_concurrency.lockutils [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.253 187212 DEBUG oslo_concurrency.lockutils [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.253 187212 DEBUG oslo_concurrency.lockutils [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.253 187212 DEBUG nova.compute.manager [req-9fdc3a20-0780-4346-8ab4-0e0f89015fb2 req-41f87974-6e2e-4960-be13-f697714acb46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Processing event network-vif-plugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.254 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.255 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.266 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.267 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.271 187212 INFO nova.virt.libvirt.driver [-] [instance: a7616662-639b-4642-b507-614773f4748f] Instance spawned successfully.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.272 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.275 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.275 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936268.2587397, 082d2145-1505-4170-9a11-4e46bf86fed2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.276 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] VM Resumed (Lifecycle Event)
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.278 187212 INFO nova.virt.libvirt.driver [-] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Instance spawned successfully.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.278 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.299 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.310 187212 INFO nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Took 17.92 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.311 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.313 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.313 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.314 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.314 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.315 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.315 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.323 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.323 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.324 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.324 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.324 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.325 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.452 187212 INFO nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Took 16.23 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.452 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.453 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.516 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.516 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936268.2661486, a7616662-639b-4642-b507-614773f4748f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.516 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] VM Resumed (Lifecycle Event)
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.573 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.577 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.595 187212 INFO nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Took 19.00 seconds to build instance.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.599 187212 INFO nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Took 18.52 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.600 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.614 187212 INFO nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Took 16.86 seconds to build instance.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.621 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.640 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.676 187212 INFO nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Took 19.16 seconds to build instance.
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.693 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.822 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.822 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.848 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.864 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.865 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.909 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.913 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.914 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.933 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.934 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.942 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.942 187212 INFO nova.compute.claims [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:28 compute-0 nova_compute[187208]: 2025-12-05 12:04:28.947 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.036 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.093 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.100 187212 DEBUG nova.compute.manager [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.203 187212 INFO nova.compute.manager [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] instance snapshotting
Dec 05 12:04:29 compute-0 podman[222821]: 2025-12-05 12:04:29.315724585 +0000 UTC m=+0.100976424 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:04:29 compute-0 podman[222822]: 2025-12-05 12:04:29.37087755 +0000 UTC m=+0.145993458 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.382 187212 DEBUG nova.compute.provider_tree [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.402 187212 DEBUG nova.scheduler.client.report [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.419 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.420 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.422 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.428 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.428 187212 INFO nova.compute.claims [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.462 187212 INFO nova.virt.libvirt.driver [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Beginning live snapshot process
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.501 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.502 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.546 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.576 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.631 187212 DEBUG nova.compute.manager [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.706 187212 INFO nova.compute.manager [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] instance snapshotting
Dec 05 12:04:29 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.711 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.886 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.888 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.888 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Creating image(s)
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.888 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "/var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.889 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "/var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.889 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "/var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.904 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json -f qcow2" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.905 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.976 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:29 compute-0 nova_compute[187208]: 2025-12-05 12:04:29.997 187212 DEBUG nova.policy [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd51d545246e0434591329e386f100a7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.028 187212 INFO nova.virt.libvirt.driver [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Beginning live snapshot process
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.067 187212 DEBUG nova.compute.provider_tree [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.073 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json -f qcow2" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.086 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.127 187212 DEBUG nova.scheduler.client.report [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.167 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.262s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.168 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.184 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.016s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:30 compute-0 kernel: tap656f63d2-77 (unregistering): left promiscuous mode
Dec 05 12:04:30 compute-0 NetworkManager[55691]: <info>  [1764936270.2302] device (tap656f63d2-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.234 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:30 compute-0 ovn_controller[95610]: 2025-12-05T12:04:30Z|00355|binding|INFO|Releasing lport 656f63d2-77f9-46f7-9338-81bc5a056ad4 from this chassis (sb_readonly=0)
Dec 05 12:04:30 compute-0 ovn_controller[95610]: 2025-12-05T12:04:30Z|00356|binding|INFO|Setting lport 656f63d2-77f9-46f7-9338-81bc5a056ad4 down in Southbound
Dec 05 12:04:30 compute-0 ovn_controller[95610]: 2025-12-05T12:04:30Z|00357|binding|INFO|Removing iface tap656f63d2-77 ovn-installed in OVS
Dec 05 12:04:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:30.245 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:8d:59 10.100.0.4'], port_security=['fa:16:3e:64:8d:59 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c2e63727-b45b-4249-a94f-85b0d6314ba0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=656f63d2-77f9-46f7-9338-81bc5a056ad4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:30.249 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 656f63d2-77f9-46f7-9338-81bc5a056ad4 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 unbound from our chassis
Dec 05 12:04:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:30.252 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:04:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:30.255 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac9f82a-7387-4fea-a983-84f17cda509f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:30.255 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace which is not needed anymore
Dec 05 12:04:30 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000029.scope: Deactivated successfully.
Dec 05 12:04:30 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000029.scope: Consumed 12.564s CPU time.
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.288 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.292 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.293 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:30 compute-0 systemd-machined[153543]: Machine qemu-44-instance-00000029 terminated.
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.301 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.215s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.301 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.303 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp14naatji/c88599ae9a0b4fedbdf344a0a7cd60ac.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.349 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.350 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.541 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.542 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.580 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.580 187212 INFO nova.compute.claims [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.583 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:30 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.588 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.648 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.723 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk 1073741824" returned: 0 in 0.373s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.724 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.724 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.751 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.757 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.758 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Creating image(s)
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.758 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "/var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.759 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "/var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.762 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "/var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.775 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json -f qcow2" returned: 0 in 0.187s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.775 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp14naatji/c88599ae9a0b4fedbdf344a0a7cd60ac.delta 1073741824" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.776 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.794 187212 INFO nova.virt.libvirt.driver [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.801 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:30 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [NOTICE]   (222247) : haproxy version is 2.8.14-c23fe91
Dec 05 12:04:30 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [NOTICE]   (222247) : path to executable is /usr/sbin/haproxy
Dec 05 12:04:30 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [WARNING]  (222247) : Exiting Master process...
Dec 05 12:04:30 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [WARNING]  (222247) : Exiting Master process...
Dec 05 12:04:30 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [ALERT]    (222247) : Current worker (222249) exited with code 143 (Terminated)
Dec 05 12:04:30 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [WARNING]  (222247) : All workers exited. Exiting... (0)
Dec 05 12:04:30 compute-0 systemd[1]: libpod-720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3.scope: Deactivated successfully.
Dec 05 12:04:30 compute-0 podman[222939]: 2025-12-05 12:04:30.820370881 +0000 UTC m=+0.147904713 container stop 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 12:04:30 compute-0 podman[222939]: 2025-12-05 12:04:30.851208237 +0000 UTC m=+0.178742089 container died 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:04:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3-userdata-shm.mount: Deactivated successfully.
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.888 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json -f qcow2" returned: 0 in 0.112s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-b1e3820f746877bfac04dbdf959f91cdf0863ffd037e2db606c73901d85cc260-merged.mount: Deactivated successfully.
Dec 05 12:04:30 compute-0 nova_compute[187208]: 2025-12-05 12:04:30.909 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:30 compute-0 podman[222939]: 2025-12-05 12:04:30.910996186 +0000 UTC m=+0.238530008 container cleanup 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.004 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.280s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.005 187212 DEBUG nova.virt.disk.api [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Checking if we can resize image /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.006 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 systemd[1]: libpod-conmon-720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3.scope: Deactivated successfully.
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.074 187212 DEBUG nova.policy [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd51d545246e0434591329e386f100a7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.081 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.281s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.082 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.083 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.120 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.153 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk --force-share --output=json" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.155 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.170 187212 DEBUG nova.virt.disk.api [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Cannot resize image /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.171 187212 DEBUG nova.objects.instance [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'migration_context' on Instance uuid 10048ac5-1fbc-45e6-aa94-01eff87b9ffc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.172 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpry6c0j_s/c44c52b464b54aae8193db6d3528683a.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.203 187212 DEBUG nova.virt.libvirt.guest [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.206 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.207 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.274 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Successfully created port: e4966b1a-1933-4c85-a2a0-6b5a788efd7a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.278 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.279 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Ensure instance console log exists: /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.279 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.280 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.280 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.282 187212 INFO nova.virt.libvirt.driver [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.284 187212 INFO nova.virt.libvirt.driver [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance shutdown successfully after 13 seconds.
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.290 187212 DEBUG nova.compute.provider_tree [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.310 187212 INFO nova.virt.libvirt.driver [-] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance destroyed successfully.
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.311 187212 DEBUG nova.objects.instance [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'numa_topology' on Instance uuid c2e63727-b45b-4249-a94f-85b0d6314ba0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.313 187212 DEBUG nova.scheduler.client.report [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.324 187212 DEBUG nova.compute.manager [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.344 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.345 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.385 187212 DEBUG nova.privsep.utils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.387 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp14naatji/c88599ae9a0b4fedbdf344a0a7cd60ac.delta /var/lib/nova/instances/snapshots/tmp14naatji/c88599ae9a0b4fedbdf344a0a7cd60ac execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 podman[222978]: 2025-12-05 12:04:31.388899325 +0000 UTC m=+0.423710362 container remove 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 12:04:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:31.394 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9d9d0e40-5ae8-4f48-a554-b6f05fa10f04]: (4, ('Fri Dec  5 12:04:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3)\n720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3\nFri Dec  5 12:04:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3)\n720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:31.396 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ca26b1a7-a090-4bab-8281-78f1ed214483]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:31.398 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:31 compute-0 kernel: tapd7360f84-b0: left promiscuous mode
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.419 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk 1073741824" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.420 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpry6c0j_s/c44c52b464b54aae8193db6d3528683a.delta 1073741824" returned: 0 in 0.248s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.421 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.425 187212 DEBUG oslo_concurrency.lockutils [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.501s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:31.425 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[68f6654f-43e5-481b-a429-74ee38c74d64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.431 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.433 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:31.436 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[98d6b8fe-5282-4bd8-937e-5480da3c6cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:31.438 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35716f40-79f6-4071-b819-2ad105ea2138]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:31.460 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4e257171-f4f4-43bf-b127-b059d3f9addd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363264, 'reachable_time': 40076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223048, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.462 187212 INFO nova.virt.libvirt.driver [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:04:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:31.463 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:04:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:31.463 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[d149a5e8-c4b0-4223-8950-0cb1c25d37ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:31 compute-0 systemd[1]: run-netns-ovnmeta\x2dd7360f84\x2dbcd5\x2d4e64\x2dbf43\x2d1fdbd8215a70.mount: Deactivated successfully.
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.464 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.464 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.488 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.507 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.511 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.512 187212 DEBUG nova.virt.disk.api [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Checking if we can resize image /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.513 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.587 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.588 187212 DEBUG nova.virt.disk.api [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Cannot resize image /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.588 187212 DEBUG nova.objects.instance [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'migration_context' on Instance uuid c6a957dd-2181-4e92-9e06-e1a15fe5c307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.593 187212 DEBUG nova.virt.libvirt.guest [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.597 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.602 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.602 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Creating image(s)
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.603 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "/var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.603 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "/var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.604 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "/var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.623 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.624 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Ensure instance console log exists: /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.626 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.626 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.626 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.628 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.755 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.756 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.757 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.776 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.803 187212 DEBUG nova.policy [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd51d545246e0434591329e386f100a7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.873 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.874 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.975 187212 DEBUG oslo_concurrency.processutils [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp14naatji/c88599ae9a0b4fedbdf344a0a7cd60ac.delta /var/lib/nova/instances/snapshots/tmp14naatji/c88599ae9a0b4fedbdf344a0a7cd60ac" returned: 0 in 0.588s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.976 187212 INFO nova.virt.libvirt.driver [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Snapshot extracted, beginning image upload
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.984 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk 1073741824" returned: 0 in 0.110s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.984 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:31 compute-0 nova_compute[187208]: 2025-12-05 12:04:31.985 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.008 187212 DEBUG nova.compute.manager [req-89336b5f-abc7-4267-9948-fe96bb5b1b94 req-3be8d254-6783-448b-9ebe-3c6b68e4ff46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received event network-vif-plugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.009 187212 DEBUG oslo_concurrency.lockutils [req-89336b5f-abc7-4267-9948-fe96bb5b1b94 req-3be8d254-6783-448b-9ebe-3c6b68e4ff46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.009 187212 DEBUG oslo_concurrency.lockutils [req-89336b5f-abc7-4267-9948-fe96bb5b1b94 req-3be8d254-6783-448b-9ebe-3c6b68e4ff46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.009 187212 DEBUG oslo_concurrency.lockutils [req-89336b5f-abc7-4267-9948-fe96bb5b1b94 req-3be8d254-6783-448b-9ebe-3c6b68e4ff46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.009 187212 DEBUG nova.compute.manager [req-89336b5f-abc7-4267-9948-fe96bb5b1b94 req-3be8d254-6783-448b-9ebe-3c6b68e4ff46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] No waiting events found dispatching network-vif-plugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.009 187212 WARNING nova.compute.manager [req-89336b5f-abc7-4267-9948-fe96bb5b1b94 req-3be8d254-6783-448b-9ebe-3c6b68e4ff46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received unexpected event network-vif-plugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b for instance with vm_state active and task_state None.
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.065 187212 DEBUG nova.compute.manager [req-0bcfec9d-d09d-41bc-896e-11d1ae29e61c req-f9d33f79-f5fd-4c6b-a2d8-e44380c8dbc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received event network-vif-unplugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.065 187212 DEBUG oslo_concurrency.lockutils [req-0bcfec9d-d09d-41bc-896e-11d1ae29e61c req-f9d33f79-f5fd-4c6b-a2d8-e44380c8dbc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.065 187212 DEBUG oslo_concurrency.lockutils [req-0bcfec9d-d09d-41bc-896e-11d1ae29e61c req-f9d33f79-f5fd-4c6b-a2d8-e44380c8dbc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.066 187212 DEBUG oslo_concurrency.lockutils [req-0bcfec9d-d09d-41bc-896e-11d1ae29e61c req-f9d33f79-f5fd-4c6b-a2d8-e44380c8dbc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.066 187212 DEBUG nova.compute.manager [req-0bcfec9d-d09d-41bc-896e-11d1ae29e61c req-f9d33f79-f5fd-4c6b-a2d8-e44380c8dbc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] No waiting events found dispatching network-vif-unplugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.066 187212 WARNING nova.compute.manager [req-0bcfec9d-d09d-41bc-896e-11d1ae29e61c req-f9d33f79-f5fd-4c6b-a2d8-e44380c8dbc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received unexpected event network-vif-unplugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 for instance with vm_state stopped and task_state None.
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.084 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.086 187212 DEBUG nova.virt.disk.api [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Checking if we can resize image /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.086 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.114 187212 DEBUG nova.virt.libvirt.guest [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.118 187212 INFO nova.virt.libvirt.driver [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.167 187212 DEBUG nova.privsep.utils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.170 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpry6c0j_s/c44c52b464b54aae8193db6d3528683a.delta /var/lib/nova/instances/snapshots/tmpry6c0j_s/c44c52b464b54aae8193db6d3528683a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.188 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.189 187212 DEBUG nova.virt.disk.api [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Cannot resize image /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.190 187212 DEBUG nova.objects.instance [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'migration_context' on Instance uuid 97020786-7ba5-4c8b-8a2c-838c0f663bb4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.193 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Successfully created port: b7157ade-85e0-4802-8d6a-0dfb86921b3a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.205 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.205 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Ensure instance console log exists: /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.206 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.206 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.206 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.387 187212 DEBUG oslo_concurrency.processutils [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpry6c0j_s/c44c52b464b54aae8193db6d3528683a.delta /var/lib/nova/instances/snapshots/tmpry6c0j_s/c44c52b464b54aae8193db6d3528683a" returned: 0 in 0.217s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.387 187212 INFO nova.virt.libvirt.driver [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Snapshot extracted, beginning image upload
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:32 compute-0 nova_compute[187208]: 2025-12-05 12:04:32.928 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:33 compute-0 nova_compute[187208]: 2025-12-05 12:04:33.150 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Successfully created port: 83fb1d43-a495-47f4-ad3a-569fd7c02c76 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:33 compute-0 podman[223092]: 2025-12-05 12:04:33.271608689 +0000 UTC m=+0.122374729 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Dec 05 12:04:33 compute-0 nova_compute[187208]: 2025-12-05 12:04:33.771 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Successfully updated port: e4966b1a-1933-4c85-a2a0-6b5a788efd7a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:33 compute-0 nova_compute[187208]: 2025-12-05 12:04:33.791 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "refresh_cache-10048ac5-1fbc-45e6-aa94-01eff87b9ffc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:33 compute-0 nova_compute[187208]: 2025-12-05 12:04:33.791 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquired lock "refresh_cache-10048ac5-1fbc-45e6-aa94-01eff87b9ffc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:33 compute-0 nova_compute[187208]: 2025-12-05 12:04:33.791 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:33 compute-0 nova_compute[187208]: 2025-12-05 12:04:33.888 187212 DEBUG nova.compute.manager [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:33 compute-0 nova_compute[187208]: 2025-12-05 12:04:33.942 187212 INFO nova.compute.manager [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] instance snapshotting
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.004 187212 DEBUG oslo_concurrency.lockutils [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "082d2145-1505-4170-9a11-4e46bf86fed2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.004 187212 DEBUG oslo_concurrency.lockutils [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.005 187212 DEBUG oslo_concurrency.lockutils [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.005 187212 DEBUG oslo_concurrency.lockutils [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.005 187212 DEBUG oslo_concurrency.lockutils [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.006 187212 INFO nova.compute.manager [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Terminating instance
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.007 187212 DEBUG nova.compute.manager [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:04:34 compute-0 kernel: tapd7b765ff-93 (unregistering): left promiscuous mode
Dec 05 12:04:34 compute-0 NetworkManager[55691]: <info>  [1764936274.0378] device (tapd7b765ff-93): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:04:34 compute-0 ovn_controller[95610]: 2025-12-05T12:04:34Z|00358|binding|INFO|Releasing lport d7b765ff-93e1-4594-9e3c-e177dee2e07b from this chassis (sb_readonly=0)
Dec 05 12:04:34 compute-0 ovn_controller[95610]: 2025-12-05T12:04:34Z|00359|binding|INFO|Setting lport d7b765ff-93e1-4594-9e3c-e177dee2e07b down in Southbound
Dec 05 12:04:34 compute-0 ovn_controller[95610]: 2025-12-05T12:04:34Z|00360|binding|INFO|Removing iface tapd7b765ff-93 ovn-installed in OVS
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.053 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.056 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.057 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:a3:7b 10.100.0.12'], port_security=['fa:16:3e:92:a3:7b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '082d2145-1505-4170-9a11-4e46bf86fed2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d7b765ff-93e1-4594-9e3c-e177dee2e07b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.059 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d7b765ff-93e1-4594-9e3c-e177dee2e07b in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.062 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.070 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.080 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4bf9e6-bd51-4946-9bfe-36013b549b24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:34 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Dec 05 12:04:34 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000002a.scope: Consumed 6.167s CPU time.
Dec 05 12:04:34 compute-0 systemd-machined[153543]: Machine qemu-48-instance-0000002a terminated.
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.113 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f689369a-9efb-4a76-92c1-c642f62896c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.117 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[053098b5-7578-4172-a77e-504af68b0346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.146 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[56df084d-4183-40fa-8e2d-46a293f96bdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.164 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[144ab5e4-55cb-4014-8db0-6b6a8151e270]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364450, 'reachable_time': 17301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223137, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.173 187212 INFO nova.virt.libvirt.driver [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Snapshot image upload complete
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.174 187212 INFO nova.compute.manager [None req-f4547e3e-488a-4bac-a4b3-63081ca12369 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Took 4.97 seconds to snapshot the instance on the hypervisor.
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.182 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35226e80-22db-4385-a0a8-0e540454cbb6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364464, 'tstamp': 364464}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223138, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364467, 'tstamp': 364467}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223138, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.184 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.186 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.191 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ea1ed6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.191 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.192 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ea1ed6-90, col_values=(('external_ids', {'iface-id': '6f012c31-72e4-4df5-be68-787aa910fb9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.192 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:34.192 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.273 187212 INFO nova.virt.libvirt.driver [-] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Instance destroyed successfully.
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.275 187212 DEBUG nova.objects.instance [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'resources' on Instance uuid 082d2145-1505-4170-9a11-4e46bf86fed2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.295 187212 DEBUG nova.virt.libvirt.vif [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-330967889',display_name='tempest-MultipleCreateTestJSON-server-330967889-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-330967889-1',id=42,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:04:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-epte65hh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:28Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=082d2145-1505-4170-9a11-4e46bf86fed2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.296 187212 DEBUG nova.network.os_vif_util [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "address": "fa:16:3e:92:a3:7b", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7b765ff-93", "ovs_interfaceid": "d7b765ff-93e1-4594-9e3c-e177dee2e07b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.297 187212 DEBUG nova.network.os_vif_util [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7b765ff-93e1-4594-9e3c-e177dee2e07b,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7b765ff-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.298 187212 DEBUG os_vif [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7b765ff-93e1-4594-9e3c-e177dee2e07b,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7b765ff-93') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.301 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.301 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7b765ff-93, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.306 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.308 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.312 187212 INFO os_vif [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:a3:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7b765ff-93e1-4594-9e3c-e177dee2e07b,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7b765ff-93')
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.313 187212 INFO nova.virt.libvirt.driver [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Deleting instance files /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2_del
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.314 187212 INFO nova.virt.libvirt.driver [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Deletion of /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2_del complete
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.334 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Successfully updated port: b7157ade-85e0-4802-8d6a-0dfb86921b3a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.367 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "refresh_cache-c6a957dd-2181-4e92-9e06-e1a15fe5c307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.368 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquired lock "refresh_cache-c6a957dd-2181-4e92-9e06-e1a15fe5c307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.368 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.391 187212 INFO nova.compute.manager [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.392 187212 DEBUG oslo.service.loopingcall [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.392 187212 DEBUG nova.compute.manager [-] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.392 187212 DEBUG nova.network.neutron [-] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.481 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Successfully updated port: 83fb1d43-a495-47f4-ad3a-569fd7c02c76 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.496 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "refresh_cache-97020786-7ba5-4c8b-8a2c-838c0f663bb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.496 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquired lock "refresh_cache-97020786-7ba5-4c8b-8a2c-838c0f663bb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.496 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.532 187212 INFO nova.virt.libvirt.driver [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Beginning live snapshot process
Dec 05 12:04:34 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.682 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.762 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk --force-share --output=json -f qcow2" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.763 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.843 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk --force-share --output=json -f qcow2" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.855 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.877 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.941 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.941 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp61mdfko2/d94634f962a54597833cd16854168d8c.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.981 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp61mdfko2/d94634f962a54597833cd16854168d8c.delta 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.982 187212 INFO nova.virt.libvirt.driver [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:04:34 compute-0 nova_compute[187208]: 2025-12-05 12:04:34.999 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.036 187212 DEBUG nova.virt.libvirt.guest [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.039 187212 INFO nova.virt.libvirt.driver [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.078 187212 DEBUG nova.privsep.utils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.079 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp61mdfko2/d94634f962a54597833cd16854168d8c.delta /var/lib/nova/instances/snapshots/tmp61mdfko2/d94634f962a54597833cd16854168d8c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.159 187212 DEBUG oslo_concurrency.lockutils [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "58c3288f-57bf-4c62-8d69-9842a22e43d6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.160 187212 DEBUG oslo_concurrency.lockutils [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.160 187212 DEBUG oslo_concurrency.lockutils [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.160 187212 DEBUG oslo_concurrency.lockutils [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.160 187212 DEBUG oslo_concurrency.lockutils [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.162 187212 INFO nova.compute.manager [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Terminating instance
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.163 187212 DEBUG nova.compute.manager [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:04:35 compute-0 kernel: tapeabadaa6-16 (unregistering): left promiscuous mode
Dec 05 12:04:35 compute-0 NetworkManager[55691]: <info>  [1764936275.1922] device (tapeabadaa6-16): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:04:35 compute-0 ovn_controller[95610]: 2025-12-05T12:04:35Z|00361|binding|INFO|Releasing lport eabadaa6-16c4-434c-83ea-96dfa62d7f79 from this chassis (sb_readonly=0)
Dec 05 12:04:35 compute-0 ovn_controller[95610]: 2025-12-05T12:04:35Z|00362|binding|INFO|Setting lport eabadaa6-16c4-434c-83ea-96dfa62d7f79 down in Southbound
Dec 05 12:04:35 compute-0 ovn_controller[95610]: 2025-12-05T12:04:35Z|00363|binding|INFO|Removing iface tapeabadaa6-16 ovn-installed in OVS
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.205 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.214 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c8:f9 10.100.0.13'], port_security=['fa:16:3e:1b:c8:f9 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '58c3288f-57bf-4c62-8d69-9842a22e43d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=eabadaa6-16c4-434c-83ea-96dfa62d7f79) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.218 104471 INFO neutron.agent.ovn.metadata.agent [-] Port eabadaa6-16c4-434c-83ea-96dfa62d7f79 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.220 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.227 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.227 187212 DEBUG nova.compute.manager [req-d18f5097-6af4-4f7a-a17c-57c7760d85b6 req-6529407e-88c6-46b0-9713-66e2cc00f387 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Received event network-changed-e4966b1a-1933-4c85-a2a0-6b5a788efd7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.228 187212 DEBUG nova.compute.manager [req-d18f5097-6af4-4f7a-a17c-57c7760d85b6 req-6529407e-88c6-46b0-9713-66e2cc00f387 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Refreshing instance network info cache due to event network-changed-e4966b1a-1933-4c85-a2a0-6b5a788efd7a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.228 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee1e307-b5c8-41af-90f9-f2c7192e19d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.228 187212 DEBUG oslo_concurrency.lockutils [req-d18f5097-6af4-4f7a-a17c-57c7760d85b6 req-6529407e-88c6-46b0-9713-66e2cc00f387 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-10048ac5-1fbc-45e6-aa94-01eff87b9ffc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.228 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 namespace which is not needed anymore
Dec 05 12:04:35 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Dec 05 12:04:35 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000002b.scope: Consumed 6.971s CPU time.
Dec 05 12:04:35 compute-0 systemd-machined[153543]: Machine qemu-49-instance-0000002b terminated.
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.262 187212 DEBUG oslo_concurrency.processutils [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp61mdfko2/d94634f962a54597833cd16854168d8c.delta /var/lib/nova/instances/snapshots/tmp61mdfko2/d94634f962a54597833cd16854168d8c" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.263 187212 INFO nova.virt.libvirt.driver [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Snapshot extracted, beginning image upload
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.269 187212 DEBUG nova.compute.manager [req-6172650f-3d39-4412-be9c-24a81b65e609 req-12fa7a75-ccfc-4df3-9fac-45da4472ef7f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.269 187212 DEBUG oslo_concurrency.lockutils [req-6172650f-3d39-4412-be9c-24a81b65e609 req-12fa7a75-ccfc-4df3-9fac-45da4472ef7f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.270 187212 DEBUG oslo_concurrency.lockutils [req-6172650f-3d39-4412-be9c-24a81b65e609 req-12fa7a75-ccfc-4df3-9fac-45da4472ef7f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.270 187212 DEBUG oslo_concurrency.lockutils [req-6172650f-3d39-4412-be9c-24a81b65e609 req-12fa7a75-ccfc-4df3-9fac-45da4472ef7f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.270 187212 DEBUG nova.compute.manager [req-6172650f-3d39-4412-be9c-24a81b65e609 req-12fa7a75-ccfc-4df3-9fac-45da4472ef7f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] No waiting events found dispatching network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.270 187212 WARNING nova.compute.manager [req-6172650f-3d39-4412-be9c-24a81b65e609 req-12fa7a75-ccfc-4df3-9fac-45da4472ef7f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received unexpected event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 for instance with vm_state stopped and task_state None.
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.348 187212 INFO nova.virt.libvirt.driver [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Snapshot image upload complete
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.348 187212 INFO nova.compute.manager [None req-3634cd67-1f32-4d55-b822-db672309366f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Took 5.64 seconds to snapshot the instance on the hypervisor.
Dec 05 12:04:35 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[222784]: [NOTICE]   (222788) : haproxy version is 2.8.14-c23fe91
Dec 05 12:04:35 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[222784]: [NOTICE]   (222788) : path to executable is /usr/sbin/haproxy
Dec 05 12:04:35 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[222784]: [WARNING]  (222788) : Exiting Master process...
Dec 05 12:04:35 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[222784]: [ALERT]    (222788) : Current worker (222790) exited with code 143 (Terminated)
Dec 05 12:04:35 compute-0 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[222784]: [WARNING]  (222788) : All workers exited. Exiting... (0)
Dec 05 12:04:35 compute-0 systemd[1]: libpod-c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec.scope: Deactivated successfully.
Dec 05 12:04:35 compute-0 podman[223203]: 2025-12-05 12:04:35.412099104 +0000 UTC m=+0.065018570 container died c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.473 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '58c3288f-57bf-4c62-8d69-9842a22e43d6', 'name': 'tempest-MultipleCreateTestJSON-server-330967889-2', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'bebbbd9623064681bb9350747fba600e', 'user_id': '40620135b1ff4f8d9d80eb79f51fd593', 'hostId': 'dd2337b6188e5611a0eaeb949b28a5202b0ed0aef15c73b6e7e00895', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.475 187212 INFO nova.virt.libvirt.driver [-] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Instance destroyed successfully.
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.476 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '42d9566206cb469ebd803d0600019533', 'user_id': '8456efa356654e5c990efa4aef688e8a', 'hostId': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.476 187212 DEBUG nova.objects.instance [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'resources' on Instance uuid 58c3288f-57bf-4c62-8d69-9842a22e43d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.477 187212 DEBUG oslo_concurrency.lockutils [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.477 187212 DEBUG oslo_concurrency.lockutils [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.478 187212 DEBUG oslo_concurrency.lockutils [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.478 187212 DEBUG oslo_concurrency.lockutils [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.478 187212 DEBUG oslo_concurrency.lockutils [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.478 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2e63727-b45b-4249-a94f-85b0d6314ba0', 'name': 'tempest-DeleteServersTestJSON-server-64428055', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000029', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '4671f6c82ea049fab3a314ecf45b7656', 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'hostId': '2470f2977efa1109438e1cca9d5a938b61006ba2964f8bccc54d946c', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.479 187212 INFO nova.compute.manager [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Terminating instance
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.480 187212 DEBUG nova.compute.manager [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.480 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'name': 'tempest-ImagesTestJSON-server-404632133', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000028', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'hostId': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.482 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a7616662-639b-4642-b507-614773f4748f', 'name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79895287bd1d488c842f6013729a1f81', 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'hostId': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.484 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '42d9566206cb469ebd803d0600019533', 'user_id': '8456efa356654e5c990efa4aef688e8a', 'hostId': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.486 12 DEBUG ceilometer.compute.pollsters [-] Instance 58c3288f-57bf-4c62-8d69-9842a22e43d6 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:04:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-64f5a032348eaf428a0c9b64652c19c912ddd20a0b5d38a19d66425a40143449-merged.mount: Deactivated successfully.
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.490 12 DEBUG ceilometer.compute.pollsters [-] Instance c2e63727-b45b-4249-a94f-85b0d6314ba0 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:04:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec-userdata-shm.mount: Deactivated successfully.
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.494 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 / tapea8794b1-8d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.495 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.495 187212 INFO nova.virt.libvirt.driver [-] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance destroyed successfully.
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.495 187212 DEBUG nova.objects.instance [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'resources' on Instance uuid c2e63727-b45b-4249-a94f-85b0d6314ba0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.498 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a7616662-639b-4642-b507-614773f4748f / tap539a9707-ef inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.498 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c16b3bf6-8bf6-4e7b-a655-984fb2434e12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.485160', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '9106b792-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': 'e02b6758f8e20d5d1a28a9ceda9442bcd8f4bbf226a73657d9ab9aedde9d12e9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.485160', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '9107351e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': 'd9807cbc1a7b7953e622b3b9e5904db5167680d24bef7166b3c671ab67a02b89'}]}, 'timestamp': '2025-12-05 12:04:35.500778', '_unique_id': '0824234cd30e454097a0ce00557e6626'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.502 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.504 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.504 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MultipleCreateTestJSON-server-330967889-2>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1806870616>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-64428055>, <NovaLikeServer: tempest-ImagesTestJSON-server-404632133>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1115906111>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-469388429>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MultipleCreateTestJSON-server-330967889-2>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1806870616>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-64428055>, <NovaLikeServer: tempest-ImagesTestJSON-server-404632133>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1115906111>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-469388429>]
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.504 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.505 12 DEBUG ceilometer.compute.pollsters [-] Instance 58c3288f-57bf-4c62-8d69-9842a22e43d6 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.512 187212 DEBUG nova.virt.libvirt.vif [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-330967889',display_name='tempest-MultipleCreateTestJSON-server-330967889-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-330967889-2',id=43,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-12-05T12:04:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-epte65hh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:28Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=58c3288f-57bf-4c62-8d69-9842a22e43d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.512 187212 DEBUG nova.network.os_vif_util [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "address": "fa:16:3e:1b:c8:f9", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeabadaa6-16", "ovs_interfaceid": "eabadaa6-16c4-434c-83ea-96dfa62d7f79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.513 187212 DEBUG nova.network.os_vif_util [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1b:c8:f9,bridge_name='br-int',has_traffic_filtering=True,id=eabadaa6-16c4-434c-83ea-96dfa62d7f79,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeabadaa6-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.513 187212 DEBUG os_vif [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:c8:f9,bridge_name='br-int',has_traffic_filtering=True,id=eabadaa6-16c4-434c-83ea-96dfa62d7f79,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeabadaa6-16') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.515 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.516 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeabadaa6-16, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.517 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.519 187212 DEBUG nova.virt.libvirt.vif [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-64428055',display_name='tempest-DeleteServersTestJSON-server-64428055',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-64428055',id=41,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:04:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-rpqd5t4m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:31Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=c2e63727-b45b-4249-a94f-85b0d6314ba0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.519 187212 DEBUG nova.network.os_vif_util [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.519 187212 DEBUG nova.network.os_vif_util [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.520 187212 DEBUG os_vif [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.521 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.522 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.523 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.523 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap656f63d2-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.524 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.526 187212 INFO os_vif [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:c8:f9,bridge_name='br-int',has_traffic_filtering=True,id=eabadaa6-16c4-434c-83ea-96dfa62d7f79,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeabadaa6-16')
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.526 187212 INFO nova.virt.libvirt.driver [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Deleting instance files /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6_del
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.526 187212 INFO nova.virt.libvirt.driver [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Deletion of /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6_del complete
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.529 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.531 187212 INFO os_vif [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77')
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.531 187212 INFO nova.virt.libvirt.driver [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Deleting instance files /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0_del
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.532 187212 INFO nova.virt.libvirt.driver [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Deletion of /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0_del complete
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.543 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.read.latency volume: 300902882 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.544 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.read.latency volume: 11444919 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.545 12 DEBUG ceilometer.compute.pollsters [-] Instance c2e63727-b45b-4249-a94f-85b0d6314ba0 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.564 187212 WARNING nova.compute.manager [None req-e8eea6ef-4259-4f7b-9a84-923433bb0b78 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Image not found during snapshot: nova.exception.ImageNotFound: Image f2e4eb7a-1519-4e0e-b8c1-7e4ecba062ab could not be found.
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.567 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.read.latency volume: 268256280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.568 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.read.latency volume: 627308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.593 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.read.latency volume: 279729175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.594 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.read.latency volume: 3189302 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.613 187212 INFO nova.compute.manager [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Took 0.45 seconds to destroy the instance on the hypervisor.
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.614 187212 DEBUG oslo.service.loopingcall [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.614 187212 DEBUG nova.compute.manager [-] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.615 187212 DEBUG nova.network.neutron [-] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.623 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.read.latency volume: 331527916 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.624 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.read.latency volume: 46091995 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.624 187212 INFO nova.compute.manager [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Took 0.14 seconds to destroy the instance on the hypervisor.
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.625 187212 DEBUG oslo.service.loopingcall [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.625 187212 DEBUG nova.compute.manager [-] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.626 187212 DEBUG nova.network.neutron [-] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0d7df92-3fff-4813-a425-98949035910e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 300902882, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-vda', 'timestamp': '2025-12-05T12:04:35.504817', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '910e141a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': 'bc36fc87847f3bca70dfa8635006f7a3416d0a3f24d37fe9ae467fa22b892156'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11444919, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-sda', 'timestamp': '2025-12-05T12:04:35.504817', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '910e21e4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': 'ab412414c473286222d0fa7185081daafeafebea04104883dfad432b07e26046'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 268256280, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-vda', 'timestamp': '2025-12-05T12:04:35.504817', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9111bd0e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': '46dc6a906329d02f41b0c9d814768e12cd6667bf9519563fa82c4cab28e1f5a0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 627308, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-sda', 'timestamp': '2025-12-05T12:04:35.504817', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9111ccfe-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': '56c44334aea6d9872df473e12b6b3f46d1db3dc9128cc26fc56ba56898c87f5b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 279729175, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 
'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-vda', 'timestamp': '2025-12-05T12:04:35.504817', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9115b72e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': '332745aabc85b8cafbb077fad09514903e40cdd055febb95a219db14ed24cb2f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3189302, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-sda', 'timestamp': '2025-12-05T12:04:35.504817', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, '
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9115c4c6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': '826588b2df48109fd2b0f231d10fb5c9d2aeba7588819a8b2e1cbefd18f83d03'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 331527916, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-vda', 'timestamp': '2025-12-05T12:04:35.504817', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '911a3fe2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': 'df238ccdb98256e4b0fa641d25a6e940ed074766c6ba946cea99aeac3cff7901'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46091995, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 
'004672c5-70e2-4940-bc9c-8971d94cc037-sda', 'timestamp': '2025-12-05T12:04:35.504817', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '911a4dfc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': '9d3254895bc148b1d2468e3e56a08f226c5e2a2808f7bc5429c5179e8932a1fa'}]}, 'timestamp': '2025-12-05 12:04:35.624496', '_unique_id': 'b40ff2a6e87f4b0c9c00225584dee9b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.625 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.626 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.627 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.628 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.628 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.628 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34b250ab-5501-450d-85ef-cc1fb2aa4de8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.626827', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '911afd74-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': 'd68ca6490a9d5a3420d6d44734923b4a819694f6892de1274a3284c52af985a3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.626827', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '911b0ce2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': '708335e28d2645d72b169d81ca7767e30ede7557ad56122415bec0d6a5289c91'}]}, 'timestamp': '2025-12-05 12:04:35.629365', '_unique_id': '6a6a0ef3e0d244af8b457398418ac299'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.630 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.631 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.631 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.643 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.644 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.645 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.654 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.654 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.665 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.666 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.668 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Updating instance_info_cache with network_info: [{"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.676 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.676 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb7896c7-c022-4fa1-8f14-090c79fc609b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-vda', 'timestamp': '2025-12-05T12:04:35.631272', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '911d4886-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.252885942, 'message_signature': '3c7768a792ee909495f39f873ea1ec9e30962832e0c4b091c671511541ea25fc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 
'f50947f2-f8d0-4d6b-bca4-b5412a206503-sda', 'timestamp': '2025-12-05T12:04:35.631272', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '911d58d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.252885942, 'message_signature': 'ce6429ee7dbc0783c6d3137c61393a2a95f181260df39976c2a687ce2d63bf70'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-vda', 'timestamp': '2025-12-05T12:04:35.631272', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '911ef442-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.266253846, 'message_signature': '66875da938d8004ccc843e383a3af4684f24c78b1db6638245462b779881eec2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-sda', 'timestamp': '2025-12-05T12:04:35.631272', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '911f048c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.266253846, 'message_signature': 'e096c5f10c88e9fa072c692c3b390052bc943a337faf163dd0e70b80ad7e06a3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': 
'79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-vda', 'timestamp': '2025-12-05T12:04:35.631272', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9120b926-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.276357566, 'message_signature': '9e1caf6afc6915afaf250c7acf28a8f9707ca6c4130e26df44d61a840ffce588'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-sda', 'timestamp': '2025-12-05T12:04:35.631272', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb':
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]:  1, 'disk_name': 'sda'}, 'message_id': '9120c6e6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.276357566, 'message_signature': '6631e9a24e2d05eb31faedf577d766c762e780cc6b8805c856e860b2aecd46f4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-vda', 'timestamp': '2025-12-05T12:04:35.631272', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9122484a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.287891208, 'message_signature': 'a60e4be50a9fc12995512b0583af0ea02faff29e7c87a6d4919b235d2d0bc5b2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-sda', 'timestamp': '2025-12-05T12:04:35.631272', 
'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9122531c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.287891208, 'message_signature': 'd3ba4c3bca95eeeda6e9242b5c41a8303ba795ecbc051239a9d917095137cd11'}]}, 'timestamp': '2025-12-05 12:04:35.676992', '_unique_id': 'b27509465ea64299b47f1e229791ba79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.678 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.679 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.680 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.680 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.680 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.681 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.681 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.681 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.681 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.682 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.682 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.write.requests volume: 290 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.682 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb84c493-76dd-4417-8f53-b9443446eda6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-vda', 'timestamp': '2025-12-05T12:04:35.679376', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9122d986-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': 'ffc8f53accbee3d16a6b2199246e8379c668f0468819c79b956a6338e1b8b706'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': 
None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-sda', 'timestamp': '2025-12-05T12:04:35.679376', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9122e656-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': 'b6e18ba5728749ad4f588358f43d7750d2f0a511f674b453934942c6de671ea9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-vda', 'timestamp': '2025-12-05T12:04:35.679376', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '91230370-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': 'fc1a9af380055a0448f1e4d99188c09688e3e7245e3c80ef333d4bd183a7a1b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-sda', 'timestamp': '2025-12-05T12:04:35.679376', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9123131a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': 'e09946003647dbeb879eef929a94bed2242e453456b34b645d9208260a089120'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': 
None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-vda', 'timestamp': '2025-12-05T12:04:35.679376', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '91231e1e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': '9c30181337edc8ac345e603d37bf3997ecd885bd197daec5406ec59c2deefab6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-sda', 'timestamp': '2025-12-05T12:04:35.679376', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, '
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9123280a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': 'e2191d4f5882d309fab5ca78fa882747481c8f21548a16a19b6372655d09f533'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 290, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-vda', 'timestamp': '2025-12-05T12:04:35.679376', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '912338d6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': '37b583bf8d58163706b43a64908dd6d549a255a25d9e7ec1f812779c12397b11'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 
'004672c5-70e2-4940-bc9c-8971d94cc037-sda', 'timestamp': '2025-12-05T12:04:35.679376', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9123409c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': 'c1eac852bc4569ed89e3dff397d448ead61e59d1b533946412336477f96ca695'}]}, 'timestamp': '2025-12-05 12:04:35.683040', '_unique_id': 'abfebdcff2f040ccaa01288812f0996b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.683 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.684 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.684 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MultipleCreateTestJSON-server-330967889-2>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1806870616>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-64428055>, <NovaLikeServer: tempest-ImagesTestJSON-server-404632133>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1115906111>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-469388429>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MultipleCreateTestJSON-server-330967889-2>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1806870616>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-64428055>, <NovaLikeServer: tempest-ImagesTestJSON-server-404632133>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1115906111>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-469388429>]
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.685 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.685 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.685 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.686 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.686 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.686 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.686 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.686 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.687 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.687 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.687 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Releasing lock "refresh_cache-10048ac5-1fbc-45e6-aa94-01eff87b9ffc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.687 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Instance network_info: |[{"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.688 187212 DEBUG oslo_concurrency.lockutils [req-d18f5097-6af4-4f7a-a17c-57c7760d85b6 req-6529407e-88c6-46b0-9713-66e2cc00f387 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-10048ac5-1fbc-45e6-aa94-01eff87b9ffc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.688 187212 DEBUG nova.network.neutron [req-d18f5097-6af4-4f7a-a17c-57c7760d85b6 req-6529407e-88c6-46b0-9713-66e2cc00f387 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Refreshing network info cache for port e4966b1a-1933-4c85-a2a0-6b5a788efd7a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3654eb60-8070-4c67-9edf-4f4471f368fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-vda', 'timestamp': '2025-12-05T12:04:35.684778', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9123a06e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.252885942, 'message_signature': '54bccc4a711fdd701d6f1258be3527360e52589580cf88273b401d53d04c1ba1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 
'f50947f2-f8d0-4d6b-bca4-b5412a206503-sda', 'timestamp': '2025-12-05T12:04:35.684778', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9123a820-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.252885942, 'message_signature': '52850e8a78394b57240e037345e700a61ae69ee9795e8b02a7a6bbd62c26e72c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-vda', 'timestamp': '2025-12-05T12:04:35.684778', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9123c3f0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.266253846, 'message_signature': 'b71c2e9bd2241eb4d2bc307f4f93bda564411269ead2175b2dac0d070fa04c07'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-sda', 'timestamp': '2025-12-05T12:04:35.684778', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9123cba2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.266253846, 'message_signature': 'ec364b5c7e1247b9e72960f7187553a83e7c8bc4076d7e86d0d1bbce245b57db'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': 
None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-vda', 'timestamp': '2025-12-05T12:04:35.684778', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9123d46c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.276357566, 'message_signature': 'cb6c4893c8e79f42f8257b79e18104f4182062f4e73c7bc86c9bfee17cd3cc15'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-sda', 'timestamp': '2025-12-05T12:04:35.684778', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'mess
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: age_id': '9123dc5a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.276357566, 'message_signature': '3ab75b2bed1fc17044840113a1d8dc6606484552b41995d134371dc3133a076c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-vda', 'timestamp': '2025-12-05T12:04:35.684778', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9123e588-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.287891208, 'message_signature': 'd94bd1e661212876b579dd38a90b423b639fc9eda2e4001079c7824ce5cdd1aa'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-sda', 'timestamp': '2025-12-05T12:04:35.684778', 'resource_metadata': {'display_name': 
'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9123edee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.287891208, 'message_signature': '7f99dac5132e7a3d7f61eb6e4a0df0e4a27598079854ce2317f9b3ed7cc7ad4a'}]}, 'timestamp': '2025-12-05 12:04:35.687461', '_unique_id': '1496056dc50f438d91a8093fb23a12f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.688 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.689 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MultipleCreateTestJSON-server-330967889-2>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1806870616>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-64428055>, <NovaLikeServer: tempest-ImagesTestJSON-server-404632133>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1115906111>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-469388429>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MultipleCreateTestJSON-server-330967889-2>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1806870616>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-64428055>, <NovaLikeServer: tempest-ImagesTestJSON-server-404632133>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1115906111>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-469388429>]
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.689 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.690 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.690 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.690 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12308ab5-d12e-482e-851c-b0d230dd7c06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.689266', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '91245db0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': '54288d5f40a9d9227a3b5144b1d17728681af3f6bcdc39df614ecba5977acd87'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.689266', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '912465d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': '740288d133291c3d6c3c3c9a4f6ae3a5d5c4a97f803bb0e44dc8398a0a3a95d1'}]}, 'timestamp': '2025-12-05 12:04:35.690564', '_unique_id': '646e284916fd4d6ba55014893532b291'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.691 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.692 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.693 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.693 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.693 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.693 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3419cb1c-41fd-47a9-a6c9-eea3e308cc2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.692948', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '9124eb18-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': '907321c9d112b8c5fd6a9a7ba187d48a46ae2dc06c4e142d4618c3208f96b45e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.692948', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '9124f61c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': '31c1457144c9d8bed44ecce1f4314d4a1c9f7bb018a36444bad23068bf3a2553'}]}, 'timestamp': '2025-12-05 12:04:35.694236', '_unique_id': '6132b7498b1343738572fe7c585137b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.694 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.695 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.695 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.696 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.696 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.695 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Start _get_guest_xml network_info=[{"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.696 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6af95ab-6bf0-44d8-a033-ec22b07da7d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.695449', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '91254d10-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': '808d7355f600d949af95c5f3f4fbcabe9d8ef6b1e3460385e786dc5c846d7f54'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.695449', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '9125585a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': '4b037756d317a15d8b2020d83e13c3526b13c4c1e6921a230a1df328d96b811c'}]}, 'timestamp': '2025-12-05 12:04:35.696750', '_unique_id': '65d2698ee47d4859b1af517e889eaae4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.697 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.698 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.700 187212 WARNING nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.704 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.705 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.707 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.708 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.708 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.708 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.708 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.709 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.709 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.709 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.709 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.710 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.710 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.710 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.710 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.710 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.714 187212 DEBUG nova.virt.libvirt.vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-1',id=47,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:29Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=10048ac5-1fbc-45e6-aa94-01eff87b9ffc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.714 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/cpu volume: 10140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.715 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.715 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.715 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:db:3c,bridge_name='br-int',has_traffic_filtering=True,id=e4966b1a-1933-4c85-a2a0-6b5a788efd7a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4966b1a-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.717 187212 DEBUG nova.objects.instance [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 10048ac5-1fbc-45e6-aa94-01eff87b9ffc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.729 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/cpu volume: 10580000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.732 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <uuid>10048ac5-1fbc-45e6-aa94-01eff87b9ffc</uuid>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <name>instance-0000002f</name>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <nova:name>tempest-ListServersNegativeTestJSON-server-951078504-1</nova:name>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:35</nova:creationTime>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:35 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:35 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:35 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:35 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:35 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:35 compute-0 nova_compute[187208]:         <nova:user uuid="d51d545246e0434591329e386f100a7d">tempest-ListServersNegativeTestJSON-1128597959-project-member</nova:user>
Dec 05 12:04:35 compute-0 nova_compute[187208]:         <nova:project uuid="31d3d0a57b064ff6abd01727d4443c0b">tempest-ListServersNegativeTestJSON-1128597959</nova:project>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:35 compute-0 nova_compute[187208]:         <nova:port uuid="e4966b1a-1933-4c85-a2a0-6b5a788efd7a">
Dec 05 12:04:35 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <entry name="serial">10048ac5-1fbc-45e6-aa94-01eff87b9ffc</entry>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <entry name="uuid">10048ac5-1fbc-45e6-aa94-01eff87b9ffc</entry>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk.config"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:2b:db:3c"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <target dev="tape4966b1a-19"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/console.log" append="off"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:35 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:35 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:35 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:35 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:35 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.732 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Preparing to wait for external event network-vif-plugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.732 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.733 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.733 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.734 187212 DEBUG nova.virt.libvirt.vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-1',id=47,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:29Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=10048ac5-1fbc-45e6-aa94-01eff87b9ffc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.734 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.735 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:db:3c,bridge_name='br-int',has_traffic_filtering=True,id=e4966b1a-1933-4c85-a2a0-6b5a788efd7a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4966b1a-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.735 187212 DEBUG os_vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:db:3c,bridge_name='br-int',has_traffic_filtering=True,id=e4966b1a-1933-4c85-a2a0-6b5a788efd7a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4966b1a-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.735 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.736 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.736 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.738 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.738 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape4966b1a-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.738 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape4966b1a-19, col_values=(('external_ids', {'iface-id': 'e4966b1a-1933-4c85-a2a0-6b5a788efd7a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:db:3c', 'vm-uuid': '10048ac5-1fbc-45e6-aa94-01eff87b9ffc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.739 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 NetworkManager[55691]: <info>  [1764936275.7405] manager: (tape4966b1a-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.745 187212 INFO os_vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:db:3c,bridge_name='br-int',has_traffic_filtering=True,id=e4966b1a-1933-4c85-a2a0-6b5a788efd7a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4966b1a-19')
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.758 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/cpu volume: 6740000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.775 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/cpu volume: 11180000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd6600d4-139b-4ef2-9846-df813e6f15a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10140000000, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'timestamp': '2025-12-05T12:04:35.697883', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '91282e18-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.335906118, 'message_signature': '5989537ce626050096a7d8c71b4f8f91c0bb6483b1ead0989e145d14385674ef'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10580000000, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 
'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'timestamp': '2025-12-05T12:04:35.697883', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '912a56a2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.350184289, 'message_signature': '400171391805e1628d67ef1c13302ac509f252b50343cf2263d2efc448797b91'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6740000000, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f', 'timestamp': '2025-12-05T12:04:35.697883', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '912ec782-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.379060129, 'message_signature': 'bc6cc1efea2ddfcc88bb34c06ffc501273daab64a9e1b22abb630763fa7a24b9'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11180000000, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'timestamp': '2025-12-05T12:04:35.697883', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9131764e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.396649655, 'message_signature': '09706b5a797b95d60839b0253ef5123ba8cb9b939c7348dc76d1121a072c2f42'}]}, 'timestamp': '2025-12-05 12:04:35.776269', '_unique_id': 'e730cd8ecdf548d182a9693bf9474e47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.778 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.783 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.783 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.784 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.785 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.785 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.785 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.786 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.786 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.787 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.read.bytes volume: 30972416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.787 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bef31ac4-824a-4828-8cd3-b970b4b18bec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-vda', 'timestamp': '2025-12-05T12:04:35.782922', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9132aa5a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': 'ab85bbfc85b43a55481d64b19de6dd2b21a1766196a37897e88e0a3a93b4f702'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 
'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-sda', 'timestamp': '2025-12-05T12:04:35.782922', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9132b9b4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': 'f2799af9fa0c798f35da3e1032cb7dfbfcb6dfa5f1b6369a1b75173430d45a30'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-vda', 'timestamp': '2025-12-05T12:04:35.782922', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9132f0aa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': '2b993c5608410a6ddcb82fce939436cfc53d20598e4adbe7b0d0691952e1c405'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-sda', 'timestamp': '2025-12-05T12:04:35.782922', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '91330162-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': 'c42db0a6ddc8af9d01631f6c1c392640580d5ef2735ac20f8e3a2b860cae9b4c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 
'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-vda', 'timestamp': '2025-12-05T12:04:35.782922', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '91331120-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': '9b03930f1da6baa858efe1367576b18352fb2560abf98d83e4257a1d34435dcb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-sda', 'timestamp': '2025-12-05T12:04:35.782922', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]:  'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9133207a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': 'c11e8d17db589748424c07b6aa19f532d2d5c2588ef39cb12bdc8c57be644d6e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30972416, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-vda', 'timestamp': '2025-12-05T12:04:35.782922', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '91333164-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': '0b7602f90844f15821121615dae8bb9c9ab6cb87db95cc10a4c6479ae8cfd0d7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-sda', 
'timestamp': '2025-12-05T12:04:35.782922', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '913340be-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': '4e4a84d20a78ba20c57f782bdeb6f174736164c4aaf7066e7ec3b3393211b094'}]}, 'timestamp': '2025-12-05 12:04:35.787955', '_unique_id': '686544e2de02460eba5afd6d74ba4126'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.788 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.792 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.793 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.793 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.793 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.794 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.795 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.798 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.798 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.799 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.799 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.799 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76d3004a-e749-47aa-a931-c7ce7f387694', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-vda', 'timestamp': '2025-12-05T12:04:35.792991', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '91342c68-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.252885942, 'message_signature': '073a276833098b2c5756cd8b882a1bfedc415b32ee82fbf81312ec6e0777ebf7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 
'f50947f2-f8d0-4d6b-bca4-b5412a206503-sda', 'timestamp': '2025-12-05T12:04:35.792991', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '91343816-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.252885942, 'message_signature': '33fc5e251d7d4ec37ef650d4df0eafb3bff36eb9bf99c5fa89b46bfb7af282dd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-vda', 'timestamp': '2025-12-05T12:04:35.792991', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9134db9a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.266253846, 'message_signature': '9080fada678e7e9b2be01325eb3e58d6af49f23e46fababfd1b1eb68bb73643c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-sda', 'timestamp': '2025-12-05T12:04:35.792991', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9134ee3c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.266253846, 'message_signature': '67df5b10ce372204070bc65d2760fbde2365b8db620bbf4230d6730ca883e617'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': 
'79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-vda', 'timestamp': '2025-12-05T12:04:35.792991', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9134f97c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.276357566, 'message_signature': 'defcddde5f5bb2ef9a5d5e7ae58ba9192a967f26a12bacc6db2899d67eba63e2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-sda', 'timestamp': '2025-12-05T12:04:35.792991', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb':
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]:  1, 'disk_name': 'sda'}, 'message_id': '91350408-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.276357566, 'message_signature': '9b1b825cc10d045e5c9af8820eb463a265fcdd221732e7f76c8de55da533f010'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-vda', 'timestamp': '2025-12-05T12:04:35.792991', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '91350f5c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.287891208, 'message_signature': '24d9c79f9c57226c1ccf7bf355339a9bbb27a76d19a88c9eba8e0ea3d4130313'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-sda', 'timestamp': '2025-12-05T12:04:35.792991', 
'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '91351a88-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.287891208, 'message_signature': 'b5b01560b8704d9476fb8394809c6b3afb7c3576cb79e932257e705078970bc3'}]}, 'timestamp': '2025-12-05 12:04:35.800106', '_unique_id': 'b2fb026aa53a47229203e6336c90c325'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.801 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.805 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.805 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 podman[223203]: 2025-12-05 12:04:35.805755091 +0000 UTC m=+0.458674537 container cleanup c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.806 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.807 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.807 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.807 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.807 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.808 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.808 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.write.bytes volume: 72679424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.808 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f5f0179-c98f-4193-b332-c17828474415', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-vda', 'timestamp': '2025-12-05T12:04:35.802557', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9135fd5e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': '7730f03341c8e7b0de2ed8d2d8932fa70d355761bea7182d2d6f73c1364392cf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 
'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-sda', 'timestamp': '2025-12-05T12:04:35.802557', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9136257c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': '4bab201ab6151556211a8902a5cf8286bfc80fddd6cd8dbee3e0ab0d060d6c18'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-vda', 'timestamp': '2025-12-05T12:04:35.802557', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '913644a8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': 'a47c39678f86ad4bdafd528069a33481ff9637343c8c3c3b4fe6cdbed31a225d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-sda', 'timestamp': '2025-12-05T12:04:35.802557', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '91364d90-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': 'bb3744332646bf64d3c92202d777e57a7d6c0190d271319c57352a5d3c0f6098'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': 
'79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-vda', 'timestamp': '2025-12-05T12:04:35.802557', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '91365830-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': 'afd3bc6393019f8b6f5bee7e95bc43f87ac1e0fa6a90ceb5e8eb54d42bee97f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-sda', 'timestamp': '2025-12-05T12:04:35.802557', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'roo
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: t_gb': 1, 'disk_name': 'sda'}, 'message_id': '9136617c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': '68ad878894023cce149b66f85f1039d7856ebd9352a246fdfdf48ba80f5ddc3e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72679424, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-vda', 'timestamp': '2025-12-05T12:04:35.802557', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '91366be0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': 'd689132dc49ad07d33b748a3040af7e7d9a04bf181a07d94d2d6331a9dcb0761'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-sda', 'timestamp': 
'2025-12-05T12:04:35.802557', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '913675c2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': '6cb03773a35e19d7cb90a19194ed25dc6d6cd1ec8a60f0eab7134c62d1508f29'}]}, 'timestamp': '2025-12-05 12:04:35.808956', '_unique_id': 'f5c8ecd7300f45cbb80d8011e4253c43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.810 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.811 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.814 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.815 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.815 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.816 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 systemd[1]: libpod-conmon-c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec.scope: Deactivated successfully.
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98d8a620-138d-4436-8150-216b1fa2a691', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.811337', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '913790a6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': '832e7d62c2a6762aebeb2850df52abccb67447e79c75a56d08a50eeb87f2e64f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.811337', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '91379ede-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': '65471fdc0378be67a905245feea970167f8420965eee99833394a9abc50e0085'}]}, 'timestamp': '2025-12-05 12:04:35.816570', '_unique_id': 'e18ccf3e61aa46c09cb3848ba4cb370f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.817 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.818 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.821 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.821 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.822 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.824 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.824 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.824 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.825 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.825 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.825 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.read.requests volume: 1126 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.826 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e95b1af-ffa6-476f-be09-9c0da7b69161', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-vda', 'timestamp': '2025-12-05T12:04:35.818566', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '91387516-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': 'b0f39b56c1b4f8c3b8f83cc70b636d0c62a0037452ac1aabf228a421070355f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': 
None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-sda', 'timestamp': '2025-12-05T12:04:35.818566', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9138827c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': 'fb1da038f43802fbb6cd8838ef38af04b83444a8a50184382e62fe23a19e0883'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-vda', 'timestamp': '2025-12-05T12:04:35.818566', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9138da06-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': '6dd4159f3e607e1e6adc8f8b6be263195a33eea10367b726a599991dd9aac5cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-sda', 'timestamp': '2025-12-05T12:04:35.818566', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9138eca8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': '759264bbc5caa215a6aca669ea127d2ceba4dc287c5ad5217a21e82be998c7b1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': 
None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-vda', 'timestamp': '2025-12-05T12:04:35.818566', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9138fba8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': 'aae550505dd6fa9e7bd81080af07c081b001ef4ae6b4c146d635a8cf2228df36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-sda', 'timestamp': '2025-12-05T12:04:35.818566', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, '
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '91390936-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': '8f94b83e066090d23ead7a87c18b6de6527bc12e1ddb98aa2f0e2aab70e6da08'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1126, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-vda', 'timestamp': '2025-12-05T12:04:35.818566', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9139185e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': '5d04a4bb89b652b331e1d7afddb80890de49902532383719ff101f777fe0cf96'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 
'004672c5-70e2-4940-bc9c-8971d94cc037-sda', 'timestamp': '2025-12-05T12:04:35.818566', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '91392740-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': '7e6750eca2688fe7a0805a690eb7e961802d88076177f047d31a14e681d44d88'}]}, 'timestamp': '2025-12-05 12:04:35.826646', '_unique_id': '36d2ebd302c44258abad8796d10b8805'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.827 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.832 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.832 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.832 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.833 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.833 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.833 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.833 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.833 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.834 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.write.latency volume: 5438402210 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.834 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a886a5a7-291d-4cc4-8d1e-3895008f6725', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-vda', 'timestamp': '2025-12-05T12:04:35.831714', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '913a1542-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': '0cc3604377d4cb87d771b5ee670fc6e9d8f8705647e97e468ff1764eda3009b0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 
'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503-sda', 'timestamp': '2025-12-05T12:04:35.831714', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '913a1e20-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.126597822, 'message_signature': '6d41a42928973dad8754c00fb4f122b3be97078d1e25b7638caf6b4fad4d627f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-vda', 'timestamp': '2025-12-05T12:04:35.831714', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '913a3806-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': '4f33e20ca3efbe48e7eddb1d47996bba147993362a5c25b87ce260bbc3746368'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-sda', 'timestamp': '2025-12-05T12:04:35.831714', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '913a4210-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.166394695, 'message_signature': '48445db2ab6b41b77eb61b692030b1c338c24c6465312fc6b775694c9653c9b0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 
'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-vda', 'timestamp': '2025-12-05T12:04:35.831714', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '913a49ae-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': 'f90e3bba8a993ac0301ef69ba8735f666b0e2751f8b6c1b236d11b06e1f66480'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'a7616662-639b-4642-b507-614773f4748f-sda', 'timestamp': '2025-12-05T12:04:35.831714', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'instance-0000002c', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephe
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: meral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '913a519c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.189749657, 'message_signature': '1afae57c6606a3d24b3162fb297f7bdbc48d509057e5f0b0687eb5f32822cfca'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5438402210, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-vda', 'timestamp': '2025-12-05T12:04:35.831714', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '913a58ea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': 'a6e3f25182994a217af618714e22115e07c5e251026f2ae9fc5c2be2a0535761'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037-sda', 
'timestamp': '2025-12-05T12:04:35.831714', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '913a610a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.215735824, 'message_signature': 'c6ddd23c8cfe1b5626a7f5492ff01e2fd083cbaec90c4857474dfca09cdee389'}]}, 'timestamp': '2025-12-05 12:04:35.834594', '_unique_id': 'c8c8dcb725e44f4e8ad48b39cd43efb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.835 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.836 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.837 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.837 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.838 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.838 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42d9b0f4-0bc6-4258-8a52-d231c95b466b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.836498', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '913af41c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': '249c6ce173a8c44a7b51d81bb691007fc134abbbedc6f156a42abce5bec1dc47'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.836498', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '913afff2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': '1c65d13c2eee21af412935a560ee8f75fc9e1b6cc45b5ff596bc08f5e5c1687b'}]}, 'timestamp': '2025-12-05 12:04:35.838713', '_unique_id': 'b526dd98ed094e10a7e45efbe3cb2069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.839 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.840 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.840 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.841 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.842 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.842 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cd9c93f-6cc2-40fa-8b16-c665435b6f7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.840397', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '913b9020-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': 'd8086ae1d1b39683212d3da49061a13141ae65e74bf961737cc54e82e6a6da65'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.840397', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '913b9bec-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': '657ed791d5e7e9a4078520417a1bb9dcff8d9ac2554d3904f02baf66c0ec1cc8'}]}, 'timestamp': '2025-12-05 12:04:35.842705', '_unique_id': '670aa6f30dba455584f1a54050ede458'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.843 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.844 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.845 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.845 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.845 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.846 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.846 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.846 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.846 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] No VIF found with MAC fa:16:3e:2b:db:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.847 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Using config drive
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea2b8f49-45d9-40ca-97bd-585068d420c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.844555', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '913c1cac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': '58c356b0fd1c7499009651fc37db43bf5e905074fd7e262db2a2551185cad393'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.844555', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '913c2aee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': '38509d1dc9b95239f208deb41f1c1b9b7be3efb4b9b8143fe9c5dc29c855653c'}]}, 'timestamp': '2025-12-05 12:04:35.846371', '_unique_id': '646927e9e11b4810afa641a5c58c57b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.847 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.848 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.848 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.848 12 DEBUG ceilometer.compute.pollsters [-] f50947f2-f8d0-4d6b-bca4-b5412a206503/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.849 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.849 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/memory.usage volume: 40.46875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.849 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.849 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance a7616662-639b-4642-b507-614773f4748f: ceilometer.compute.pollsters.NoVolumeException
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.850 12 DEBUG ceilometer.compute.pollsters [-] 004672c5-70e2-4940-bc9c-8971d94cc037/memory.usage volume: 40.4375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b267a6d8-8fab-441a-8f87-89cee5f417c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'timestamp': '2025-12-05T12:04:35.848204', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1806870616', 'name': 'instance-0000002e', 'instance_id': 'f50947f2-f8d0-4d6b-bca4-b5412a206503', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '913c9ca4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.335906118, 'message_signature': '5c79bc5fbc74e337717c2ca6da53055c54fcc4803b09e2a762336728bb879908'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.46875, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 
'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'timestamp': '2025-12-05T12:04:35.848204', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'instance-00000028', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '913cb568-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.350184289, 'message_signature': 'cbe211635337f7978209846eb1075a7f735693aa1cf5f52ac9222bd2b8da685a'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4375, 'user_id': '8456efa356654e5c990efa4aef688e8a', 'user_name': None, 'project_id': '42d9566206cb469ebd803d0600019533', 'project_name': None, 'resource_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'timestamp': '2025-12-05T12:04:35.848204', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-469388429', 'name': 'instance-0000002d', 'instance_id': '004672c5-70e2-4940-bc9c-8971d94cc037', 'instance_type': 'm1.nano', 'host': '512e3e71ce8692561881bff5a1d13a7b051fc6abbd2230027b98b7c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '913cc97c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.396649655, 'message_signature': '048c95d1ae4c1230d797f19214df9d1cebab3dea726d88941a0904325ce3c8e6'}]}, 'timestamp': '2025-12-05 12:04:35.850411', '_unique_id': 'd28c615630344ee9aee6c99254d149d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.851 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.852 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MultipleCreateTestJSON-server-330967889-2>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1806870616>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-64428055>, <NovaLikeServer: tempest-ImagesTestJSON-server-404632133>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1115906111>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-469388429>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MultipleCreateTestJSON-server-330967889-2>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1806870616>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-64428055>, <NovaLikeServer: tempest-ImagesTestJSON-server-404632133>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1115906111>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-469388429>]
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.852 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.852 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000002b, id=58c3288f-57bf-4c62-8d69-9842a22e43d6>: [Error Code 42] Domain not found: no domain with matching uuid '58c3288f-57bf-4c62-8d69-9842a22e43d6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0'
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.853 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000029, id=c2e63727-b45b-4249-a94f-85b0d6314ba0>: [Error Code 42] Domain not found: no domain with matching uuid 'c2e63727-b45b-4249-a94f-85b0d6314ba0' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.853 12 DEBUG ceilometer.compute.pollsters [-] e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.853 12 DEBUG ceilometer.compute.pollsters [-] a7616662-639b-4642-b507-614773f4748f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb6b7425-c5ba-4c7e-b226-0b52d5ccd2d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_name': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_name': None, 'resource_id': 'instance-00000028-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-tapea8794b1-8d', 'timestamp': '2025-12-05T12:04:35.852310', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-404632133', 'name': 'tapea8794b1-8d', 'instance_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'instance_type': 'm1.nano', 'host': '46d70a9fd44d827666e5289f88605389cf1e3cae8a88068b7515609a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:21:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea8794b1-8d'}, 'message_id': '913d43d4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.111505219, 'message_signature': 'c2dea47c162478e0b34eec5c3b59bfdcc141cc1bd92aa91940aa647b3b64fa0c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_name': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_name': None, 'resource_id': 'instance-0000002c-a7616662-639b-4642-b507-614773f4748f-tap539a9707-ef', 'timestamp': '2025-12-05T12:04:35.852310', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1115906111', 'name': 'tap539a9707-ef', 'instance_id': 'a7616662-639b-4642-b507-614773f4748f', 'instance_type': 'm1.nano', 'host': '0ddb7ea4feabfddba184409bfc5216ae07d16736775ae243ba1247dc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:c8:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap539a9707-ef'}, 'message_id': '913d4f96-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3656.117228633, 'message_signature': '791ec7d606dc93b1fdf5f9e8334f2df6a95530dc2b420f096d824e31c8e24065'}]}, 'timestamp': '2025-12-05 12:04:35.853857', '_unique_id': '01ea43645d1440f6b67dc64e1b359683'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:04:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:04:35.854 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:04:35 compute-0 podman[223250]: 2025-12-05 12:04:35.891759843 +0000 UTC m=+0.061958832 container remove c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.895 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[59c1a4bc-fc43-46ee-af97-640835261683]: (4, ('Fri Dec  5 12:04:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 (c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec)\nc4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec\nFri Dec  5 12:04:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 (c4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec)\nc4ea933a5b4bb92928ab7cad1dbbddcea314301b044e9ba14236950acd4858ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.897 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1e844d6b-3e4f-4170-b36e-155f05c77aef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.898 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:35 compute-0 kernel: tapb8ea1ed6-90: left promiscuous mode
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.901 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 nova_compute[187208]: 2025-12-05 12:04:35.916 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.918 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0b4d1ee4-389a-4694-99f6-9b52e84bd7c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.937 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4ea893-9065-44d1-a204-9d0cd21d5ff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.938 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bc1ff742-80ea-4339-a215-422d9d9cd107]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.956 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[efd70005-b226-4f75-aed1-7fa92e395a50]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364440, 'reachable_time': 35222, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223268, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:35 compute-0 systemd[1]: run-netns-ovnmeta\x2db8ea1ed6\x2d9eec\x2d4cb3\x2da2b6\x2d6146b7b65c36.mount: Deactivated successfully.
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.959 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:04:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:35.959 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a32bd4-7182-4088-bd69-77604fc7bdff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.344 187212 DEBUG nova.network.neutron [-] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.362 187212 INFO nova.compute.manager [-] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Took 1.97 seconds to deallocate network for instance.
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.411 187212 DEBUG oslo_concurrency.lockutils [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.411 187212 DEBUG oslo_concurrency.lockutils [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.484 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Updating instance_info_cache with network_info: [{"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.488 187212 DEBUG nova.network.neutron [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Updating instance_info_cache with network_info: [{"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.520 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Releasing lock "refresh_cache-c6a957dd-2181-4e92-9e06-e1a15fe5c307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.521 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Instance network_info: |[{"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.524 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Start _get_guest_xml network_info=[{"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.525 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Releasing lock "refresh_cache-97020786-7ba5-4c8b-8a2c-838c0f663bb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.525 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Instance network_info: |[{"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.527 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Creating config drive at /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk.config
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.532 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3x0ny88r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.555 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Start _get_guest_xml network_info=[{"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.567 187212 WARNING nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.570 187212 WARNING nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.575 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.577 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.577 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.577 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.581 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.582 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.582 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.582 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.583 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.583 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.583 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.584 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.584 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.584 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.584 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.584 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.585 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.585 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.589 187212 DEBUG nova.virt.libvirt.vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-2',id=48,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempe
st-ListServersNegativeTestJSON-1128597959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:30Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=c6a957dd-2181-4e92-9e06-e1a15fe5c307,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.590 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.591 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.592 187212 DEBUG nova.objects.instance [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'pci_devices' on Instance uuid c6a957dd-2181-4e92-9e06-e1a15fe5c307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.593 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.593 187212 DEBUG nova.virt.libvirt.host [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.594 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.594 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.594 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.594 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.594 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.595 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.595 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.595 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.595 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.595 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.596 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.596 187212 DEBUG nova.virt.hardware [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.599 187212 DEBUG nova.virt.libvirt.vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-3',id=49,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:31Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=97020786-7ba5-4c8b-8a2c-838c0f663bb4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.600 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.600 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.601 187212 DEBUG nova.objects.instance [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 97020786-7ba5-4c8b-8a2c-838c0f663bb4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.623 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <uuid>c6a957dd-2181-4e92-9e06-e1a15fe5c307</uuid>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <name>instance-00000030</name>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:name>tempest-ListServersNegativeTestJSON-server-951078504-2</nova:name>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:36</nova:creationTime>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:user uuid="d51d545246e0434591329e386f100a7d">tempest-ListServersNegativeTestJSON-1128597959-project-member</nova:user>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:project uuid="31d3d0a57b064ff6abd01727d4443c0b">tempest-ListServersNegativeTestJSON-1128597959</nova:project>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:port uuid="b7157ade-85e0-4802-8d6a-0dfb86921b3a">
Dec 05 12:04:36 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="serial">c6a957dd-2181-4e92-9e06-e1a15fe5c307</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="uuid">c6a957dd-2181-4e92-9e06-e1a15fe5c307</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk.config"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:ea:04:47"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <target dev="tapb7157ade-85"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/console.log" append="off"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:36 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:36 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.624 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Preparing to wait for external event network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.624 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.624 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.624 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.625 187212 DEBUG nova.virt.libvirt.vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-2',id=48,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:30Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=c6a957dd-2181-4e92-9e06-e1a15fe5c307,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.625 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.626 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.626 187212 DEBUG os_vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.627 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <uuid>97020786-7ba5-4c8b-8a2c-838c0f663bb4</uuid>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <name>instance-00000031</name>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:name>tempest-ListServersNegativeTestJSON-server-951078504-3</nova:name>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:36</nova:creationTime>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:user uuid="d51d545246e0434591329e386f100a7d">tempest-ListServersNegativeTestJSON-1128597959-project-member</nova:user>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:project uuid="31d3d0a57b064ff6abd01727d4443c0b">tempest-ListServersNegativeTestJSON-1128597959</nova:project>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         <nova:port uuid="83fb1d43-a495-47f4-ad3a-569fd7c02c76">
Dec 05 12:04:36 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="serial">97020786-7ba5-4c8b-8a2c-838c0f663bb4</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="uuid">97020786-7ba5-4c8b-8a2c-838c0f663bb4</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk.config"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:41:de:18"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <target dev="tap83fb1d43-a4"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/console.log" append="off"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:36 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:36 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:36 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:36 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:36 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.627 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Preparing to wait for external event network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.627 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.627 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.628 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.628 187212 DEBUG nova.virt.libvirt.vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-3',id=49,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:31Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=97020786-7ba5-4c8b-8a2c-838c0f663bb4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.628 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.629 187212 DEBUG nova.network.os_vif_util [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.629 187212 DEBUG os_vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.630 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.630 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.630 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.631 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.631 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.631 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.633 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.633 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7157ade-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.634 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7157ade-85, col_values=(('external_ids', {'iface-id': 'b7157ade-85e0-4802-8d6a-0dfb86921b3a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:04:47', 'vm-uuid': 'c6a957dd-2181-4e92-9e06-e1a15fe5c307'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:36 compute-0 NetworkManager[55691]: <info>  [1764936276.6360] manager: (tapb7157ade-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.639 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.642 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.642 187212 INFO os_vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85')
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.643 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.644 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83fb1d43-a4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.644 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap83fb1d43-a4, col_values=(('external_ids', {'iface-id': '83fb1d43-a495-47f4-ad3a-569fd7c02c76', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:de:18', 'vm-uuid': '97020786-7ba5-4c8b-8a2c-838c0f663bb4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:36 compute-0 NetworkManager[55691]: <info>  [1764936276.6458] manager: (tap83fb1d43-a4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.647 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.656 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.659 187212 INFO os_vif [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4')
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.668 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3x0ny88r" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.708 187212 DEBUG nova.compute.provider_tree [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:36 compute-0 kernel: tape4966b1a-19: entered promiscuous mode
Dec 05 12:04:36 compute-0 systemd-udevd[223129]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:36 compute-0 NetworkManager[55691]: <info>  [1764936276.7247] manager: (tape4966b1a-19): new Tun device (/org/freedesktop/NetworkManager/Devices/156)
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.728 187212 DEBUG nova.scheduler.client.report [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:36 compute-0 ovn_controller[95610]: 2025-12-05T12:04:36Z|00364|binding|INFO|Claiming lport e4966b1a-1933-4c85-a2a0-6b5a788efd7a for this chassis.
Dec 05 12:04:36 compute-0 ovn_controller[95610]: 2025-12-05T12:04:36Z|00365|binding|INFO|e4966b1a-1933-4c85-a2a0-6b5a788efd7a: Claiming fa:16:3e:2b:db:3c 10.100.0.6
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.734 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 NetworkManager[55691]: <info>  [1764936276.7409] device (tape4966b1a-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:36 compute-0 NetworkManager[55691]: <info>  [1764936276.7427] device (tape4966b1a-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.750 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:db:3c 10.100.0.6'], port_security=['fa:16:3e:2b:db:3c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '10048ac5-1fbc-45e6-aa94-01eff87b9ffc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e4966b1a-1933-4c85-a2a0-6b5a788efd7a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.751 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e4966b1a-1933-4c85-a2a0-6b5a788efd7a in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 bound to our chassis
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.752 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.760 187212 DEBUG oslo_concurrency.lockutils [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.764 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[18a84200-f80c-4045-aa2b-27f6a9790125]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.765 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2dd8ae79-a1 in ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.767 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2dd8ae79-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.767 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5df07317-5304-44f9-b311-09095b1ca0ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 systemd-machined[153543]: New machine qemu-51-instance-0000002f.
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.771 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6da21fad-a81e-407e-a842-b2c7829fb0fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.784 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[71e77786-63ea-44f2-a1e8-917dd9b9ccd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 systemd[1]: Started Virtual Machine qemu-51-instance-0000002f.
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.800 187212 INFO nova.scheduler.client.report [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Deleted allocations for instance 082d2145-1505-4170-9a11-4e46bf86fed2
Dec 05 12:04:36 compute-0 ovn_controller[95610]: 2025-12-05T12:04:36Z|00366|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:04:36 compute-0 ovn_controller[95610]: 2025-12-05T12:04:36Z|00367|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.804 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.804 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.805 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] No VIF found with MAC fa:16:3e:ea:04:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.805 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Using config drive
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.807 187212 DEBUG nova.network.neutron [-] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.810 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.810 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.810 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] No VIF found with MAC fa:16:3e:41:de:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.811 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Using config drive
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.817 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6a00fb8e-baf0-4862-8cff-de02146a4dc3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.836 187212 INFO nova.compute.manager [-] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Took 1.22 seconds to deallocate network for instance.
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.859 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4f9753-6bd1-4e3b-93b6-6c0fc2f0144c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 NetworkManager[55691]: <info>  [1764936276.8916] manager: (tap2dd8ae79-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/157)
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.891 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fab3c976-733f-4797-85ca-5ce1db0ab198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.893 187212 DEBUG oslo_concurrency.lockutils [None req-9cb0c8eb-2aed-4406-8d37-d193ab3acb52 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.896 187212 DEBUG oslo_concurrency.lockutils [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.896 187212 DEBUG oslo_concurrency.lockutils [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:36 compute-0 ovn_controller[95610]: 2025-12-05T12:04:36Z|00368|binding|INFO|Setting lport e4966b1a-1933-4c85-a2a0-6b5a788efd7a ovn-installed in OVS
Dec 05 12:04:36 compute-0 ovn_controller[95610]: 2025-12-05T12:04:36Z|00369|binding|INFO|Setting lport e4966b1a-1933-4c85-a2a0-6b5a788efd7a up in Southbound
Dec 05 12:04:36 compute-0 nova_compute[187208]: 2025-12-05 12:04:36.946 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.954 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[be31db88-b476-4cd4-8e45-5392803afa81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.957 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d827ea1b-e2f8-4e4b-be22-2ec1792c2ca8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:36 compute-0 NetworkManager[55691]: <info>  [1764936276.9788] device (tap2dd8ae79-a0): carrier: link connected
Dec 05 12:04:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.983 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3a5fa663-9f80-41fe-befa-87ab583f6b11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:36.999 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f133686e-09e5-4ae8-9c1e-06f8a3b60782]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365754, 'reachable_time': 25183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223366, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.014 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a809622a-50e3-4c4a-8d95-317c07534abc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:dbed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365754, 'tstamp': 365754}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223375, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.031 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ca514036-93e7-4de7-8b73-224b05ef8070]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365754, 'reachable_time': 25183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223376, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.064 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb31dbd-3aef-4fa8-af97-f7326782720b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.081 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.081 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.103 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936277.0967088, 10048ac5-1fbc-45e6-aa94-01eff87b9ffc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.104 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] VM Started (Lifecycle Event)
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.134 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.135 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.135 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.138 187212 DEBUG nova.compute.provider_tree [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.142 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936277.0967984, 10048ac5-1fbc-45e6-aa94-01eff87b9ffc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.142 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] VM Paused (Lifecycle Event)
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.149 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed672e9-a109-4b2c-a600-1e67a4cad2f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.151 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.151 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.152 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dd8ae79-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:37 compute-0 NetworkManager[55691]: <info>  [1764936277.1546] manager: (tap2dd8ae79-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Dec 05 12:04:37 compute-0 kernel: tap2dd8ae79-a0: entered promiscuous mode
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.153 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.157 187212 DEBUG nova.network.neutron [-] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.159 187212 DEBUG nova.scheduler.client.report [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.162 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2dd8ae79-a0, col_values=(('external_ids', {'iface-id': '96c6c9a6-c871-4fab-9fdc-eedbdd230979'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:37 compute-0 ovn_controller[95610]: 2025-12-05T12:04:37Z|00370|binding|INFO|Releasing lport 96c6c9a6-c871-4fab-9fdc-eedbdd230979 from this chassis (sb_readonly=0)
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.179 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.182 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.187 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.187 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.188 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.188 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[34fcc2e9-9ad4-4790-80c8-00b1d1428c4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.189 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.pid.haproxy
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.191 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'env', 'PROCESS_TAG=haproxy-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.209 187212 DEBUG oslo_concurrency.lockutils [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.211 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.214 187212 INFO nova.compute.manager [-] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Took 1.59 seconds to deallocate network for instance.
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.232 187212 INFO nova.scheduler.client.report [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Deleted allocations for instance 58c3288f-57bf-4c62-8d69-9842a22e43d6
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.240 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Creating config drive at /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk.config
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.244 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc2mzb4sc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.278 187212 INFO nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Creating config drive at /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk.config
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.285 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzcsdgqy7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.311 187212 DEBUG oslo_concurrency.lockutils [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.312 187212 DEBUG oslo_concurrency.lockutils [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.367 187212 DEBUG oslo_concurrency.lockutils [None req-88e58ad4-3443-420a-96b1-53042c3098a2 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.381 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc2mzb4sc" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.418 187212 DEBUG oslo_concurrency.processutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzcsdgqy7" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:37 compute-0 kernel: tap83fb1d43-a4: entered promiscuous mode
Dec 05 12:04:37 compute-0 NetworkManager[55691]: <info>  [1764936277.4590] manager: (tap83fb1d43-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Dec 05 12:04:37 compute-0 ovn_controller[95610]: 2025-12-05T12:04:37Z|00371|binding|INFO|Claiming lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 for this chassis.
Dec 05 12:04:37 compute-0 ovn_controller[95610]: 2025-12-05T12:04:37Z|00372|binding|INFO|83fb1d43-a495-47f4-ad3a-569fd7c02c76: Claiming fa:16:3e:41:de:18 10.100.0.14
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.464 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.467 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:de:18 10.100.0.14'], port_security=['fa:16:3e:41:de:18 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '97020786-7ba5-4c8b-8a2c-838c0f663bb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=83fb1d43-a495-47f4-ad3a-569fd7c02c76) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:37 compute-0 ovn_controller[95610]: 2025-12-05T12:04:37Z|00373|binding|INFO|Setting lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 ovn-installed in OVS
Dec 05 12:04:37 compute-0 ovn_controller[95610]: 2025-12-05T12:04:37Z|00374|binding|INFO|Setting lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 up in Southbound
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.481 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 NetworkManager[55691]: <info>  [1764936277.5012] manager: (tapb7157ade-85): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Dec 05 12:04:37 compute-0 systemd-udevd[223441]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:37 compute-0 kernel: tapb7157ade-85: entered promiscuous mode
Dec 05 12:04:37 compute-0 systemd-udevd[223443]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:37 compute-0 ovn_controller[95610]: 2025-12-05T12:04:37Z|00375|binding|INFO|Claiming lport b7157ade-85e0-4802-8d6a-0dfb86921b3a for this chassis.
Dec 05 12:04:37 compute-0 ovn_controller[95610]: 2025-12-05T12:04:37Z|00376|binding|INFO|b7157ade-85e0-4802-8d6a-0dfb86921b3a: Claiming fa:16:3e:ea:04:47 10.100.0.5
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.505 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.515 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:04:47 10.100.0.5'], port_security=['fa:16:3e:ea:04:47 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c6a957dd-2181-4e92-9e06-e1a15fe5c307', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b7157ade-85e0-4802-8d6a-0dfb86921b3a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:37 compute-0 systemd-machined[153543]: New machine qemu-52-instance-00000031.
Dec 05 12:04:37 compute-0 ovn_controller[95610]: 2025-12-05T12:04:37Z|00377|binding|INFO|Setting lport b7157ade-85e0-4802-8d6a-0dfb86921b3a ovn-installed in OVS
Dec 05 12:04:37 compute-0 ovn_controller[95610]: 2025-12-05T12:04:37Z|00378|binding|INFO|Setting lport b7157ade-85e0-4802-8d6a-0dfb86921b3a up in Southbound
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.520 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 NetworkManager[55691]: <info>  [1764936277.5216] device (tapb7157ade-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.523 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 systemd[1]: Started Virtual Machine qemu-52-instance-00000031.
Dec 05 12:04:37 compute-0 NetworkManager[55691]: <info>  [1764936277.5245] device (tapb7157ade-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:37 compute-0 NetworkManager[55691]: <info>  [1764936277.5252] device (tap83fb1d43-a4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:37 compute-0 NetworkManager[55691]: <info>  [1764936277.5261] device (tap83fb1d43-a4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.528 187212 DEBUG nova.compute.provider_tree [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.544 187212 DEBUG nova.scheduler.client.report [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:37 compute-0 systemd-machined[153543]: New machine qemu-53-instance-00000030.
Dec 05 12:04:37 compute-0 systemd[1]: Started Virtual Machine qemu-53-instance-00000030.
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.565 187212 DEBUG oslo_concurrency.lockutils [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.581 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.593 187212 INFO nova.scheduler.client.report [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Deleted allocations for instance c2e63727-b45b-4249-a94f-85b0d6314ba0
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.596 187212 DEBUG nova.network.neutron [req-d18f5097-6af4-4f7a-a17c-57c7760d85b6 req-6529407e-88c6-46b0-9713-66e2cc00f387 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Updated VIF entry in instance network info cache for port e4966b1a-1933-4c85-a2a0-6b5a788efd7a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.597 187212 DEBUG nova.network.neutron [req-d18f5097-6af4-4f7a-a17c-57c7760d85b6 req-6529407e-88c6-46b0-9713-66e2cc00f387 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Updating instance_info_cache with network_info: [{"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.614 187212 DEBUG oslo_concurrency.lockutils [req-d18f5097-6af4-4f7a-a17c-57c7760d85b6 req-6529407e-88c6-46b0-9713-66e2cc00f387 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-10048ac5-1fbc-45e6-aa94-01eff87b9ffc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:37 compute-0 podman[223454]: 2025-12-05 12:04:37.653163441 +0000 UTC m=+0.080746282 container create 787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.676 187212 DEBUG oslo_concurrency.lockutils [None req-af84f0f4-791a-4043-b6cc-4f5a1dc8f294 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:37 compute-0 systemd[1]: Started libpod-conmon-787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da.scope.
Dec 05 12:04:37 compute-0 podman[223454]: 2025-12-05 12:04:37.608340222 +0000 UTC m=+0.035923093 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:04:37 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:04:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc61896ea332a9ba7458de04edb53befc42cb9ab21a97cb2cedca198eb888fcf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:04:37 compute-0 podman[223454]: 2025-12-05 12:04:37.734051956 +0000 UTC m=+0.161634817 container init 787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 05 12:04:37 compute-0 podman[223454]: 2025-12-05 12:04:37.740338377 +0000 UTC m=+0.167921218 container start 787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 05 12:04:37 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [NOTICE]   (223483) : New worker (223485) forked
Dec 05 12:04:37 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [NOTICE]   (223483) : Loading success.
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.805 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 83fb1d43-a495-47f4-ad3a-569fd7c02c76 in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 unbound from our chassis
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.808 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.829 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[361cf095-33ad-4dee-934e-430b8002e3a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.890 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e3e41f-e099-411b-93eb-123251e2566d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.893 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1f008211-ba33-4fb5-84b4-8acac531af92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.921 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f97cedb6-e486-4f7f-b53d-89c9e6217b4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.935 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a43810d2-0eef-4c8a-a96f-f4eac34bb6e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365754, 'reachable_time': 25183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223506, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.945 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936277.9447007, c6a957dd-2181-4e92-9e06-e1a15fe5c307 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.945 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] VM Started (Lifecycle Event)
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.950 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fd1c1bc2-f99b-401d-9894-e6097b1382e3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365765, 'tstamp': 365765}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223507, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365769, 'tstamp': 365769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223507, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.951 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.993 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.993 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.994 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dd8ae79-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.994 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.995 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2dd8ae79-a0, col_values=(('external_ids', {'iface-id': '96c6c9a6-c871-4fab-9fdc-eedbdd230979'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.995 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.997 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b7157ade-85e0-4802-8d6a-0dfb86921b3a in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 unbound from our chassis
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.998 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936277.9455109, c6a957dd-2181-4e92-9e06-e1a15fe5c307 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:37 compute-0 nova_compute[187208]: 2025-12-05 12:04:37.998 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] VM Paused (Lifecycle Event)
Dec 05 12:04:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:37.999 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.014 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.015 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[79786158-f2ee-4af7-8256-da4066c5d5be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.019 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.036 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.042 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[88e6e53b-6dc8-496f-a537-458a7899c183]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.047 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[02049a78-f6c9-48c1-8707-9080327800e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.076 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[421fef53-d68e-40d4-8f1f-fd0d184904c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.099 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d0bec51d-8e38-431b-9cfb-7808b2b9253e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 7, 'rx_bytes': 266, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 7, 'rx_bytes': 266, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365754, 'reachable_time': 25183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223520, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.110 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936278.1096356, 97020786-7ba5-4c8b-8a2c-838c0f663bb4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.110 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] VM Started (Lifecycle Event)
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.115 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[14f73b75-279c-41fc-bdd2-8ddfaa83b8bb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365765, 'tstamp': 365765}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223521, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365769, 'tstamp': 365769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223521, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.117 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.119 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.120 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dd8ae79-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.120 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.120 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2dd8ae79-a0, col_values=(('external_ids', {'iface-id': '96c6c9a6-c871-4fab-9fdc-eedbdd230979'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:38.120 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.128 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.132 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936278.1117687, 97020786-7ba5-4c8b-8a2c-838c0f663bb4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.132 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] VM Paused (Lifecycle Event)
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.151 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.153 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.186 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:38 compute-0 ovn_controller[95610]: 2025-12-05T12:04:38Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:21:a9 10.100.0.3
Dec 05 12:04:38 compute-0 ovn_controller[95610]: 2025-12-05T12:04:38Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:21:a9 10.100.0.3
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.469 187212 DEBUG nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-changed-b7157ade-85e0-4802-8d6a-0dfb86921b3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.470 187212 DEBUG nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Refreshing instance network info cache due to event network-changed-b7157ade-85e0-4802-8d6a-0dfb86921b3a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.470 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-c6a957dd-2181-4e92-9e06-e1a15fe5c307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.471 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-c6a957dd-2181-4e92-9e06-e1a15fe5c307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.471 187212 DEBUG nova.network.neutron [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Refreshing network info cache for port b7157ade-85e0-4802-8d6a-0dfb86921b3a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.896 187212 DEBUG nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Received event network-vif-unplugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.896 187212 DEBUG oslo_concurrency.lockutils [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.897 187212 DEBUG oslo_concurrency.lockutils [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.897 187212 DEBUG oslo_concurrency.lockutils [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.897 187212 DEBUG nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] No waiting events found dispatching network-vif-unplugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.897 187212 WARNING nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Received unexpected event network-vif-unplugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 for instance with vm_state deleted and task_state None.
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.898 187212 DEBUG nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received event network-vif-deleted-d7b765ff-93e1-4594-9e3c-e177dee2e07b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.898 187212 DEBUG nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Received event network-vif-plugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.898 187212 DEBUG oslo_concurrency.lockutils [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.898 187212 DEBUG oslo_concurrency.lockutils [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.898 187212 DEBUG oslo_concurrency.lockutils [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.899 187212 DEBUG nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] No waiting events found dispatching network-vif-plugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.899 187212 WARNING nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Received unexpected event network-vif-plugged-eabadaa6-16c4-434c-83ea-96dfa62d7f79 for instance with vm_state deleted and task_state None.
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.899 187212 DEBUG nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Received event network-vif-deleted-eabadaa6-16c4-434c-83ea-96dfa62d7f79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.899 187212 DEBUG nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received event network-vif-deleted-656f63d2-77f9-46f7-9338-81bc5a056ad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.899 187212 DEBUG nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Received event network-vif-plugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.900 187212 DEBUG oslo_concurrency.lockutils [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.900 187212 DEBUG oslo_concurrency.lockutils [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.900 187212 DEBUG oslo_concurrency.lockutils [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.900 187212 DEBUG nova.compute.manager [req-cb7c2810-ec5c-470b-b407-3377c6550e7b req-54cc8ce6-aa35-44fe-bff5-85f350c408ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Processing event network-vif-plugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.901 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.905 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936278.9049475, 10048ac5-1fbc-45e6-aa94-01eff87b9ffc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.905 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] VM Resumed (Lifecycle Event)
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.926 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.927 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.930 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.932 187212 INFO nova.virt.libvirt.driver [-] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Instance spawned successfully.
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.932 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.954 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.959 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.959 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.960 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.960 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.960 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:38 compute-0 nova_compute[187208]: 2025-12-05 12:04:38.961 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.025 187212 INFO nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Took 9.14 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.025 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.096 187212 INFO nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Took 10.20 seconds to build instance.
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.116 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.266 187212 DEBUG oslo_concurrency.lockutils [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.266 187212 DEBUG oslo_concurrency.lockutils [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.267 187212 DEBUG oslo_concurrency.lockutils [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.267 187212 DEBUG oslo_concurrency.lockutils [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.267 187212 DEBUG oslo_concurrency.lockutils [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.268 187212 INFO nova.compute.manager [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Terminating instance
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.269 187212 DEBUG nova.compute.manager [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:04:39 compute-0 kernel: tap539a9707-ef (unregistering): left promiscuous mode
Dec 05 12:04:39 compute-0 NetworkManager[55691]: <info>  [1764936279.2845] device (tap539a9707-ef): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.291 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:39 compute-0 ovn_controller[95610]: 2025-12-05T12:04:39Z|00379|binding|INFO|Releasing lport 539a9707-ef82-4c64-aec4-3759222680f0 from this chassis (sb_readonly=0)
Dec 05 12:04:39 compute-0 ovn_controller[95610]: 2025-12-05T12:04:39Z|00380|binding|INFO|Setting lport 539a9707-ef82-4c64-aec4-3759222680f0 down in Southbound
Dec 05 12:04:39 compute-0 ovn_controller[95610]: 2025-12-05T12:04:39Z|00381|binding|INFO|Removing iface tap539a9707-ef ovn-installed in OVS
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.295 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.303 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:39 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Dec 05 12:04:39 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000002c.scope: Consumed 11.166s CPU time.
Dec 05 12:04:39 compute-0 systemd-machined[153543]: Machine qemu-45-instance-0000002c terminated.
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.413 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:c8:06 10.100.0.4'], port_security=['fa:16:3e:d2:c8:06 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a7616662-639b-4642-b507-614773f4748f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d064000-316c-46a7-a23c-1dc26318b6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79895287bd1d488c842f6013729a1f81', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e1ec2415-6840-4cf9-b5ac-efaf1a9c9a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3804b014-203a-4c47-b0bb-7634579c4ec4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=539a9707-ef82-4c64-aec4-3759222680f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.414 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 539a9707-ef82-4c64-aec4-3759222680f0 in datapath 5d064000-316c-46a7-a23c-1dc26318b6a4 unbound from our chassis
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.416 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d064000-316c-46a7-a23c-1dc26318b6a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.417 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[55e56281-21bd-426b-93e4-c57d752d3450]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.418 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 namespace which is not needed anymore
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.543 187212 INFO nova.virt.libvirt.driver [-] [instance: a7616662-639b-4642-b507-614773f4748f] Instance destroyed successfully.
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.544 187212 DEBUG nova.objects.instance [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'resources' on Instance uuid a7616662-639b-4642-b507-614773f4748f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:39 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[222398]: [NOTICE]   (222402) : haproxy version is 2.8.14-c23fe91
Dec 05 12:04:39 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[222398]: [NOTICE]   (222402) : path to executable is /usr/sbin/haproxy
Dec 05 12:04:39 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[222398]: [WARNING]  (222402) : Exiting Master process...
Dec 05 12:04:39 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[222398]: [ALERT]    (222402) : Current worker (222404) exited with code 143 (Terminated)
Dec 05 12:04:39 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[222398]: [WARNING]  (222402) : All workers exited. Exiting... (0)
Dec 05 12:04:39 compute-0 systemd[1]: libpod-4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc.scope: Deactivated successfully.
Dec 05 12:04:39 compute-0 podman[223554]: 2025-12-05 12:04:39.585952185 +0000 UTC m=+0.049982028 container died 4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:04:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc-userdata-shm.mount: Deactivated successfully.
Dec 05 12:04:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-95fa845c032e9a9b1ecabac8ab36f6e4863c974067db1dae8ab549ec5cbc0437-merged.mount: Deactivated successfully.
Dec 05 12:04:39 compute-0 podman[223554]: 2025-12-05 12:04:39.630971799 +0000 UTC m=+0.095001642 container cleanup 4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:04:39 compute-0 systemd[1]: libpod-conmon-4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc.scope: Deactivated successfully.
Dec 05 12:04:39 compute-0 podman[223589]: 2025-12-05 12:04:39.69952731 +0000 UTC m=+0.041629788 container remove 4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.704 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f519096-371c-4740-a772-e19f0ac22763]: (4, ('Fri Dec  5 12:04:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 (4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc)\n4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc\nFri Dec  5 12:04:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 (4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc)\n4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.706 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cab57377-9370-42cc-b57b-91f5104973c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.706 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d064000-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:39 compute-0 kernel: tap5d064000-30: left promiscuous mode
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.710 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.736 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.740 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3d20eb-fb54-4126-be8a-039ce8594e5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.745 187212 DEBUG nova.virt.libvirt.vif [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1115906111',id=44,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:04:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-jo4hn4lg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:35Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=a7616662-639b-4642-b507-614773f4748f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.746 187212 DEBUG nova.network.os_vif_util [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.747 187212 DEBUG nova.network.os_vif_util [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.748 187212 DEBUG os_vif [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.750 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3d90b46e-3be8-4592-9333-243c0f06a749]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.751 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[377cbab7-5e70-4ceb-bcb1-d3845e882128]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.754 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap539a9707-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.762 187212 INFO os_vif [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef')
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.763 187212 INFO nova.virt.libvirt.driver [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Deleting instance files /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f_del
Dec 05 12:04:39 compute-0 nova_compute[187208]: 2025-12-05 12:04:39.764 187212 INFO nova.virt.libvirt.driver [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Deletion of /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f_del complete
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.770 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a9025f43-a957-41c0-bd17-8bb364f7280d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363781, 'reachable_time': 25276, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223606, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d5d064000\x2d316c\x2d46a7\x2da23c\x2d1dc26318b6a4.mount: Deactivated successfully.
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.774 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:04:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:39.774 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0a1fa5-1270-4ae1-9039-4a0b1c4643c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.713 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.714 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.714 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.714 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.799 187212 INFO nova.compute.manager [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Took 1.53 seconds to destroy the instance on the hypervisor.
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.800 187212 DEBUG oslo.service.loopingcall [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.800 187212 DEBUG nova.compute.manager [-] [instance: a7616662-639b-4642-b507-614773f4748f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.800 187212 DEBUG nova.network.neutron [-] [instance: a7616662-639b-4642-b507-614773f4748f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.812 187212 DEBUG nova.network.neutron [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Updated VIF entry in instance network info cache for port b7157ade-85e0-4802-8d6a-0dfb86921b3a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.812 187212 DEBUG nova.network.neutron [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Updating instance_info_cache with network_info: [{"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:40 compute-0 podman[223609]: 2025-12-05 12:04:40.819735544 +0000 UTC m=+0.053085557 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.852 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.875 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-c6a957dd-2181-4e92-9e06-e1a15fe5c307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.876 187212 DEBUG nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received event network-vif-unplugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.876 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.876 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.877 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.877 187212 DEBUG nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] No waiting events found dispatching network-vif-unplugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.877 187212 WARNING nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received unexpected event network-vif-unplugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b for instance with vm_state deleted and task_state None.
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.877 187212 DEBUG nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received event network-changed-83fb1d43-a495-47f4-ad3a-569fd7c02c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.878 187212 DEBUG nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Refreshing instance network info cache due to event network-changed-83fb1d43-a495-47f4-ad3a-569fd7c02c76. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.878 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-97020786-7ba5-4c8b-8a2c-838c0f663bb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.878 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-97020786-7ba5-4c8b-8a2c-838c0f663bb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.878 187212 DEBUG nova.network.neutron [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Refreshing network info cache for port 83fb1d43-a495-47f4-ad3a-569fd7c02c76 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.914 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.915 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.939 187212 DEBUG nova.compute.manager [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.944 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.945 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.966 187212 DEBUG nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.973 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:40 compute-0 nova_compute[187208]: 2025-12-05 12:04:40.979 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.018 187212 INFO nova.compute.manager [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] instance snapshotting
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.038 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.039 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.073 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.074 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.089 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.089 187212 INFO nova.compute.claims [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.100 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.105 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:41.117 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:41.118 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.172 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.172 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.240 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.246 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.315 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.316 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.382 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.387 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.445 187212 DEBUG nova.compute.provider_tree [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.449 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.449 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.472 187212 DEBUG nova.scheduler.client.report [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.503 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.504 187212 DEBUG nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.508 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.510 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Error from libvirt while getting description of instance-0000002c: [Error Code 42] Domain not found: no domain with matching uuid 'a7616662-639b-4642-b507-614773f4748f' (instance-0000002c): libvirt.libvirtError: Domain not found: no domain with matching uuid 'a7616662-639b-4642-b507-614773f4748f' (instance-0000002c)
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.514 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.567 187212 DEBUG nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.568 187212 DEBUG nova.network.neutron [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.577 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.578 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.600 187212 INFO nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.623 187212 DEBUG nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.639 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.722 187212 DEBUG nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.723 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.724 187212 INFO nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Creating image(s)
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.724 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.724 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.725 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.726 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "3f7693dca94777de01080bcebdbe8d46d5f07f15" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.726 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "3f7693dca94777de01080bcebdbe8d46d5f07f15" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.770 187212 INFO nova.virt.libvirt.driver [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Beginning live snapshot process
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.856 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.857 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4792MB free_disk=73.20376586914062GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.857 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.857 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:41 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.939 187212 DEBUG nova.policy [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.942 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.971 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.972 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance a7616662-639b-4642-b507-614773f4748f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.972 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 004672c5-70e2-4940-bc9c-8971d94cc037 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.972 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance f50947f2-f8d0-4d6b-bca4-b5412a206503 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.972 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 10048ac5-1fbc-45e6-aa94-01eff87b9ffc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.973 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance c6a957dd-2181-4e92-9e06-e1a15fe5c307 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.973 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 97020786-7ba5-4c8b-8a2c-838c0f663bb4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.973 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.973 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 8 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:04:41 compute-0 nova_compute[187208]: 2025-12-05 12:04:41.974 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1536MB phys_disk=79GB used_disk=8GB total_vcpus=8 used_vcpus=8 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.003 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json -f qcow2" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.004 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.066 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.082 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.147 187212 DEBUG nova.network.neutron [-] [instance: a7616662-639b-4642-b507-614773f4748f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.149 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.150 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp6042ldnt/62fa4d75a2b143faafbd93625417e1a6.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.174 187212 INFO nova.compute.manager [-] [instance: a7616662-639b-4642-b507-614773f4748f] Took 1.37 seconds to deallocate network for instance.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.237 187212 DEBUG oslo_concurrency.lockutils [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.322 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.344 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp6042ldnt/62fa4d75a2b143faafbd93625417e1a6.delta 1073741824" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.345 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.349 187212 INFO nova.virt.libvirt.driver [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.387 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.387 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.530s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.388 187212 DEBUG oslo_concurrency.lockutils [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.623 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.629 187212 DEBUG nova.network.neutron [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Successfully created port: 009ccacf-51c2-430a-9da2-4a4b522861e6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.633 187212 DEBUG nova.compute.manager [req-82f1f59b-90d7-4579-93cc-00bdb399ad99 req-0c66f890-f1b9-4211-9777-50091cd3b7e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received event network-vif-unplugged-539a9707-ef82-4c64-aec4-3759222680f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.633 187212 DEBUG oslo_concurrency.lockutils [req-82f1f59b-90d7-4579-93cc-00bdb399ad99 req-0c66f890-f1b9-4211-9777-50091cd3b7e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.633 187212 DEBUG oslo_concurrency.lockutils [req-82f1f59b-90d7-4579-93cc-00bdb399ad99 req-0c66f890-f1b9-4211-9777-50091cd3b7e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.633 187212 DEBUG oslo_concurrency.lockutils [req-82f1f59b-90d7-4579-93cc-00bdb399ad99 req-0c66f890-f1b9-4211-9777-50091cd3b7e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.633 187212 DEBUG nova.compute.manager [req-82f1f59b-90d7-4579-93cc-00bdb399ad99 req-0c66f890-f1b9-4211-9777-50091cd3b7e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] No waiting events found dispatching network-vif-unplugged-539a9707-ef82-4c64-aec4-3759222680f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.634 187212 WARNING nova.compute.manager [req-82f1f59b-90d7-4579-93cc-00bdb399ad99 req-0c66f890-f1b9-4211-9777-50091cd3b7e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received unexpected event network-vif-unplugged-539a9707-ef82-4c64-aec4-3759222680f0 for instance with vm_state deleted and task_state None.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.635 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Received event network-vif-plugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.635 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.636 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.636 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.636 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] No waiting events found dispatching network-vif-plugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.636 187212 WARNING nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Received unexpected event network-vif-plugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a for instance with vm_state active and task_state None.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.636 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received event network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.637 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.637 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.637 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.637 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Processing event network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.637 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received event network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.638 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.638 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.638 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.638 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] No waiting events found dispatching network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.638 187212 WARNING nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received unexpected event network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 for instance with vm_state building and task_state spawning.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.639 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.639 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.639 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.639 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.639 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Processing event network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.640 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.640 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.640 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.640 187212 DEBUG oslo_concurrency.lockutils [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.640 187212 DEBUG nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] No waiting events found dispatching network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.640 187212 WARNING nova.compute.manager [req-39d5e6f5-cb8e-4dfc-963d-6d23102e7d4b req-40b7296e-1cdf-419a-9c2c-521c5153bc0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received unexpected event network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a for instance with vm_state building and task_state spawning.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.644 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.644 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.646 187212 DEBUG nova.virt.libvirt.guest [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] COPY block job progress, current cursor: 10682368 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.648 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936282.6483042, c6a957dd-2181-4e92-9e06-e1a15fe5c307 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.648 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] VM Resumed (Lifecycle Event)
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.651 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.654 187212 INFO nova.virt.libvirt.driver [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Instance spawned successfully.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.655 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.657 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.665 187212 INFO nova.virt.libvirt.driver [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Instance spawned successfully.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.666 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.675 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.681 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.684 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.685 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.685 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.686 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.686 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.686 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.699 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.700 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.700 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.701 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.701 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.701 187212 DEBUG nova.virt.libvirt.driver [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.740 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.741 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936282.6491756, 97020786-7ba5-4c8b-8a2c-838c0f663bb4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.741 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] VM Resumed (Lifecycle Event)
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.794 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.797 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.824 187212 INFO nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Took 12.07 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.825 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.826 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.830 187212 INFO nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Took 11.23 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.830 187212 DEBUG nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.837 187212 DEBUG nova.compute.provider_tree [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.872 187212 DEBUG nova.scheduler.client.report [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.903 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.925 187212 DEBUG oslo_concurrency.lockutils [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.938 187212 INFO nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Took 13.92 seconds to build instance.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.940 187212 INFO nova.compute.manager [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Took 13.87 seconds to build instance.
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.959 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.960 187212 DEBUG oslo_concurrency.lockutils [None req-6540fdd8-eae0-48f6-a4de-05c650139a20 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.963 187212 INFO nova.scheduler.client.report [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Deleted allocations for instance a7616662-639b-4642-b507-614773f4748f
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.966 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15.part --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.967 187212 DEBUG nova.virt.images [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] 71fb11d9-b174-4d79-8db5-59ad67ae02ef was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.967 187212 DEBUG nova.privsep.utils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.968 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15.part /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.990 187212 DEBUG nova.network.neutron [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Updated VIF entry in instance network info cache for port 83fb1d43-a495-47f4-ad3a-569fd7c02c76. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:42 compute-0 nova_compute[187208]: 2025-12-05 12:04:42.990 187212 DEBUG nova.network.neutron [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Updating instance_info_cache with network_info: [{"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.024 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-97020786-7ba5-4c8b-8a2c-838c0f663bb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.025 187212 DEBUG nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received event network-vif-plugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.025 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.026 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.026 187212 DEBUG oslo_concurrency.lockutils [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.026 187212 DEBUG nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] No waiting events found dispatching network-vif-plugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.027 187212 WARNING nova.compute.manager [req-73dc98ad-924c-467b-b4ef-2926ff4f56d5 req-d4e6430e-2486-4cfc-a07b-5c669b966312 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received unexpected event network-vif-plugged-d7b765ff-93e1-4594-9e3c-e177dee2e07b for instance with vm_state deleted and task_state None.
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.055 187212 DEBUG oslo_concurrency.lockutils [None req-1b768fe9-eea4-4856-a12f-bd21b329c1db 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.789s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.144 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15.part /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15.converted" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.149 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.174 187212 DEBUG nova.virt.libvirt.guest [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] COPY block job progress, current cursor: 75235328 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.178 187212 INFO nova.virt.libvirt.driver [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.215 187212 DEBUG nova.privsep.utils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.215 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp6042ldnt/62fa4d75a2b143faafbd93625417e1a6.delta /var/lib/nova/instances/snapshots/tmp6042ldnt/62fa4d75a2b143faafbd93625417e1a6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.235 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15.converted --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.236 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "3f7693dca94777de01080bcebdbe8d46d5f07f15" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.251 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.341 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.343 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "3f7693dca94777de01080bcebdbe8d46d5f07f15" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.343 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "3f7693dca94777de01080bcebdbe8d46d5f07f15" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.360 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.386 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.386 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.386 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.387 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.442 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.443 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15,backing_fmt=raw /var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.616 187212 DEBUG nova.network.neutron [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Successfully updated port: 009ccacf-51c2-430a-9da2-4a4b522861e6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.628 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.629 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.629 187212 DEBUG nova.network.neutron [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.693 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15,backing_fmt=raw /var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk 1073741824" returned: 0 in 0.251s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.694 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "3f7693dca94777de01080bcebdbe8d46d5f07f15" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.695 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.718 187212 DEBUG oslo_concurrency.processutils [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp6042ldnt/62fa4d75a2b143faafbd93625417e1a6.delta /var/lib/nova/instances/snapshots/tmp6042ldnt/62fa4d75a2b143faafbd93625417e1a6" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.726 187212 INFO nova.virt.libvirt.driver [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Snapshot extracted, beginning image upload
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.760 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.760 187212 DEBUG nova.objects.instance [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.768 187212 DEBUG nova.network.neutron [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.780 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.780 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Ensure instance console log exists: /var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.781 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.782 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:43 compute-0 nova_compute[187208]: 2025-12-05 12:04:43.783 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.097 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.098 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.117 187212 DEBUG nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.186 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.187 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.197 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.197 187212 INFO nova.compute.claims [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.486 187212 DEBUG nova.compute.provider_tree [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.509 187212 DEBUG nova.scheduler.client.report [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.536 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.536 187212 DEBUG nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.625 187212 DEBUG nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.625 187212 DEBUG nova.network.neutron [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.654 187212 INFO nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.679 187212 DEBUG nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.758 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.785 187212 DEBUG nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.786 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.787 187212 INFO nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Creating image(s)
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.788 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "/var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.788 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.789 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.804 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.866 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.868 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.868 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.886 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.958 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:44 compute-0 nova_compute[187208]: 2025-12-05 12:04:44.958 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.000 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.002 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.002 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.061 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.064 187212 DEBUG nova.virt.disk.api [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Checking if we can resize image /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.064 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.122 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.123 187212 DEBUG nova.virt.disk.api [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Cannot resize image /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.124 187212 DEBUG nova.objects.instance [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'migration_context' on Instance uuid 21873f07-a1da-4158-a5b2-1d44d547874e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.141 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.142 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Ensure instance console log exists: /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.142 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.143 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.143 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.309 187212 DEBUG nova.policy [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4671f6c82ea049fab3a314ecf45b7656', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.578 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936270.5633159, c2e63727-b45b-4249-a94f-85b0d6314ba0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.579 187212 INFO nova.compute.manager [-] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] VM Stopped (Lifecycle Event)
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.608 187212 DEBUG nova.network.neutron [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Updating instance_info_cache with network_info: [{"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.612 187212 DEBUG nova.compute.manager [None req-a9a859a7-898d-4e8c-a7a1-46e31b10a26a - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.637 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.637 187212 DEBUG nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Instance network_info: |[{"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.641 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Start _get_guest_xml network_info=[{"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-12-05T12:04:28Z,direct_url=<?>,disk_format='qcow2',id=71fb11d9-b174-4d79-8db5-59ad67ae02ef,min_disk=1,min_ram=0,name='tempest-test-snap-348196468',owner='43e63f5c6b0f4840ad4df23fb5c10764',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-12-05T12:04:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': '71fb11d9-b174-4d79-8db5-59ad67ae02ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.645 187212 WARNING nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.651 187212 DEBUG nova.compute.manager [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received event network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.651 187212 DEBUG oslo_concurrency.lockutils [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.652 187212 DEBUG oslo_concurrency.lockutils [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.652 187212 DEBUG oslo_concurrency.lockutils [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.652 187212 DEBUG nova.compute.manager [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] No waiting events found dispatching network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.652 187212 WARNING nova.compute.manager [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received unexpected event network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 for instance with vm_state deleted and task_state None.
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.652 187212 DEBUG nova.compute.manager [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Received event network-changed-009ccacf-51c2-430a-9da2-4a4b522861e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.652 187212 DEBUG nova.compute.manager [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Refreshing instance network info cache due to event network-changed-009ccacf-51c2-430a-9da2-4a4b522861e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.652 187212 DEBUG oslo_concurrency.lockutils [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.653 187212 DEBUG oslo_concurrency.lockutils [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.653 187212 DEBUG nova.network.neutron [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Refreshing network info cache for port 009ccacf-51c2-430a-9da2-4a4b522861e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.654 187212 DEBUG nova.virt.libvirt.host [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.655 187212 DEBUG nova.virt.libvirt.host [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.663 187212 DEBUG nova.virt.libvirt.host [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.663 187212 DEBUG nova.virt.libvirt.host [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.664 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.664 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-12-05T12:04:28Z,direct_url=<?>,disk_format='qcow2',id=71fb11d9-b174-4d79-8db5-59ad67ae02ef,min_disk=1,min_ram=0,name='tempest-test-snap-348196468',owner='43e63f5c6b0f4840ad4df23fb5c10764',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-12-05T12:04:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.665 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.665 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.665 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.665 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.665 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.666 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.666 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.666 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.666 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.666 187212 DEBUG nova.virt.hardware [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.669 187212 DEBUG nova.virt.libvirt.vif [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-827375400',display_name='tempest-ImagesTestJSON-server-827375400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-827375400',id=50,image_ref='71fb11d9-b174-4d79-8db5-59ad67ae02ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-sg6vyxwo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='e5212ff3-c6ed-4f02-99c4-becad0e5f2a5',image_min_disk='1',image_min_ram='0',image_owner_id='43e63f5c6b0f4840ad4df23fb5c10764',image_owner_project_name='tempest-ImagesTestJSON-276789408',image_owner_user_name='tempest-ImagesTestJSON-276789408-project-member',image_user_id='a00ac4435e6647779ffaf4a5cde18fdb',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:41Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.669 187212 DEBUG nova.network.os_vif_util [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.670 187212 DEBUG nova.network.os_vif_util [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:cb:a5,bridge_name='br-int',has_traffic_filtering=True,id=009ccacf-51c2-430a-9da2-4a4b522861e6,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap009ccacf-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.671 187212 DEBUG nova.objects.instance [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.687 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <uuid>8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1</uuid>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <name>instance-00000032</name>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesTestJSON-server-827375400</nova:name>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:45</nova:creationTime>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:45 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:45 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:45 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:45 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:45 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:45 compute-0 nova_compute[187208]:         <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec 05 12:04:45 compute-0 nova_compute[187208]:         <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="71fb11d9-b174-4d79-8db5-59ad67ae02ef"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:45 compute-0 nova_compute[187208]:         <nova:port uuid="009ccacf-51c2-430a-9da2-4a4b522861e6">
Dec 05 12:04:45 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <entry name="serial">8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1</entry>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <entry name="uuid">8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1</entry>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk.config"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:20:cb:a5"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <target dev="tap009ccacf-51"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/console.log" append="off"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:45 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:45 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:45 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:45 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:45 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.687 187212 DEBUG nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Preparing to wait for external event network-vif-plugged-009ccacf-51c2-430a-9da2-4a4b522861e6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.687 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.687 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.687 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.688 187212 DEBUG nova.virt.libvirt.vif [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-827375400',display_name='tempest-ImagesTestJSON-server-827375400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-827375400',id=50,image_ref='71fb11d9-b174-4d79-8db5-59ad67ae02ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-sg6vyxwo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='e5212ff3-c6ed-4f02-99c4-becad0e5f2a5',image_min_disk='1',image_min_ram='0',image_owner_id='43e63f5c6b0f4840ad4df23fb5c10764',image_owner_project_name='tempest-ImagesTestJSON-276789408',image_owner_user_name='tempest-ImagesTestJSON-276789408-project-member',image_user_id='a00ac4435e6647779ffaf4a5cde18fdb',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:41Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.688 187212 DEBUG nova.network.os_vif_util [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.689 187212 DEBUG nova.network.os_vif_util [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:cb:a5,bridge_name='br-int',has_traffic_filtering=True,id=009ccacf-51c2-430a-9da2-4a4b522861e6,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap009ccacf-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.689 187212 DEBUG os_vif [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:cb:a5,bridge_name='br-int',has_traffic_filtering=True,id=009ccacf-51c2-430a-9da2-4a4b522861e6,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap009ccacf-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.689 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.690 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.690 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.692 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.692 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap009ccacf-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.693 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap009ccacf-51, col_values=(('external_ids', {'iface-id': '009ccacf-51c2-430a-9da2-4a4b522861e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:cb:a5', 'vm-uuid': '8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.730 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:45 compute-0 NetworkManager[55691]: <info>  [1764936285.7315] manager: (tap009ccacf-51): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.733 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.737 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.738 187212 INFO os_vif [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:cb:a5,bridge_name='br-int',has_traffic_filtering=True,id=009ccacf-51c2-430a-9da2-4a4b522861e6,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap009ccacf-51')
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.742 187212 DEBUG nova.compute.manager [req-2ccbb23f-f385-43da-a7be-17b2545c501b req-54154b6a-a906-4a60-b134-4a015b2de056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received event network-vif-deleted-539a9707-ef82-4c64-aec4-3759222680f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.798 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.798 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.798 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:20:cb:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:45 compute-0 nova_compute[187208]: 2025-12-05 12:04:45.799 187212 INFO nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Using config drive
Dec 05 12:04:46 compute-0 nova_compute[187208]: 2025-12-05 12:04:46.375 187212 DEBUG nova.network.neutron [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Successfully created port: c4a66ea2-9b1b-486a-a750-17072882c42e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:46 compute-0 nova_compute[187208]: 2025-12-05 12:04:46.621 187212 INFO nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Creating config drive at /var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk.config
Dec 05 12:04:46 compute-0 nova_compute[187208]: 2025-12-05 12:04:46.627 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq6s_q0mx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:46 compute-0 nova_compute[187208]: 2025-12-05 12:04:46.766 187212 DEBUG oslo_concurrency.processutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq6s_q0mx" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:46 compute-0 NetworkManager[55691]: <info>  [1764936286.8385] manager: (tap009ccacf-51): new Tun device (/org/freedesktop/NetworkManager/Devices/162)
Dec 05 12:04:46 compute-0 kernel: tap009ccacf-51: entered promiscuous mode
Dec 05 12:04:46 compute-0 nova_compute[187208]: 2025-12-05 12:04:46.856 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:46 compute-0 ovn_controller[95610]: 2025-12-05T12:04:46Z|00382|binding|INFO|Claiming lport 009ccacf-51c2-430a-9da2-4a4b522861e6 for this chassis.
Dec 05 12:04:46 compute-0 ovn_controller[95610]: 2025-12-05T12:04:46Z|00383|binding|INFO|009ccacf-51c2-430a-9da2-4a4b522861e6: Claiming fa:16:3e:20:cb:a5 10.100.0.10
Dec 05 12:04:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:46.871 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:cb:a5 10.100.0.10'], port_security=['fa:16:3e:20:cb:a5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=009ccacf-51c2-430a-9da2-4a4b522861e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:46.872 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 009ccacf-51c2-430a-9da2-4a4b522861e6 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis
Dec 05 12:04:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:46.875 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:04:46 compute-0 nova_compute[187208]: 2025-12-05 12:04:46.881 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:46 compute-0 ovn_controller[95610]: 2025-12-05T12:04:46Z|00384|binding|INFO|Setting lport 009ccacf-51c2-430a-9da2-4a4b522861e6 ovn-installed in OVS
Dec 05 12:04:46 compute-0 ovn_controller[95610]: 2025-12-05T12:04:46Z|00385|binding|INFO|Setting lport 009ccacf-51c2-430a-9da2-4a4b522861e6 up in Southbound
Dec 05 12:04:46 compute-0 systemd-udevd[223763]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:46 compute-0 nova_compute[187208]: 2025-12-05 12:04:46.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:46.892 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d6bed6e9-21d9-4f27-aa21-91f590796fb1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:46 compute-0 systemd-machined[153543]: New machine qemu-54-instance-00000032.
Dec 05 12:04:46 compute-0 NetworkManager[55691]: <info>  [1764936286.9042] device (tap009ccacf-51): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:46 compute-0 NetworkManager[55691]: <info>  [1764936286.9052] device (tap009ccacf-51): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:46 compute-0 systemd[1]: Started Virtual Machine qemu-54-instance-00000032.
Dec 05 12:04:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:46.935 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[da12670c-ec4e-4819-9a3a-4511d4df2383]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:46.939 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[52ca71ea-0563-44c7-b288-6db217b69f8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:46.976 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[92a0b7b1-0c26-41f0-9546-fade074c1a85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:46.992 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b525ec21-48e1-49b7-8d7b-c69666d696ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363907, 'reachable_time': 21906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223776, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.005 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed695a3b-ad15-493d-ad68-8591b454a245]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap41b3b495-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363920, 'tstamp': 363920}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223778, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap41b3b495-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363924, 'tstamp': 363924}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223778, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.007 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.008 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.013 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.013 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.014 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.014 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.209 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936287.2072084, 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.209 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] VM Started (Lifecycle Event)
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.232 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.239 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936287.208422, 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.239 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] VM Paused (Lifecycle Event)
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.260 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.265 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.302 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.420 187212 INFO nova.virt.libvirt.driver [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Snapshot image upload complete
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.421 187212 INFO nova.compute.manager [None req-63f865f8-9080-4fe8-b8cf-286f38bdc141 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Took 6.40 seconds to snapshot the instance on the hypervisor.
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.453 187212 DEBUG oslo_concurrency.lockutils [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.453 187212 DEBUG oslo_concurrency.lockutils [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.454 187212 DEBUG oslo_concurrency.lockutils [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.454 187212 DEBUG oslo_concurrency.lockutils [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.454 187212 DEBUG oslo_concurrency.lockutils [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.456 187212 INFO nova.compute.manager [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Terminating instance
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.457 187212 DEBUG nova.compute.manager [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:04:47 compute-0 kernel: tape4966b1a-19 (unregistering): left promiscuous mode
Dec 05 12:04:47 compute-0 NetworkManager[55691]: <info>  [1764936287.4760] device (tape4966b1a-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.486 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 ovn_controller[95610]: 2025-12-05T12:04:47Z|00386|binding|INFO|Releasing lport e4966b1a-1933-4c85-a2a0-6b5a788efd7a from this chassis (sb_readonly=0)
Dec 05 12:04:47 compute-0 ovn_controller[95610]: 2025-12-05T12:04:47Z|00387|binding|INFO|Setting lport e4966b1a-1933-4c85-a2a0-6b5a788efd7a down in Southbound
Dec 05 12:04:47 compute-0 ovn_controller[95610]: 2025-12-05T12:04:47Z|00388|binding|INFO|Removing iface tape4966b1a-19 ovn-installed in OVS
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.492 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.500 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:db:3c 10.100.0.6'], port_security=['fa:16:3e:2b:db:3c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '10048ac5-1fbc-45e6-aa94-01eff87b9ffc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e4966b1a-1933-4c85-a2a0-6b5a788efd7a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.501 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e4966b1a-1933-4c85-a2a0-6b5a788efd7a in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 unbound from our chassis
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.503 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.505 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.517 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f131872-f194-45eb-90f9-84128419c41f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:47 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Dec 05 12:04:47 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000002f.scope: Consumed 8.705s CPU time.
Dec 05 12:04:47 compute-0 systemd-machined[153543]: Machine qemu-51-instance-0000002f terminated.
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.532 187212 DEBUG nova.network.neutron [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Successfully updated port: c4a66ea2-9b1b-486a-a750-17072882c42e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.545 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-21873f07-a1da-4158-a5b2-1d44d547874e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.546 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-21873f07-a1da-4158-a5b2-1d44d547874e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.546 187212 DEBUG nova.network.neutron [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.550 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[524d05ab-ef5f-4a05-8894-51e1e9a51ae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.553 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed441f6-eaf9-4d5c-8b55-bcae427e70c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:47 compute-0 podman[223787]: 2025-12-05 12:04:47.57412448 +0000 UTC m=+0.072942148 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, 
tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.585 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.593 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b81b23a1-ac42-4cca-bb46-09c8e0b815c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.608 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[431a1810-54ba-4c33-97b4-cf48fe0bfa61]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 9, 'rx_bytes': 532, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 9, 'rx_bytes': 532, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365754, 'reachable_time': 25183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223812, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.626 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[424d2bdc-9329-41ec-9fad-7ca55cb6adf7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365765, 'tstamp': 365765}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223813, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365769, 'tstamp': 365769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223813, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.628 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.630 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.636 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dd8ae79-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.636 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.637 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2dd8ae79-a0, col_values=(('external_ids', {'iface-id': '96c6c9a6-c871-4fab-9fdc-eedbdd230979'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:47.637 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.642 187212 DEBUG nova.network.neutron [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Updated VIF entry in instance network info cache for port 009ccacf-51c2-430a-9da2-4a4b522861e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.643 187212 DEBUG nova.network.neutron [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Updating instance_info_cache with network_info: [{"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.658 187212 DEBUG oslo_concurrency.lockutils [req-4e737f51-ea59-4929-866c-204f7253d40b req-feaa50a6-8ce9-47b4-b43c-38675e814fcc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.678 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.683 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.720 187212 INFO nova.virt.libvirt.driver [-] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Instance destroyed successfully.
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.720 187212 DEBUG nova.objects.instance [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'resources' on Instance uuid 10048ac5-1fbc-45e6-aa94-01eff87b9ffc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.734 187212 DEBUG nova.virt.libvirt.vif [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-1',id=47,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:04:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:39Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=10048ac5-1fbc-45e6-aa94-01eff87b9ffc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.734 187212 DEBUG nova.network.os_vif_util [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "address": "fa:16:3e:2b:db:3c", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4966b1a-19", "ovs_interfaceid": "e4966b1a-1933-4c85-a2a0-6b5a788efd7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.736 187212 DEBUG nova.network.os_vif_util [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:db:3c,bridge_name='br-int',has_traffic_filtering=True,id=e4966b1a-1933-4c85-a2a0-6b5a788efd7a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4966b1a-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.736 187212 DEBUG os_vif [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:db:3c,bridge_name='br-int',has_traffic_filtering=True,id=e4966b1a-1933-4c85-a2a0-6b5a788efd7a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4966b1a-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.738 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.738 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape4966b1a-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.740 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.742 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.745 187212 INFO os_vif [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:db:3c,bridge_name='br-int',has_traffic_filtering=True,id=e4966b1a-1933-4c85-a2a0-6b5a788efd7a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4966b1a-19')
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.746 187212 INFO nova.virt.libvirt.driver [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Deleting instance files /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc_del
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.747 187212 INFO nova.virt.libvirt.driver [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Deletion of /var/lib/nova/instances/10048ac5-1fbc-45e6-aa94-01eff87b9ffc_del complete
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.818 187212 INFO nova.compute.manager [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.819 187212 DEBUG oslo.service.loopingcall [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.819 187212 DEBUG nova.compute.manager [-] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.820 187212 DEBUG nova.network.neutron [-] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:04:47 compute-0 nova_compute[187208]: 2025-12-05 12:04:47.853 187212 DEBUG nova.network.neutron [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.146 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.147 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.186 187212 DEBUG nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.265 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.266 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.275 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.276 187212 INFO nova.compute.claims [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.574 187212 DEBUG nova.compute.provider_tree [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.588 187212 DEBUG nova.scheduler.client.report [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.608 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.609 187212 DEBUG nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.674 187212 DEBUG nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.675 187212 DEBUG nova.network.neutron [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.696 187212 INFO nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.699 187212 DEBUG nova.network.neutron [-] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.717 187212 DEBUG nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.721 187212 INFO nova.compute.manager [-] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Took 0.90 seconds to deallocate network for instance.
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.753 187212 DEBUG nova.compute.manager [req-5b23238a-c32f-4911-8fc1-95655f0a6e85 req-0a781b78-c95d-4ef4-8306-9c57a211ba9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received event network-changed-c4a66ea2-9b1b-486a-a750-17072882c42e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.753 187212 DEBUG nova.compute.manager [req-5b23238a-c32f-4911-8fc1-95655f0a6e85 req-0a781b78-c95d-4ef4-8306-9c57a211ba9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Refreshing instance network info cache due to event network-changed-c4a66ea2-9b1b-486a-a750-17072882c42e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.754 187212 DEBUG oslo_concurrency.lockutils [req-5b23238a-c32f-4911-8fc1-95655f0a6e85 req-0a781b78-c95d-4ef4-8306-9c57a211ba9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-21873f07-a1da-4158-a5b2-1d44d547874e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.779 187212 DEBUG oslo_concurrency.lockutils [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.780 187212 DEBUG oslo_concurrency.lockutils [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.830 187212 DEBUG nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.832 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.832 187212 INFO nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Creating image(s)
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.833 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "/var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.833 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.836 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.853 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.896 187212 DEBUG nova.compute.manager [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Received event network-vif-plugged-009ccacf-51c2-430a-9da2-4a4b522861e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.897 187212 DEBUG oslo_concurrency.lockutils [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.897 187212 DEBUG oslo_concurrency.lockutils [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.898 187212 DEBUG oslo_concurrency.lockutils [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.898 187212 DEBUG nova.compute.manager [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Processing event network-vif-plugged-009ccacf-51c2-430a-9da2-4a4b522861e6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.898 187212 DEBUG nova.compute.manager [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Received event network-vif-plugged-009ccacf-51c2-430a-9da2-4a4b522861e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.898 187212 DEBUG oslo_concurrency.lockutils [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.898 187212 DEBUG oslo_concurrency.lockutils [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.899 187212 DEBUG oslo_concurrency.lockutils [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.899 187212 DEBUG nova.compute.manager [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] No waiting events found dispatching network-vif-plugged-009ccacf-51c2-430a-9da2-4a4b522861e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.899 187212 WARNING nova.compute.manager [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Received unexpected event network-vif-plugged-009ccacf-51c2-430a-9da2-4a4b522861e6 for instance with vm_state building and task_state spawning.
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.900 187212 DEBUG nova.compute.manager [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Received event network-vif-unplugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.900 187212 DEBUG oslo_concurrency.lockutils [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.900 187212 DEBUG oslo_concurrency.lockutils [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.900 187212 DEBUG oslo_concurrency.lockutils [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.901 187212 DEBUG nova.compute.manager [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] No waiting events found dispatching network-vif-unplugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.901 187212 WARNING nova.compute.manager [req-ef514d91-5c55-4d24-be69-cb911f176ce4 req-b76c1e32-96e4-45b0-8f06-c614dbc72e92 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Received unexpected event network-vif-unplugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a for instance with vm_state deleted and task_state None.
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.904 187212 DEBUG nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.909 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936288.9086835, 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.911 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] VM Resumed (Lifecycle Event)
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.917 187212 DEBUG nova.virt.libvirt.driver [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.926 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.927 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.928 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.940 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.976 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.980 187212 INFO nova.virt.libvirt.driver [-] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Instance spawned successfully.
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.981 187212 INFO nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Took 7.26 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.982 187212 DEBUG nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.985 187212 DEBUG nova.policy [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:04:48 compute-0 nova_compute[187208]: 2025-12-05 12:04:48.987 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.005 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.008 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.037 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.046 187212 INFO nova.compute.manager [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Took 8.02 seconds to build instance.
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.065 187212 DEBUG nova.compute.provider_tree [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.070 187212 DEBUG oslo_concurrency.lockutils [None req-3da31d22-d96d-4bca-ad1b-270b96797446 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.082 187212 DEBUG nova.scheduler.client.report [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.101 187212 DEBUG oslo_concurrency.lockutils [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.107 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk 1073741824" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.107 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.108 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.132 187212 INFO nova.scheduler.client.report [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Deleted allocations for instance 10048ac5-1fbc-45e6-aa94-01eff87b9ffc
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.174 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.175 187212 DEBUG nova.virt.disk.api [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Checking if we can resize image /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.175 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.239 187212 DEBUG oslo_concurrency.lockutils [None req-8da3f013-53eb-46da-a8df-fe70ac2cea0f d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.242 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.242 187212 DEBUG nova.virt.disk.api [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Cannot resize image /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.243 187212 DEBUG nova.objects.instance [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'migration_context' on Instance uuid 00262d23-bf60-44d9-a775-63ba32adaf96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.260 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.261 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Ensure instance console log exists: /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.262 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.262 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.263 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.272 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936274.2717924, 082d2145-1505-4170-9a11-4e46bf86fed2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.273 187212 INFO nova.compute.manager [-] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] VM Stopped (Lifecycle Event)
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.298 187212 DEBUG nova.compute.manager [None req-0bf0ac85-06e0-4535-856f-75ae65ef88e6 - - - - - -] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.549 187212 DEBUG nova.network.neutron [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Updating instance_info_cache with network_info: [{"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.567 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-21873f07-a1da-4158-a5b2-1d44d547874e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.568 187212 DEBUG nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Instance network_info: |[{"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.568 187212 DEBUG oslo_concurrency.lockutils [req-5b23238a-c32f-4911-8fc1-95655f0a6e85 req-0a781b78-c95d-4ef4-8306-9c57a211ba9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-21873f07-a1da-4158-a5b2-1d44d547874e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.568 187212 DEBUG nova.network.neutron [req-5b23238a-c32f-4911-8fc1-95655f0a6e85 req-0a781b78-c95d-4ef4-8306-9c57a211ba9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Refreshing network info cache for port c4a66ea2-9b1b-486a-a750-17072882c42e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.572 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Start _get_guest_xml network_info=[{"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.575 187212 WARNING nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.582 187212 DEBUG nova.virt.libvirt.host [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.583 187212 DEBUG nova.virt.libvirt.host [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.586 187212 DEBUG nova.virt.libvirt.host [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.587 187212 DEBUG nova.virt.libvirt.host [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.588 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.588 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.588 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.589 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.589 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.589 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.590 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.590 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.590 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.591 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.591 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.591 187212 DEBUG nova.virt.hardware [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.596 187212 DEBUG nova.virt.libvirt.vif [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1892322843',display_name='tempest-DeleteServersTestJSON-server-1892322843',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1892322843',id=51,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-javbhuzq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-5
54028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:44Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=21873f07-a1da-4158-a5b2-1d44d547874e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.596 187212 DEBUG nova.network.os_vif_util [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.597 187212 DEBUG nova.network.os_vif_util [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.598 187212 DEBUG nova.objects.instance [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid 21873f07-a1da-4158-a5b2-1d44d547874e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.615 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <uuid>21873f07-a1da-4158-a5b2-1d44d547874e</uuid>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <name>instance-00000033</name>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <nova:name>tempest-DeleteServersTestJSON-server-1892322843</nova:name>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:49</nova:creationTime>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:49 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:49 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:49 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:49 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:49 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:49 compute-0 nova_compute[187208]:         <nova:user uuid="ff425b7b04144f93a2c15e3a347fc15c">tempest-DeleteServersTestJSON-554028480-project-member</nova:user>
Dec 05 12:04:49 compute-0 nova_compute[187208]:         <nova:project uuid="4671f6c82ea049fab3a314ecf45b7656">tempest-DeleteServersTestJSON-554028480</nova:project>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:49 compute-0 nova_compute[187208]:         <nova:port uuid="c4a66ea2-9b1b-486a-a750-17072882c42e">
Dec 05 12:04:49 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <entry name="serial">21873f07-a1da-4158-a5b2-1d44d547874e</entry>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <entry name="uuid">21873f07-a1da-4158-a5b2-1d44d547874e</entry>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk.config"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:b7:70:07"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <target dev="tapc4a66ea2-9b"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/console.log" append="off"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:49 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:49 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:49 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:49 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:49 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.620 187212 DEBUG nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Preparing to wait for external event network-vif-plugged-c4a66ea2-9b1b-486a-a750-17072882c42e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.620 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.621 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.621 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.622 187212 DEBUG nova.virt.libvirt.vif [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1892322843',display_name='tempest-DeleteServersTestJSON-server-1892322843',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1892322843',id=51,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-javbhuzq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServers
TestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:44Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=21873f07-a1da-4158-a5b2-1d44d547874e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.622 187212 DEBUG nova.network.os_vif_util [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.623 187212 DEBUG nova.network.os_vif_util [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.624 187212 DEBUG os_vif [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.625 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.625 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.626 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.628 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.629 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4a66ea2-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.629 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc4a66ea2-9b, col_values=(('external_ids', {'iface-id': 'c4a66ea2-9b1b-486a-a750-17072882c42e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:70:07', 'vm-uuid': '21873f07-a1da-4158-a5b2-1d44d547874e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:49 compute-0 NetworkManager[55691]: <info>  [1764936289.6324] manager: (tapc4a66ea2-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.631 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.633 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.637 187212 INFO os_vif [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b')
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.693 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.694 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.694 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No VIF found with MAC fa:16:3e:b7:70:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:49 compute-0 nova_compute[187208]: 2025-12-05 12:04:49.695 187212 INFO nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Using config drive
Dec 05 12:04:50 compute-0 nova_compute[187208]: 2025-12-05 12:04:50.049 187212 DEBUG nova.network.neutron [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Successfully created port: ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:04:50 compute-0 nova_compute[187208]: 2025-12-05 12:04:50.472 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936275.4712782, 58c3288f-57bf-4c62-8d69-9842a22e43d6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:50 compute-0 nova_compute[187208]: 2025-12-05 12:04:50.472 187212 INFO nova.compute.manager [-] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] VM Stopped (Lifecycle Event)
Dec 05 12:04:50 compute-0 nova_compute[187208]: 2025-12-05 12:04:50.490 187212 DEBUG nova.compute.manager [None req-ac514a55-9ee0-4eaa-9a31-fb40137f208a - - - - - -] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:50 compute-0 nova_compute[187208]: 2025-12-05 12:04:50.945 187212 INFO nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Creating config drive at /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk.config
Dec 05 12:04:50 compute-0 nova_compute[187208]: 2025-12-05 12:04:50.950 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj4dimltp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.081 187212 DEBUG oslo_concurrency.processutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj4dimltp" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.119 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:51 compute-0 kernel: tapc4a66ea2-9b: entered promiscuous mode
Dec 05 12:04:51 compute-0 NetworkManager[55691]: <info>  [1764936291.1439] manager: (tapc4a66ea2-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/164)
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.146 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:51 compute-0 ovn_controller[95610]: 2025-12-05T12:04:51Z|00389|binding|INFO|Claiming lport c4a66ea2-9b1b-486a-a750-17072882c42e for this chassis.
Dec 05 12:04:51 compute-0 ovn_controller[95610]: 2025-12-05T12:04:51Z|00390|binding|INFO|c4a66ea2-9b1b-486a-a750-17072882c42e: Claiming fa:16:3e:b7:70:07 10.100.0.9
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.156 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:70:07 10.100.0.9'], port_security=['fa:16:3e:b7:70:07 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '21873f07-a1da-4158-a5b2-1d44d547874e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c4a66ea2-9b1b-486a-a750-17072882c42e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.157 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c4a66ea2-9b1b-486a-a750-17072882c42e in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 bound to our chassis
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.159 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.173 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc7015d-3195-44fe-af9c-5e8334dfcf70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.174 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7360f84-b1 in ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.175 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7360f84-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.175 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eaf68931-ee2a-42ce-834d-de254f2e45ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.176 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac5aef4-56b2-4424-af16-a1576dcdf75f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.187 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[58a8130c-6cb0-43fd-9ea0-844d762fd96c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 systemd-machined[153543]: New machine qemu-55-instance-00000033.
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.210 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:51 compute-0 ovn_controller[95610]: 2025-12-05T12:04:51Z|00391|binding|INFO|Setting lport c4a66ea2-9b1b-486a-a750-17072882c42e ovn-installed in OVS
Dec 05 12:04:51 compute-0 ovn_controller[95610]: 2025-12-05T12:04:51Z|00392|binding|INFO|Setting lport c4a66ea2-9b1b-486a-a750-17072882c42e up in Southbound
Dec 05 12:04:51 compute-0 systemd[1]: Started Virtual Machine qemu-55-instance-00000033.
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.214 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b52cf983-750b-43ae-8ef8-c50929555ddb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 systemd-udevd[223870]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:51 compute-0 NetworkManager[55691]: <info>  [1764936291.2321] device (tapc4a66ea2-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:51 compute-0 NetworkManager[55691]: <info>  [1764936291.2330] device (tapc4a66ea2-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.246 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2e530b-0452-4c4f-b1be-6b71b53b2aff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 NetworkManager[55691]: <info>  [1764936291.2539] manager: (tapd7360f84-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/165)
Dec 05 12:04:51 compute-0 systemd-udevd[223876]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.253 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5fcaeac2-b0dc-4b09-972e-db0ec9072fb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.294 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6009d8-ccbf-4089-bcef-54eeb5073097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.297 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cae65d33-0e2b-4aa5-a638-7ff945a92e8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 NetworkManager[55691]: <info>  [1764936291.3224] device (tapd7360f84-b0): carrier: link connected
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.327 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fc577cf3-6c8e-48f3-a3aa-0eec8416a577]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.346 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[39c8baf1-94bf-4ada-adf0-299a9d34c771]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367188, 'reachable_time': 27580, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223900, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.364 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6f996f10-7790-4020-a058-54cdff00c526]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:2b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 367188, 'tstamp': 367188}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223901, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.381 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2aea1502-0ac7-41d6-ba92-a55e0e5ee662]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367188, 'reachable_time': 27580, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223902, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.413 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b7ab928-55b1-4678-a39a-2798ad09e2ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.481 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62358e10-a5db-44a8-90f1-808710cd686d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.482 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.483 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.483 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7360f84-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:51 compute-0 NetworkManager[55691]: <info>  [1764936291.4859] manager: (tapd7360f84-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.486 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:51 compute-0 kernel: tapd7360f84-b0: entered promiscuous mode
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.490 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.493 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7360f84-b0, col_values=(('external_ids', {'iface-id': 'd85bc323-c3ce-47e3-ac1f-d5f27467a4e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:51 compute-0 ovn_controller[95610]: 2025-12-05T12:04:51Z|00393|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.499 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.501 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f800d7d1-db6b-476b-9caa-034ca10315d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.502 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:04:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:51.502 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'env', 'PROCESS_TAG=haproxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.508 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.779 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936291.7789838, 21873f07-a1da-4158-a5b2-1d44d547874e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.780 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] VM Started (Lifecycle Event)
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.811 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.818 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936291.779221, 21873f07-a1da-4158-a5b2-1d44d547874e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.818 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] VM Paused (Lifecycle Event)
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.843 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.847 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.864 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.965 187212 DEBUG nova.network.neutron [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Successfully updated port: ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.980 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "refresh_cache-00262d23-bf60-44d9-a775-63ba32adaf96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.981 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquired lock "refresh_cache-00262d23-bf60-44d9-a775-63ba32adaf96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:51 compute-0 nova_compute[187208]: 2025-12-05 12:04:51.981 187212 DEBUG nova.network.neutron [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:04:52 compute-0 podman[223939]: 2025-12-05 12:04:52.091128527 +0000 UTC m=+0.058165953 container create 58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.102 187212 DEBUG nova.compute.manager [req-cb9d1bf2-b62e-47d2-b5ed-af947b4d8e4a req-4a2f4dc5-21a4-46d9-83f2-65f236b0a1f7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Received event network-vif-deleted-e4966b1a-1933-4c85-a2a0-6b5a788efd7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.132 187212 DEBUG nova.network.neutron [req-5b23238a-c32f-4911-8fc1-95655f0a6e85 req-0a781b78-c95d-4ef4-8306-9c57a211ba9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Updated VIF entry in instance network info cache for port c4a66ea2-9b1b-486a-a750-17072882c42e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.133 187212 DEBUG nova.network.neutron [req-5b23238a-c32f-4911-8fc1-95655f0a6e85 req-0a781b78-c95d-4ef4-8306-9c57a211ba9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Updating instance_info_cache with network_info: [{"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:52 compute-0 podman[223939]: 2025-12-05 12:04:52.054350139 +0000 UTC m=+0.021387585 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.182 187212 DEBUG oslo_concurrency.lockutils [req-5b23238a-c32f-4911-8fc1-95655f0a6e85 req-0a781b78-c95d-4ef4-8306-9c57a211ba9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-21873f07-a1da-4158-a5b2-1d44d547874e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.183 187212 DEBUG nova.network.neutron [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.191 187212 DEBUG nova.compute.manager [req-c9ff5958-740e-4170-a2db-7179bcdf1abe req-43956b7d-22ae-46bb-99d9-9a29cb1e7a72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Received event network-vif-plugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.192 187212 DEBUG oslo_concurrency.lockutils [req-c9ff5958-740e-4170-a2db-7179bcdf1abe req-43956b7d-22ae-46bb-99d9-9a29cb1e7a72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.192 187212 DEBUG oslo_concurrency.lockutils [req-c9ff5958-740e-4170-a2db-7179bcdf1abe req-43956b7d-22ae-46bb-99d9-9a29cb1e7a72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.192 187212 DEBUG oslo_concurrency.lockutils [req-c9ff5958-740e-4170-a2db-7179bcdf1abe req-43956b7d-22ae-46bb-99d9-9a29cb1e7a72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "10048ac5-1fbc-45e6-aa94-01eff87b9ffc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.192 187212 DEBUG nova.compute.manager [req-c9ff5958-740e-4170-a2db-7179bcdf1abe req-43956b7d-22ae-46bb-99d9-9a29cb1e7a72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] No waiting events found dispatching network-vif-plugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.192 187212 WARNING nova.compute.manager [req-c9ff5958-740e-4170-a2db-7179bcdf1abe req-43956b7d-22ae-46bb-99d9-9a29cb1e7a72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Received unexpected event network-vif-plugged-e4966b1a-1933-4c85-a2a0-6b5a788efd7a for instance with vm_state deleted and task_state None.
Dec 05 12:04:52 compute-0 systemd[1]: Started libpod-conmon-58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02.scope.
Dec 05 12:04:52 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:04:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80661869de99bb41bbf0b1d44f502ba85da6c00c2f0b6e9a79286957b69b3345/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:04:52 compute-0 podman[223939]: 2025-12-05 12:04:52.239463381 +0000 UTC m=+0.206500837 container init 58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:04:52 compute-0 podman[223939]: 2025-12-05 12:04:52.245595847 +0000 UTC m=+0.212633273 container start 58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:04:52 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[223952]: [NOTICE]   (223956) : New worker (223958) forked
Dec 05 12:04:52 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[223952]: [NOTICE]   (223956) : Loading success.
Dec 05 12:04:52 compute-0 nova_compute[187208]: 2025-12-05 12:04:52.587 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.160 187212 DEBUG oslo_concurrency.lockutils [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.160 187212 DEBUG oslo_concurrency.lockutils [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.160 187212 DEBUG oslo_concurrency.lockutils [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.161 187212 DEBUG oslo_concurrency.lockutils [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.161 187212 DEBUG oslo_concurrency.lockutils [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.162 187212 INFO nova.compute.manager [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Terminating instance
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.164 187212 DEBUG nova.compute.manager [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:04:53 compute-0 kernel: tap009ccacf-51 (unregistering): left promiscuous mode
Dec 05 12:04:53 compute-0 NetworkManager[55691]: <info>  [1764936293.1859] device (tap009ccacf-51): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:04:53 compute-0 ovn_controller[95610]: 2025-12-05T12:04:53Z|00394|binding|INFO|Releasing lport 009ccacf-51c2-430a-9da2-4a4b522861e6 from this chassis (sb_readonly=0)
Dec 05 12:04:53 compute-0 ovn_controller[95610]: 2025-12-05T12:04:53Z|00395|binding|INFO|Setting lport 009ccacf-51c2-430a-9da2-4a4b522861e6 down in Southbound
Dec 05 12:04:53 compute-0 ovn_controller[95610]: 2025-12-05T12:04:53Z|00396|binding|INFO|Removing iface tap009ccacf-51 ovn-installed in OVS
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.197 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.204 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:cb:a5 10.100.0.10'], port_security=['fa:16:3e:20:cb:a5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=009ccacf-51c2-430a-9da2-4a4b522861e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.205 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 009ccacf-51c2-430a-9da2-4a4b522861e6 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.208 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.212 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000032.scope: Deactivated successfully.
Dec 05 12:04:53 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000032.scope: Consumed 4.359s CPU time.
Dec 05 12:04:53 compute-0 systemd-machined[153543]: Machine qemu-54-instance-00000032 terminated.
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.227 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4ee7321f-cef4-4703-92b9-3f472a5a82d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.279 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[66ffa7d2-9e87-417d-8442-c56af60eddaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.283 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[72dafe0e-07d4-418f-af06-2f44d3d6716b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.311 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f018823c-732d-4a8e-acc0-0b1dfe42c7d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.343 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c13dcc75-c2c0-4e48-bcf8-51d7fb302a45]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363907, 'reachable_time': 21906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223976, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.361 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8eb0751-ed4c-4be9-837e-64806aeef4f5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap41b3b495-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363920, 'tstamp': 363920}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223977, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap41b3b495-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363924, 'tstamp': 363924}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223977, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.363 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.366 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.370 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.370 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.370 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.371 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.371 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:53 compute-0 kernel: tap009ccacf-51: entered promiscuous mode
Dec 05 12:04:53 compute-0 systemd-udevd[223897]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:53 compute-0 NetworkManager[55691]: <info>  [1764936293.3832] manager: (tap009ccacf-51): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Dec 05 12:04:53 compute-0 kernel: tap009ccacf-51 (unregistering): left promiscuous mode
Dec 05 12:04:53 compute-0 ovn_controller[95610]: 2025-12-05T12:04:53Z|00397|binding|INFO|Claiming lport 009ccacf-51c2-430a-9da2-4a4b522861e6 for this chassis.
Dec 05 12:04:53 compute-0 ovn_controller[95610]: 2025-12-05T12:04:53Z|00398|binding|INFO|009ccacf-51c2-430a-9da2-4a4b522861e6: Claiming fa:16:3e:20:cb:a5 10.100.0.10
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.427 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 ovn_controller[95610]: 2025-12-05T12:04:53Z|00399|binding|INFO|Setting lport 009ccacf-51c2-430a-9da2-4a4b522861e6 ovn-installed in OVS
Dec 05 12:04:53 compute-0 ovn_controller[95610]: 2025-12-05T12:04:53Z|00400|if_status|INFO|Dropped 2 log messages in last 55 seconds (most recently, 55 seconds ago) due to excessive rate
Dec 05 12:04:53 compute-0 ovn_controller[95610]: 2025-12-05T12:04:53Z|00401|if_status|INFO|Not setting lport 009ccacf-51c2-430a-9da2-4a4b522861e6 down as sb is readonly
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.448 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 ovn_controller[95610]: 2025-12-05T12:04:53Z|00402|binding|INFO|Releasing lport 009ccacf-51c2-430a-9da2-4a4b522861e6 from this chassis (sb_readonly=0)
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.455 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:cb:a5 10.100.0.10'], port_security=['fa:16:3e:20:cb:a5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=009ccacf-51c2-430a-9da2-4a4b522861e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.456 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 009ccacf-51c2-430a-9da2-4a4b522861e6 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.459 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.460 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:cb:a5 10.100.0.10'], port_security=['fa:16:3e:20:cb:a5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=009ccacf-51c2-430a-9da2-4a4b522861e6) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.474 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cee73588-88e2-448f-beb7-b35645ce6194]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.488 187212 INFO nova.virt.libvirt.driver [-] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Instance destroyed successfully.
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.488 187212 DEBUG nova.objects.instance [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.509 187212 DEBUG nova.virt.libvirt.vif [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-827375400',display_name='tempest-ImagesTestJSON-server-827375400',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-827375400',id=50,image_ref='71fb11d9-b174-4d79-8db5-59ad67ae02ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:04:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-sg6vyxwo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='e5212ff3-c6ed-4f02-99c4-becad0e5f2a5',image_min_disk='1',image_min_ram='0',image_owner_id='43e63f5c6b0f4840ad4df23fb5c10764',image_owner_project_name='tempest-ImagesTestJSON-276789408',image_owner_user_name='tempest-ImagesTestJSON-276789408-project-member',image_user_id='a00ac4435e6647779ffaf4a5cde18fdb',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:49Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.509 187212 DEBUG nova.network.os_vif_util [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "009ccacf-51c2-430a-9da2-4a4b522861e6", "address": "fa:16:3e:20:cb:a5", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap009ccacf-51", "ovs_interfaceid": "009ccacf-51c2-430a-9da2-4a4b522861e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.507 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f78bc6-54cd-4555-9efb-fb5ce4c5eb25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.510 187212 DEBUG nova.network.os_vif_util [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:cb:a5,bridge_name='br-int',has_traffic_filtering=True,id=009ccacf-51c2-430a-9da2-4a4b522861e6,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap009ccacf-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.510 187212 DEBUG os_vif [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:cb:a5,bridge_name='br-int',has_traffic_filtering=True,id=009ccacf-51c2-430a-9da2-4a4b522861e6,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap009ccacf-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.512 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[803e0083-320f-46bc-855a-d8d3fc55740d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.513 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.513 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap009ccacf-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.517 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.520 187212 INFO os_vif [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:cb:a5,bridge_name='br-int',has_traffic_filtering=True,id=009ccacf-51c2-430a-9da2-4a4b522861e6,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap009ccacf-51')
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.521 187212 INFO nova.virt.libvirt.driver [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Deleting instance files /var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1_del
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.522 187212 INFO nova.virt.libvirt.driver [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Deletion of /var/lib/nova/instances/8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1_del complete
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.542 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fad2fec6-f97c-4aeb-8c33-bcd338e6a877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.558 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b77f65df-82b7-4950-96b3-81b499bebeb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363907, 'reachable_time': 21906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223994, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.564 187212 INFO nova.compute.manager [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.565 187212 DEBUG oslo.service.loopingcall [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.565 187212 DEBUG nova.compute.manager [-] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.565 187212 DEBUG nova.network.neutron [-] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.582 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fd04b4e0-10ed-4b60-8fab-4a70aa6563f0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap41b3b495-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363920, 'tstamp': 363920}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223995, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap41b3b495-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363924, 'tstamp': 363924}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223995, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.583 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.586 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.586 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.586 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.587 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.587 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.588 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 009ccacf-51c2-430a-9da2-4a4b522861e6 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.590 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.604 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a0cdf6a5-5943-411d-bc25-f704a01261a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.629 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[535c1711-5799-49c6-b909-e623520e0370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.632 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4e55d57e-2c42-42ab-a27a-6e596666a4d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.662 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[00e0cf5f-0eeb-4267-8c81-f70e26670191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.677 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f415f2-f8f5-4d05-9844-6d1afe571d62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363907, 'reachable_time': 21906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224001, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.690 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[df670393-916f-4c69-9732-19795e4d36ec]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap41b3b495-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363920, 'tstamp': 363920}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224002, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap41b3b495-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363924, 'tstamp': 363924}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224002, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.692 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.695 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.695 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.696 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.696 187212 DEBUG nova.compute.manager [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:53.696 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.745 187212 INFO nova.compute.manager [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] instance snapshotting
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.968 187212 DEBUG nova.network.neutron [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Updating instance_info_cache with network_info: [{"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.988 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Releasing lock "refresh_cache-00262d23-bf60-44d9-a775-63ba32adaf96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.988 187212 DEBUG nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Instance network_info: |[{"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.990 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Start _get_guest_xml network_info=[{"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:04:53 compute-0 nova_compute[187208]: 2025-12-05 12:04:53.994 187212 WARNING nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.001 187212 DEBUG nova.virt.libvirt.host [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.001 187212 DEBUG nova.virt.libvirt.host [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.004 187212 DEBUG nova.virt.libvirt.host [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.005 187212 DEBUG nova.virt.libvirt.host [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.005 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.006 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.006 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.006 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.007 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.007 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.007 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.007 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.008 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.008 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.008 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.009 187212 DEBUG nova.virt.hardware [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.013 187212 DEBUG nova.virt.libvirt.vif [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1524934604',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1524934604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1524934604',id=52,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-z07pcike',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:48Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=00262d23-bf60-44d9-a775-63ba32adaf96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.014 187212 DEBUG nova.network.os_vif_util [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.014 187212 DEBUG nova.network.os_vif_util [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:a1,bridge_name='br-int',has_traffic_filtering=True,id=ff2850e9-aaf4-4f4e-a323-24b258a0b4c7,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2850e9-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.016 187212 DEBUG nova.objects.instance [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 00262d23-bf60-44d9-a775-63ba32adaf96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.030 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <uuid>00262d23-bf60-44d9-a775-63ba32adaf96</uuid>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <name>instance-00000034</name>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1524934604</nova:name>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:04:53</nova:creationTime>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:04:54 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:04:54 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:04:54 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:04:54 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:04:54 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:04:54 compute-0 nova_compute[187208]:         <nova:user uuid="3ee170bdfdd343189ee1da01bdb80be6">tempest-ImagesOneServerNegativeTestJSON-661137252-project-member</nova:user>
Dec 05 12:04:54 compute-0 nova_compute[187208]:         <nova:project uuid="79895287bd1d488c842f6013729a1f81">tempest-ImagesOneServerNegativeTestJSON-661137252</nova:project>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:04:54 compute-0 nova_compute[187208]:         <nova:port uuid="ff2850e9-aaf4-4f4e-a323-24b258a0b4c7">
Dec 05 12:04:54 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <system>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <entry name="serial">00262d23-bf60-44d9-a775-63ba32adaf96</entry>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <entry name="uuid">00262d23-bf60-44d9-a775-63ba32adaf96</entry>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     </system>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <os>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   </os>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <features>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   </features>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk.config"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:f2:70:a1"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <target dev="tapff2850e9-aa"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/console.log" append="off"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <video>
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     </video>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:04:54 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:04:54 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:04:54 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:04:54 compute-0 nova_compute[187208]: </domain>
Dec 05 12:04:54 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.032 187212 DEBUG nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Preparing to wait for external event network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.032 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.032 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.032 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.033 187212 DEBUG nova.virt.libvirt.vif [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1524934604',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1524934604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1524934604',id=52,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-z07pcike',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:48Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=00262d23-bf60-44d9-a775-63ba32adaf96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.033 187212 DEBUG nova.network.os_vif_util [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.034 187212 DEBUG nova.network.os_vif_util [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:a1,bridge_name='br-int',has_traffic_filtering=True,id=ff2850e9-aaf4-4f4e-a323-24b258a0b4c7,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2850e9-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.034 187212 DEBUG os_vif [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:a1,bridge_name='br-int',has_traffic_filtering=True,id=ff2850e9-aaf4-4f4e-a323-24b258a0b4c7,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2850e9-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.035 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.035 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.035 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.038 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.038 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff2850e9-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.039 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapff2850e9-aa, col_values=(('external_ids', {'iface-id': 'ff2850e9-aaf4-4f4e-a323-24b258a0b4c7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:70:a1', 'vm-uuid': '00262d23-bf60-44d9-a775-63ba32adaf96'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:54 compute-0 NetworkManager[55691]: <info>  [1764936294.0411] manager: (tapff2850e9-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.040 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.043 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.044 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.045 187212 INFO os_vif [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:a1,bridge_name='br-int',has_traffic_filtering=True,id=ff2850e9-aaf4-4f4e-a323-24b258a0b4c7,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2850e9-aa')
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.088 187212 INFO nova.virt.libvirt.driver [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Beginning live snapshot process
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.103 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.104 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.104 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No VIF found with MAC fa:16:3e:f2:70:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.104 187212 INFO nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Using config drive
Dec 05 12:04:54 compute-0 podman[224005]: 2025-12-05 12:04:54.141082789 +0000 UTC m=+0.063430885 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=edpm, distribution-scope=public, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 12:04:54 compute-0 podman[224006]: 2025-12-05 12:04:54.157245923 +0000 UTC m=+0.080235197 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 12:04:54 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.253 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.319 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.320 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.378 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.390 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.454 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.455 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpcfnwjd03/022f126783a5493cb0daa9d9eb565f3e.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.541 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936279.5404103, a7616662-639b-4642-b507-614773f4748f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.542 187212 INFO nova.compute.manager [-] [instance: a7616662-639b-4642-b507-614773f4748f] VM Stopped (Lifecycle Event)
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.837 187212 DEBUG nova.compute.manager [None req-c76480d8-1982-4df7-a403-452b9642e293 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.840 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpcfnwjd03/022f126783a5493cb0daa9d9eb565f3e.delta 1073741824" returned: 0 in 0.385s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.841 187212 INFO nova.virt.libvirt.driver [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.863 187212 DEBUG nova.network.neutron [-] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.886 187212 INFO nova.compute.manager [-] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Took 1.32 seconds to deallocate network for instance.
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.903 187212 DEBUG nova.virt.libvirt.guest [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] COPY block job progress, current cursor: 0 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.919 187212 INFO nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Creating config drive at /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk.config
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.926 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe0lp3x87 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.948 187212 DEBUG oslo_concurrency.lockutils [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:54 compute-0 nova_compute[187208]: 2025-12-05 12:04:54.949 187212 DEBUG oslo_concurrency.lockutils [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.054 187212 DEBUG oslo_concurrency.processutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe0lp3x87" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:55 compute-0 kernel: tapff2850e9-aa: entered promiscuous mode
Dec 05 12:04:55 compute-0 NetworkManager[55691]: <info>  [1764936295.1192] manager: (tapff2850e9-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/169)
Dec 05 12:04:55 compute-0 ovn_controller[95610]: 2025-12-05T12:04:55Z|00403|binding|INFO|Claiming lport ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 for this chassis.
Dec 05 12:04:55 compute-0 ovn_controller[95610]: 2025-12-05T12:04:55Z|00404|binding|INFO|ff2850e9-aaf4-4f4e-a323-24b258a0b4c7: Claiming fa:16:3e:f2:70:a1 10.100.0.12
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.131 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.135 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.140 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:70:a1 10.100.0.12'], port_security=['fa:16:3e:f2:70:a1 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '00262d23-bf60-44d9-a775-63ba32adaf96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d064000-316c-46a7-a23c-1dc26318b6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79895287bd1d488c842f6013729a1f81', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e1ec2415-6840-4cf9-b5ac-efaf1a9c9a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3804b014-203a-4c47-b0bb-7634579c4ec4, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ff2850e9-aaf4-4f4e-a323-24b258a0b4c7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.142 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 in datapath 5d064000-316c-46a7-a23c-1dc26318b6a4 bound to our chassis
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.145 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5d064000-316c-46a7-a23c-1dc26318b6a4
Dec 05 12:04:55 compute-0 systemd-udevd[224106]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:55 compute-0 ovn_controller[95610]: 2025-12-05T12:04:55Z|00405|binding|INFO|Setting lport ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 up in Southbound
Dec 05 12:04:55 compute-0 ovn_controller[95610]: 2025-12-05T12:04:55Z|00406|binding|INFO|Setting lport ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 ovn-installed in OVS
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.157 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:55 compute-0 NetworkManager[55691]: <info>  [1764936295.1649] device (tapff2850e9-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:04:55 compute-0 NetworkManager[55691]: <info>  [1764936295.1659] device (tapff2850e9-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.166 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bce5aba5-2329-43a7-b09a-b13709583cd0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.167 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5d064000-31 in ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.170 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5d064000-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a69ddd69-9208-4345-89f8-bd405262f695]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.171 187212 DEBUG nova.compute.provider_tree [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.171 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea24dcb-d157-4967-8ad3-df279d893e18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 systemd-machined[153543]: New machine qemu-56-instance-00000034.
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.184 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[3cfc2268-6693-4790-8cf2-2b06eed823b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 systemd[1]: Started Virtual Machine qemu-56-instance-00000034.
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.188 187212 DEBUG nova.scheduler.client.report [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.199 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f1644fa1-6f3e-4ecf-8062-c133e8b3d3be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.211 187212 DEBUG oslo_concurrency.lockutils [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.232 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[45553ddf-3d41-4724-89b9-7dd25c4633a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.241 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[615b8f68-03f4-47fa-8ca1-cc7397589e51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 NetworkManager[55691]: <info>  [1764936295.2427] manager: (tap5d064000-30): new Veth device (/org/freedesktop/NetworkManager/Devices/170)
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.242 187212 INFO nova.scheduler.client.report [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1
Dec 05 12:04:55 compute-0 systemd-udevd[224110]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.288 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cf11d0ff-78f4-44be-b06b-b6c65702c42d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.292 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6b205560-421d-4444-baad-5ce131f05fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.317 187212 DEBUG oslo_concurrency.lockutils [None req-cee939a6-d95f-4bf8-895a-b0e55034bdaa a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:55 compute-0 NetworkManager[55691]: <info>  [1764936295.3201] device (tap5d064000-30): carrier: link connected
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.329 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[18e141bd-16fe-4c11-9ffd-dfe3d57a4e9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.346 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d310ce48-cf78-45f8-99f6-5c6e5449dabd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367588, 'reachable_time': 16834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224141, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.361 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9270c997-794a-480b-a8d1-a65ad16f10d3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:6d24'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 367588, 'tstamp': 367588}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224142, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.377 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7011d7fb-e536-4061-b8de-54cd006783d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367588, 'reachable_time': 16834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224143, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.406 187212 DEBUG nova.virt.libvirt.guest [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] COPY block job progress, current cursor: 75366400 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.408 187212 INFO nova.virt.libvirt.driver [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.423 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[977bdc4b-907a-4c13-96f7-2470f233a37c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.444 187212 DEBUG nova.privsep.utils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.445 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpcfnwjd03/022f126783a5493cb0daa9d9eb565f3e.delta /var/lib/nova/instances/snapshots/tmpcfnwjd03/022f126783a5493cb0daa9d9eb565f3e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.493 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[04f088c6-3f21-49da-95d9-259fe5543ae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.494 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d064000-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d064000-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.497 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:55 compute-0 NetworkManager[55691]: <info>  [1764936295.4980] manager: (tap5d064000-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Dec 05 12:04:55 compute-0 kernel: tap5d064000-30: entered promiscuous mode
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.501 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5d064000-30, col_values=(('external_ids', {'iface-id': '1b49f23e-d835-4ef5-82b9-a339d97fd4cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.504 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:55 compute-0 ovn_controller[95610]: 2025-12-05T12:04:55Z|00407|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.505 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.506 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.507 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e1683628-09e3-4bd5-8c96-b7da4139a5c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.508 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-5d064000-316c-46a7-a23c-1dc26318b6a4
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 5d064000-316c-46a7-a23c-1dc26318b6a4
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:04:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:55.510 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'env', 'PROCESS_TAG=haproxy-5d064000-316c-46a7-a23c-1dc26318b6a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5d064000-316c-46a7-a23c-1dc26318b6a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.518 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.556 187212 DEBUG nova.compute.manager [req-c157b05b-2f2d-4880-b3cc-0926edd53963 req-6325e6ef-8a04-40cd-94f3-1ed60b34d498 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received event network-changed-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.556 187212 DEBUG nova.compute.manager [req-c157b05b-2f2d-4880-b3cc-0926edd53963 req-6325e6ef-8a04-40cd-94f3-1ed60b34d498 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Refreshing instance network info cache due to event network-changed-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.557 187212 DEBUG oslo_concurrency.lockutils [req-c157b05b-2f2d-4880-b3cc-0926edd53963 req-6325e6ef-8a04-40cd-94f3-1ed60b34d498 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-00262d23-bf60-44d9-a775-63ba32adaf96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.558 187212 DEBUG oslo_concurrency.lockutils [req-c157b05b-2f2d-4880-b3cc-0926edd53963 req-6325e6ef-8a04-40cd-94f3-1ed60b34d498 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-00262d23-bf60-44d9-a775-63ba32adaf96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.558 187212 DEBUG nova.network.neutron [req-c157b05b-2f2d-4880-b3cc-0926edd53963 req-6325e6ef-8a04-40cd-94f3-1ed60b34d498 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Refreshing network info cache for port ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.572 187212 DEBUG nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received event network-vif-plugged-c4a66ea2-9b1b-486a-a750-17072882c42e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.572 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.572 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.573 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.573 187212 DEBUG nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Processing event network-vif-plugged-c4a66ea2-9b1b-486a-a750-17072882c42e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.573 187212 DEBUG nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received event network-vif-plugged-c4a66ea2-9b1b-486a-a750-17072882c42e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.573 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.573 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.573 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.574 187212 DEBUG nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] No waiting events found dispatching network-vif-plugged-c4a66ea2-9b1b-486a-a750-17072882c42e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.574 187212 WARNING nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received unexpected event network-vif-plugged-c4a66ea2-9b1b-486a-a750-17072882c42e for instance with vm_state building and task_state spawning.
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.574 187212 DEBUG nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Received event network-vif-unplugged-009ccacf-51c2-430a-9da2-4a4b522861e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.574 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.574 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.575 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.575 187212 DEBUG nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] No waiting events found dispatching network-vif-unplugged-009ccacf-51c2-430a-9da2-4a4b522861e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.575 187212 WARNING nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Received unexpected event network-vif-unplugged-009ccacf-51c2-430a-9da2-4a4b522861e6 for instance with vm_state deleted and task_state None.
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.575 187212 DEBUG nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Received event network-vif-plugged-009ccacf-51c2-430a-9da2-4a4b522861e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.575 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.575 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.575 187212 DEBUG oslo_concurrency.lockutils [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.576 187212 DEBUG nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] No waiting events found dispatching network-vif-plugged-009ccacf-51c2-430a-9da2-4a4b522861e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.576 187212 WARNING nova.compute.manager [req-635e6c03-fa03-4da9-8aa8-17f8abc6d9f1 req-9ab3df0a-6bd7-496d-a8a1-79e5000ccda3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Received unexpected event network-vif-plugged-009ccacf-51c2-430a-9da2-4a4b522861e6 for instance with vm_state deleted and task_state None.
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.577 187212 DEBUG nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.593 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936295.5922248, 21873f07-a1da-4158-a5b2-1d44d547874e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.593 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] VM Resumed (Lifecycle Event)
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.596 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.602 187212 INFO nova.virt.libvirt.driver [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Instance spawned successfully.
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.603 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.628 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.634 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.634 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.635 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.635 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.635 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.636 187212 DEBUG nova.virt.libvirt.driver [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.640 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.674 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.708 187212 INFO nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Took 10.92 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.709 187212 DEBUG nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.774 187212 INFO nova.compute.manager [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Took 11.60 seconds to build instance.
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.794 187212 DEBUG oslo_concurrency.lockutils [None req-67a9121a-5a9d-4ead-960c-ce53e3c817f1 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:55 compute-0 ovn_controller[95610]: 2025-12-05T12:04:55Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ea:04:47 10.100.0.5
Dec 05 12:04:55 compute-0 ovn_controller[95610]: 2025-12-05T12:04:55Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ea:04:47 10.100.0.5
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.856 187212 DEBUG oslo_concurrency.processutils [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpcfnwjd03/022f126783a5493cb0daa9d9eb565f3e.delta /var/lib/nova/instances/snapshots/tmpcfnwjd03/022f126783a5493cb0daa9d9eb565f3e" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.862 187212 INFO nova.virt.libvirt.driver [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Snapshot extracted, beginning image upload
Dec 05 12:04:55 compute-0 ovn_controller[95610]: 2025-12-05T12:04:55Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:41:de:18 10.100.0.14
Dec 05 12:04:55 compute-0 ovn_controller[95610]: 2025-12-05T12:04:55Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:41:de:18 10.100.0.14
Dec 05 12:04:55 compute-0 podman[224191]: 2025-12-05 12:04:55.959104574 +0000 UTC m=+0.060375787 container create ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.976 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936295.9761298, 00262d23-bf60-44d9-a775-63ba32adaf96 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.977 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] VM Started (Lifecycle Event)
Dec 05 12:04:55 compute-0 nova_compute[187208]: 2025-12-05 12:04:55.995 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:56 compute-0 nova_compute[187208]: 2025-12-05 12:04:56.000 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936295.9763377, 00262d23-bf60-44d9-a775-63ba32adaf96 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:56 compute-0 nova_compute[187208]: 2025-12-05 12:04:56.000 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] VM Paused (Lifecycle Event)
Dec 05 12:04:56 compute-0 systemd[1]: Started libpod-conmon-ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425.scope.
Dec 05 12:04:56 compute-0 podman[224191]: 2025-12-05 12:04:55.932402886 +0000 UTC m=+0.033674159 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:04:56 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:04:56 compute-0 nova_compute[187208]: 2025-12-05 12:04:56.029 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c55979a754522ad6027e1e7c2694ea99c50ceaa6761d501a27544f4e299bfdab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:04:56 compute-0 nova_compute[187208]: 2025-12-05 12:04:56.038 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:56 compute-0 podman[224191]: 2025-12-05 12:04:56.045845977 +0000 UTC m=+0.147117210 container init ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 05 12:04:56 compute-0 podman[224191]: 2025-12-05 12:04:56.051260803 +0000 UTC m=+0.152532016 container start ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:04:56 compute-0 nova_compute[187208]: 2025-12-05 12:04:56.062 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:56 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[224206]: [NOTICE]   (224210) : New worker (224212) forked
Dec 05 12:04:56 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[224206]: [NOTICE]   (224210) : Loading success.
Dec 05 12:04:57 compute-0 nova_compute[187208]: 2025-12-05 12:04:57.590 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:57 compute-0 nova_compute[187208]: 2025-12-05 12:04:57.775 187212 DEBUG nova.network.neutron [req-c157b05b-2f2d-4880-b3cc-0926edd53963 req-6325e6ef-8a04-40cd-94f3-1ed60b34d498 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Updated VIF entry in instance network info cache for port ff2850e9-aaf4-4f4e-a323-24b258a0b4c7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:04:57 compute-0 nova_compute[187208]: 2025-12-05 12:04:57.775 187212 DEBUG nova.network.neutron [req-c157b05b-2f2d-4880-b3cc-0926edd53963 req-6325e6ef-8a04-40cd-94f3-1ed60b34d498 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Updating instance_info_cache with network_info: [{"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:04:57 compute-0 nova_compute[187208]: 2025-12-05 12:04:57.798 187212 DEBUG oslo_concurrency.lockutils [req-c157b05b-2f2d-4880-b3cc-0926edd53963 req-6325e6ef-8a04-40cd-94f3-1ed60b34d498 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-00262d23-bf60-44d9-a775-63ba32adaf96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.348 187212 DEBUG nova.compute.manager [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Received event network-vif-deleted-009ccacf-51c2-430a-9da2-4a4b522861e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.348 187212 DEBUG nova.compute.manager [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received event network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.348 187212 DEBUG oslo_concurrency.lockutils [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.348 187212 DEBUG oslo_concurrency.lockutils [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.348 187212 DEBUG oslo_concurrency.lockutils [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.349 187212 DEBUG nova.compute.manager [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Processing event network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.349 187212 DEBUG nova.compute.manager [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received event network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.349 187212 DEBUG oslo_concurrency.lockutils [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.349 187212 DEBUG oslo_concurrency.lockutils [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.349 187212 DEBUG oslo_concurrency.lockutils [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.349 187212 DEBUG nova.compute.manager [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] No waiting events found dispatching network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.350 187212 WARNING nova.compute.manager [req-1fe90744-30ed-43ae-88ff-7c765d014760 req-30dfe4fd-1d69-490f-8f79-3ebc022f85a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received unexpected event network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 for instance with vm_state building and task_state spawning.
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.350 187212 DEBUG nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.354 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936298.3538907, 00262d23-bf60-44d9-a775-63ba32adaf96 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.354 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] VM Resumed (Lifecycle Event)
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.356 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.359 187212 INFO nova.virt.libvirt.driver [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Instance spawned successfully.
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.360 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.374 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.380 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.382 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.382 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.383 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.383 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.384 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.384 187212 DEBUG nova.virt.libvirt.driver [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.414 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.443 187212 INFO nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Took 9.61 seconds to spawn the instance on the hypervisor.
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.444 187212 DEBUG nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.502 187212 INFO nova.compute.manager [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Took 10.26 seconds to build instance.
Dec 05 12:04:58 compute-0 nova_compute[187208]: 2025-12-05 12:04:58.519 187212 DEBUG oslo_concurrency.lockutils [None req-0970ec8a-37c2-4d1f-ad6f-29341e3c5e52 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.003 187212 DEBUG oslo_concurrency.lockutils [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.004 187212 DEBUG oslo_concurrency.lockutils [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.004 187212 DEBUG oslo_concurrency.lockutils [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.004 187212 DEBUG oslo_concurrency.lockutils [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.005 187212 DEBUG oslo_concurrency.lockutils [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.006 187212 INFO nova.compute.manager [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Terminating instance
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.007 187212 DEBUG nova.compute.manager [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:04:59 compute-0 kernel: tapea8794b1-8d (unregistering): left promiscuous mode
Dec 05 12:04:59 compute-0 NetworkManager[55691]: <info>  [1764936299.0435] device (tapea8794b1-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.044 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:59 compute-0 ovn_controller[95610]: 2025-12-05T12:04:59Z|00408|binding|INFO|Releasing lport ea8794b1-8d29-4839-af08-e1675802ea0a from this chassis (sb_readonly=0)
Dec 05 12:04:59 compute-0 ovn_controller[95610]: 2025-12-05T12:04:59Z|00409|binding|INFO|Setting lport ea8794b1-8d29-4839-af08-e1675802ea0a down in Southbound
Dec 05 12:04:59 compute-0 ovn_controller[95610]: 2025-12-05T12:04:59Z|00410|binding|INFO|Removing iface tapea8794b1-8d ovn-installed in OVS
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.057 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:21:a9 10.100.0.3'], port_security=['fa:16:3e:58:21:a9 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ea8794b1-8d29-4839-af08-e1675802ea0a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.059 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ea8794b1-8d29-4839-af08-e1675802ea0a in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.062 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.063 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa35dae0-803e-40b2-a824-3f577dbe562f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.064 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.064 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.069 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:59 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000028.scope: Deactivated successfully.
Dec 05 12:04:59 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000028.scope: Consumed 14.001s CPU time.
Dec 05 12:04:59 compute-0 systemd-machined[153543]: Machine qemu-46-instance-00000028 terminated.
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.183 187212 DEBUG nova.objects.instance [None req-4201f7f6-9fb3-488b-a16d-953ec07fc479 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid 21873f07-a1da-4158-a5b2-1d44d547874e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.205 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936299.2052639, 21873f07-a1da-4158-a5b2-1d44d547874e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.205 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] VM Paused (Lifecycle Event)
Dec 05 12:04:59 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[222505]: [NOTICE]   (222509) : haproxy version is 2.8.14-c23fe91
Dec 05 12:04:59 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[222505]: [NOTICE]   (222509) : path to executable is /usr/sbin/haproxy
Dec 05 12:04:59 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[222505]: [WARNING]  (222509) : Exiting Master process...
Dec 05 12:04:59 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[222505]: [ALERT]    (222509) : Current worker (222511) exited with code 143 (Terminated)
Dec 05 12:04:59 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[222505]: [WARNING]  (222509) : All workers exited. Exiting... (0)
Dec 05 12:04:59 compute-0 systemd[1]: libpod-c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589.scope: Deactivated successfully.
Dec 05 12:04:59 compute-0 conmon[222505]: conmon c4f96acfcec305d90bd6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589.scope/container/memory.events
Dec 05 12:04:59 compute-0 podman[224242]: 2025-12-05 12:04:59.2284082 +0000 UTC m=+0.053207811 container died c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.239 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.249 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:04:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589-userdata-shm.mount: Deactivated successfully.
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.272 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] During sync_power_state the instance has a pending task (suspending). Skip.
Dec 05 12:04:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-5b7c90b66f75d352f77a65a6e2c60491d5ded526586a774fd4cd500b1acf38ae-merged.mount: Deactivated successfully.
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.281 187212 INFO nova.virt.libvirt.driver [-] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Instance destroyed successfully.
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.282 187212 DEBUG nova.objects.instance [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.296 187212 DEBUG nova.virt.libvirt.vif [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-404632133',display_name='tempest-ImagesTestJSON-server-404632133',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-404632133',id=40,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:04:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-d69nje92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:34Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=e5212ff3-c6ed-4f02-99c4-becad0e5f2a5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.296 187212 DEBUG nova.network.os_vif_util [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.297 187212 DEBUG nova.network.os_vif_util [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.297 187212 DEBUG os_vif [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.299 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.299 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea8794b1-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.302 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.305 187212 INFO os_vif [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d')
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.306 187212 INFO nova.virt.libvirt.driver [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Deleting instance files /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5_del
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.306 187212 INFO nova.virt.libvirt.driver [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Deletion of /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5_del complete
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.332 187212 DEBUG nova.compute.manager [req-d58b1cb7-2c12-4b40-8716-61a6e3e32be0 req-b32ccdfd-cab9-4a05-8014-f6aa71d8c556 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received event network-vif-unplugged-ea8794b1-8d29-4839-af08-e1675802ea0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.332 187212 DEBUG oslo_concurrency.lockutils [req-d58b1cb7-2c12-4b40-8716-61a6e3e32be0 req-b32ccdfd-cab9-4a05-8014-f6aa71d8c556 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.332 187212 DEBUG oslo_concurrency.lockutils [req-d58b1cb7-2c12-4b40-8716-61a6e3e32be0 req-b32ccdfd-cab9-4a05-8014-f6aa71d8c556 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.332 187212 DEBUG oslo_concurrency.lockutils [req-d58b1cb7-2c12-4b40-8716-61a6e3e32be0 req-b32ccdfd-cab9-4a05-8014-f6aa71d8c556 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.333 187212 DEBUG nova.compute.manager [req-d58b1cb7-2c12-4b40-8716-61a6e3e32be0 req-b32ccdfd-cab9-4a05-8014-f6aa71d8c556 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] No waiting events found dispatching network-vif-unplugged-ea8794b1-8d29-4839-af08-e1675802ea0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.333 187212 DEBUG nova.compute.manager [req-d58b1cb7-2c12-4b40-8716-61a6e3e32be0 req-b32ccdfd-cab9-4a05-8014-f6aa71d8c556 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received event network-vif-unplugged-ea8794b1-8d29-4839-af08-e1675802ea0a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.355 187212 INFO nova.compute.manager [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.356 187212 DEBUG oslo.service.loopingcall [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.356 187212 DEBUG nova.compute.manager [-] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.356 187212 DEBUG nova.network.neutron [-] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.524 187212 INFO nova.virt.libvirt.driver [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Snapshot image upload complete
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.525 187212 INFO nova.compute.manager [None req-25e97d69-6739-44b3-9dda-5b3828e5a3cb 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Took 5.78 seconds to snapshot the instance on the hypervisor.
Dec 05 12:04:59 compute-0 podman[224242]: 2025-12-05 12:04:59.815491948 +0000 UTC m=+0.640291549 container cleanup c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:04:59 compute-0 systemd[1]: libpod-conmon-c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589.scope: Deactivated successfully.
Dec 05 12:04:59 compute-0 kernel: tapc4a66ea2-9b (unregistering): left promiscuous mode
Dec 05 12:04:59 compute-0 NetworkManager[55691]: <info>  [1764936299.8307] device (tapc4a66ea2-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:59 compute-0 ovn_controller[95610]: 2025-12-05T12:04:59Z|00411|binding|INFO|Releasing lport c4a66ea2-9b1b-486a-a750-17072882c42e from this chassis (sb_readonly=0)
Dec 05 12:04:59 compute-0 ovn_controller[95610]: 2025-12-05T12:04:59Z|00412|binding|INFO|Setting lport c4a66ea2-9b1b-486a-a750-17072882c42e down in Southbound
Dec 05 12:04:59 compute-0 ovn_controller[95610]: 2025-12-05T12:04:59Z|00413|binding|INFO|Removing iface tapc4a66ea2-9b ovn-installed in OVS
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.864 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.888 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:59 compute-0 podman[224292]: 2025-12-05 12:04:59.899462632 +0000 UTC m=+0.057711570 container remove c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.904 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c3371938-4258-4152-a957-942f8bca86d0]: (4, ('Fri Dec  5 12:04:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589)\nc4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589\nFri Dec  5 12:04:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589)\nc4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.906 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a02e464d-1b53-4bdf-9b4d-16d6a433ad1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.907 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.909 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:59 compute-0 kernel: tap41b3b495-c0: left promiscuous mode
Dec 05 12:04:59 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000033.scope: Deactivated successfully.
Dec 05 12:04:59 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000033.scope: Consumed 4.269s CPU time.
Dec 05 12:04:59 compute-0 systemd-machined[153543]: Machine qemu-55-instance-00000033 terminated.
Dec 05 12:04:59 compute-0 nova_compute[187208]: 2025-12-05 12:04:59.926 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.929 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3734d9d5-fea7-4372-a902-1de91fc36c22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:59 compute-0 podman[224294]: 2025-12-05 12:04:59.945527316 +0000 UTC m=+0.079878477 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.948 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c87bca74-a80b-47e7-a7b0-b7333a8e4ec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.949 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aa8b3c83-8c29-421d-98e2-6f248ecf6949]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:59 compute-0 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.963 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4e89dfc2-4cbc-4f78-ae71-8377b250adbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363898, 'reachable_time': 16510, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224358, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.965 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:04:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:04:59.965 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec47f0b-12d7-4019-99aa-1317e1019e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:04:59 compute-0 podman[224302]: 2025-12-05 12:04:59.974510669 +0000 UTC m=+0.113873034 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.080 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:70:07 10.100.0.9'], port_security=['fa:16:3e:b7:70:07 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '21873f07-a1da-4158-a5b2-1d44d547874e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c4a66ea2-9b1b-486a-a750-17072882c42e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.081 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c4a66ea2-9b1b-486a-a750-17072882c42e in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 unbound from our chassis
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.083 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.085 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1689ca7a-b9a9-4575-8e6b-dfd0efac0ba8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.085 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace which is not needed anymore
Dec 05 12:05:00 compute-0 nova_compute[187208]: 2025-12-05 12:05:00.095 187212 DEBUG nova.compute.manager [None req-4201f7f6-9fb3-488b-a16d-953ec07fc479 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:00 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[223952]: [NOTICE]   (223956) : haproxy version is 2.8.14-c23fe91
Dec 05 12:05:00 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[223952]: [NOTICE]   (223956) : path to executable is /usr/sbin/haproxy
Dec 05 12:05:00 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[223952]: [WARNING]  (223956) : Exiting Master process...
Dec 05 12:05:00 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[223952]: [ALERT]    (223956) : Current worker (223958) exited with code 143 (Terminated)
Dec 05 12:05:00 compute-0 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[223952]: [WARNING]  (223956) : All workers exited. Exiting... (0)
Dec 05 12:05:00 compute-0 systemd[1]: libpod-58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02.scope: Deactivated successfully.
Dec 05 12:05:00 compute-0 podman[224396]: 2025-12-05 12:05:00.226814972 +0000 UTC m=+0.046595381 container died 58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:05:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02-userdata-shm.mount: Deactivated successfully.
Dec 05 12:05:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-80661869de99bb41bbf0b1d44f502ba85da6c00c2f0b6e9a79286957b69b3345-merged.mount: Deactivated successfully.
Dec 05 12:05:00 compute-0 podman[224396]: 2025-12-05 12:05:00.266332458 +0000 UTC m=+0.086112847 container cleanup 58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 05 12:05:00 compute-0 systemd[1]: libpod-conmon-58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02.scope: Deactivated successfully.
Dec 05 12:05:00 compute-0 podman[224422]: 2025-12-05 12:05:00.336256968 +0000 UTC m=+0.051460250 container remove 58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.342 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35332e59-1268-4f24-9931-067ea14ad084]: (4, ('Fri Dec  5 12:05:00 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02)\n58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02\nFri Dec  5 12:05:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02)\n58c827002b8b504e77fe5dfe98e5606345688944f0b1e00e1e1de0ccd5b1cd02\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.343 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3a124b5e-9c19-4e06-8bd3-4d43785ee1a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.344 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:00 compute-0 nova_compute[187208]: 2025-12-05 12:05:00.377 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:00 compute-0 kernel: tapd7360f84-b0: left promiscuous mode
Dec 05 12:05:00 compute-0 nova_compute[187208]: 2025-12-05 12:05:00.395 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.397 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4438f5f0-1bf0-499a-9c9c-bf2f3c5317d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.409 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4c66bb-06b0-4915-912b-05ce79583e54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.410 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0aebe4ae-a8bc-435c-ab66-0fb4e17ab1f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.423 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d80bcf67-94f7-43fd-8d63-63cc454c6fec]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367180, 'reachable_time': 35355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224440, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.425 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:05:00 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:00.425 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[9a04e2a7-10e0-4d75-b10e-0a6707edbd9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:00 compute-0 systemd[1]: run-netns-ovnmeta\x2dd7360f84\x2dbcd5\x2d4e64\x2dbf43\x2d1fdbd8215a70.mount: Deactivated successfully.
Dec 05 12:05:00 compute-0 nova_compute[187208]: 2025-12-05 12:05:00.689 187212 DEBUG nova.network.neutron [-] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:00 compute-0 nova_compute[187208]: 2025-12-05 12:05:00.705 187212 INFO nova.compute.manager [-] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Took 1.35 seconds to deallocate network for instance.
Dec 05 12:05:00 compute-0 nova_compute[187208]: 2025-12-05 12:05:00.768 187212 DEBUG oslo_concurrency.lockutils [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:00 compute-0 nova_compute[187208]: 2025-12-05 12:05:00.769 187212 DEBUG oslo_concurrency.lockutils [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:00 compute-0 nova_compute[187208]: 2025-12-05 12:05:00.977 187212 DEBUG nova.compute.provider_tree [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.009 187212 DEBUG nova.scheduler.client.report [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.035 187212 DEBUG oslo_concurrency.lockutils [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.062 187212 INFO nova.scheduler.client.report [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance e5212ff3-c6ed-4f02-99c4-becad0e5f2a5
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.133 187212 DEBUG oslo_concurrency.lockutils [None req-3b01a008-20f2-4f84-98cb-589625eee4a0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.687 187212 DEBUG nova.compute.manager [req-a8f00ec2-fa06-4c5d-8918-8d45e3452966 req-53783493-a5c1-4946-b34d-00057c1cf994 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received event network-vif-deleted-ea8794b1-8d29-4839-af08-e1675802ea0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.948 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.949 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.949 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.949 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.950 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.951 187212 INFO nova.compute.manager [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Terminating instance
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.952 187212 DEBUG nova.compute.manager [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:05:01 compute-0 kernel: tapff2850e9-aa (unregistering): left promiscuous mode
Dec 05 12:05:01 compute-0 NetworkManager[55691]: <info>  [1764936301.9720] device (tapff2850e9-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.980 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:01 compute-0 ovn_controller[95610]: 2025-12-05T12:05:01Z|00414|binding|INFO|Releasing lport ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 from this chassis (sb_readonly=0)
Dec 05 12:05:01 compute-0 ovn_controller[95610]: 2025-12-05T12:05:01Z|00415|binding|INFO|Setting lport ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 down in Southbound
Dec 05 12:05:01 compute-0 ovn_controller[95610]: 2025-12-05T12:05:01Z|00416|binding|INFO|Removing iface tapff2850e9-aa ovn-installed in OVS
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.984 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:01.994 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:70:a1 10.100.0.12'], port_security=['fa:16:3e:f2:70:a1 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '00262d23-bf60-44d9-a775-63ba32adaf96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d064000-316c-46a7-a23c-1dc26318b6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79895287bd1d488c842f6013729a1f81', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e1ec2415-6840-4cf9-b5ac-efaf1a9c9a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3804b014-203a-4c47-b0bb-7634579c4ec4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ff2850e9-aaf4-4f4e-a323-24b258a0b4c7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:01 compute-0 nova_compute[187208]: 2025-12-05 12:05:01.994 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:01.995 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 in datapath 5d064000-316c-46a7-a23c-1dc26318b6a4 unbound from our chassis
Dec 05 12:05:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:01.997 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d064000-316c-46a7-a23c-1dc26318b6a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:05:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:01.997 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[29cb76cc-2408-4057-b53a-8b7817640fd8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:01.998 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 namespace which is not needed anymore
Dec 05 12:05:02 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000034.scope: Deactivated successfully.
Dec 05 12:05:02 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000034.scope: Consumed 4.300s CPU time.
Dec 05 12:05:02 compute-0 systemd-machined[153543]: Machine qemu-56-instance-00000034 terminated.
Dec 05 12:05:02 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[224206]: [NOTICE]   (224210) : haproxy version is 2.8.14-c23fe91
Dec 05 12:05:02 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[224206]: [NOTICE]   (224210) : path to executable is /usr/sbin/haproxy
Dec 05 12:05:02 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[224206]: [WARNING]  (224210) : Exiting Master process...
Dec 05 12:05:02 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[224206]: [ALERT]    (224210) : Current worker (224212) exited with code 143 (Terminated)
Dec 05 12:05:02 compute-0 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[224206]: [WARNING]  (224210) : All workers exited. Exiting... (0)
Dec 05 12:05:02 compute-0 systemd[1]: libpod-ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425.scope: Deactivated successfully.
Dec 05 12:05:02 compute-0 podman[224464]: 2025-12-05 12:05:02.121693926 +0000 UTC m=+0.043796800 container died ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 12:05:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425-userdata-shm.mount: Deactivated successfully.
Dec 05 12:05:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-c55979a754522ad6027e1e7c2694ea99c50ceaa6761d501a27544f4e299bfdab-merged.mount: Deactivated successfully.
Dec 05 12:05:02 compute-0 podman[224464]: 2025-12-05 12:05:02.153051608 +0000 UTC m=+0.075154462 container cleanup ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:05:02 compute-0 systemd[1]: libpod-conmon-ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425.scope: Deactivated successfully.
Dec 05 12:05:02 compute-0 podman[224496]: 2025-12-05 12:05:02.211273762 +0000 UTC m=+0.040273239 container remove ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.214 187212 INFO nova.virt.libvirt.driver [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Instance destroyed successfully.
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.215 187212 DEBUG nova.objects.instance [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'resources' on Instance uuid 00262d23-bf60-44d9-a775-63ba32adaf96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:02.218 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[add7fe9b-3893-4dfb-9ce4-14e59d1b3f6d]: (4, ('Fri Dec  5 12:05:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 (ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425)\nee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425\nFri Dec  5 12:05:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 (ee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425)\nee223d94277f45d264b1ac1808d6610f6f10fde0465f4885ae86c5300056e425\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:02.220 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6568e5d9-e21a-4eef-897b-7d81e0565cb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:02.221 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d064000-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.222 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:02 compute-0 kernel: tap5d064000-30: left promiscuous mode
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.235 187212 DEBUG nova.virt.libvirt.vif [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1524934604',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1524934604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1524934604',id=52,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:04:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-z07pcike',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:58Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=00262d23-bf60-44d9-a775-63ba32adaf96,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.235 187212 DEBUG nova.network.os_vif_util [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "address": "fa:16:3e:f2:70:a1", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2850e9-aa", "ovs_interfaceid": "ff2850e9-aaf4-4f4e-a323-24b258a0b4c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.236 187212 DEBUG nova.network.os_vif_util [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:70:a1,bridge_name='br-int',has_traffic_filtering=True,id=ff2850e9-aaf4-4f4e-a323-24b258a0b4c7,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2850e9-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.236 187212 DEBUG os_vif [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:70:a1,bridge_name='br-int',has_traffic_filtering=True,id=ff2850e9-aaf4-4f4e-a323-24b258a0b4c7,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2850e9-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.238 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.238 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff2850e9-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.242 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:05:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:02.242 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3cfc87ad-f0af-4d76-8fae-8abe12e0708d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.245 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.247 187212 INFO os_vif [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:70:a1,bridge_name='br-int',has_traffic_filtering=True,id=ff2850e9-aaf4-4f4e-a323-24b258a0b4c7,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2850e9-aa')
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.248 187212 INFO nova.virt.libvirt.driver [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Deleting instance files /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96_del
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.249 187212 INFO nova.virt.libvirt.driver [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Deletion of /var/lib/nova/instances/00262d23-bf60-44d9-a775-63ba32adaf96_del complete
Dec 05 12:05:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:02.255 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fab43840-643a-43f7-a2b0-4547b0fb1693]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:02.256 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6e365bb3-4e7f-489f-96c1-8477bdc6c005]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:02.271 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff5879e-17c1-4bdd-8507-c47b057cefe1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367578, 'reachable_time': 38283, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224534, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:02.273 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:05:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:02.273 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[9f4ddfa3-08e3-4d9a-8b49-fe5dd0f73ad2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d5d064000\x2d316c\x2d46a7\x2da23c\x2d1dc26318b6a4.mount: Deactivated successfully.
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.297 187212 INFO nova.compute.manager [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.298 187212 DEBUG oslo.service.loopingcall [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.298 187212 DEBUG nova.compute.manager [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.298 187212 DEBUG nova.network.neutron [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.591 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.634 187212 DEBUG nova.compute.manager [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received event network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.634 187212 DEBUG oslo_concurrency.lockutils [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.634 187212 DEBUG oslo_concurrency.lockutils [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.634 187212 DEBUG oslo_concurrency.lockutils [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.635 187212 DEBUG nova.compute.manager [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] No waiting events found dispatching network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.635 187212 WARNING nova.compute.manager [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received unexpected event network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a for instance with vm_state deleted and task_state None.
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.635 187212 DEBUG nova.compute.manager [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received event network-vif-unplugged-c4a66ea2-9b1b-486a-a750-17072882c42e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.635 187212 DEBUG oslo_concurrency.lockutils [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.635 187212 DEBUG oslo_concurrency.lockutils [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.636 187212 DEBUG oslo_concurrency.lockutils [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.636 187212 DEBUG nova.compute.manager [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] No waiting events found dispatching network-vif-unplugged-c4a66ea2-9b1b-486a-a750-17072882c42e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.636 187212 WARNING nova.compute.manager [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received unexpected event network-vif-unplugged-c4a66ea2-9b1b-486a-a750-17072882c42e for instance with vm_state suspended and task_state None.
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.636 187212 DEBUG nova.compute.manager [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received event network-vif-plugged-c4a66ea2-9b1b-486a-a750-17072882c42e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.636 187212 DEBUG oslo_concurrency.lockutils [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.636 187212 DEBUG oslo_concurrency.lockutils [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.637 187212 DEBUG oslo_concurrency.lockutils [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.637 187212 DEBUG nova.compute.manager [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] No waiting events found dispatching network-vif-plugged-c4a66ea2-9b1b-486a-a750-17072882c42e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.637 187212 WARNING nova.compute.manager [req-72d8fd0e-7765-4be0-97c3-ab61d92b8d94 req-a660fd47-af35-433f-8d8a-8431f8c80dcf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received unexpected event network-vif-plugged-c4a66ea2-9b1b-486a-a750-17072882c42e for instance with vm_state suspended and task_state None.
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.718 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936287.7182927, 10048ac5-1fbc-45e6-aa94-01eff87b9ffc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.719 187212 INFO nova.compute.manager [-] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] VM Stopped (Lifecycle Event)
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.750 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.751 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.752 187212 DEBUG nova.compute.manager [None req-fc7b8364-cd2b-4879-b965-810ef45cfd1b - - - - - -] [instance: 10048ac5-1fbc-45e6-aa94-01eff87b9ffc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:02 compute-0 nova_compute[187208]: 2025-12-05 12:05:02.783 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:05:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:03.011 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:03.012 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:03.013 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.034 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.034 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.041 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.041 187212 INFO nova.compute.claims [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.119 187212 DEBUG nova.network.neutron [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.136 187212 INFO nova.compute.manager [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Took 0.84 seconds to deallocate network for instance.
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.191 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.259 187212 DEBUG nova.compute.provider_tree [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.276 187212 DEBUG nova.scheduler.client.report [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:03 compute-0 rsyslogd[1004]: imjournal: 5109 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.304 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.305 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.307 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.352 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.353 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.370 187212 INFO nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.387 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.458 187212 DEBUG nova.compute.provider_tree [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.478 187212 DEBUG nova.scheduler.client.report [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.484 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.485 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.485 187212 INFO nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Creating image(s)
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.486 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.486 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.487 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.499 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.502 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.526 187212 INFO nova.scheduler.client.report [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Deleted allocations for instance 00262d23-bf60-44d9-a775-63ba32adaf96
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.569 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.570 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.571 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.582 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.606 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.658 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.659 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.700 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.703 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.703 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.728 187212 DEBUG nova.policy [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.767 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.768 187212 DEBUG nova.virt.disk.api [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.768 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.790 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.791 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.791 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.791 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.792 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.794 187212 INFO nova.compute.manager [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Terminating instance
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.795 187212 DEBUG nova.compute.manager [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.803 187212 INFO nova.virt.libvirt.driver [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Instance destroyed successfully.
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.804 187212 DEBUG nova.objects.instance [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'resources' on Instance uuid 21873f07-a1da-4158-a5b2-1d44d547874e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.815 187212 DEBUG nova.virt.libvirt.vif [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1892322843',display_name='tempest-DeleteServersTestJSON-server-1892322843',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1892322843',id=51,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:04:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-javbhuzq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:05:00Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=21873f07-a1da-4158-a5b2-1d44d547874e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.816 187212 DEBUG nova.network.os_vif_util [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.817 187212 DEBUG nova.network.os_vif_util [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.817 187212 DEBUG os_vif [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.819 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4a66ea2-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.821 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.826 187212 INFO os_vif [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b')
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.826 187212 INFO nova.virt.libvirt.driver [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Deleting instance files /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e_del
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.827 187212 INFO nova.virt.libvirt.driver [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Deletion of /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e_del complete
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.830 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.831 187212 DEBUG nova.virt.disk.api [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.831 187212 DEBUG nova.objects.instance [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid ed7b6780-872e-41ef-a0c7-c48d0d6d13fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.853 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.854 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Ensure instance console log exists: /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.854 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.854 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.854 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.887 187212 INFO nova.compute.manager [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Took 0.09 seconds to destroy the instance on the hypervisor.
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.887 187212 DEBUG oslo.service.loopingcall [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.887 187212 DEBUG nova.compute.manager [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:05:03 compute-0 nova_compute[187208]: 2025-12-05 12:05:03.888 187212 DEBUG nova.network.neutron [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.013 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received event network-vif-unplugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.013 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.013 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.013 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] No waiting events found dispatching network-vif-unplugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 WARNING nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received unexpected event network-vif-unplugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 for instance with vm_state deleted and task_state None.
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received event network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.015 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] No waiting events found dispatching network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.015 187212 WARNING nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received unexpected event network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 for instance with vm_state deleted and task_state None.
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.015 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received event network-vif-deleted-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.188 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.189 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.189 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.190 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.190 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.191 187212 INFO nova.compute.manager [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Terminating instance
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.191 187212 DEBUG nova.compute.manager [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:05:04 compute-0 kernel: tapb7157ade-85 (unregistering): left promiscuous mode
Dec 05 12:05:04 compute-0 podman[224550]: 2025-12-05 12:05:04.214266234 +0000 UTC m=+0.062968361 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 12:05:04 compute-0 NetworkManager[55691]: <info>  [1764936304.2171] device (tapb7157ade-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:05:04 compute-0 ovn_controller[95610]: 2025-12-05T12:05:04Z|00417|binding|INFO|Releasing lport b7157ade-85e0-4802-8d6a-0dfb86921b3a from this chassis (sb_readonly=0)
Dec 05 12:05:04 compute-0 ovn_controller[95610]: 2025-12-05T12:05:04Z|00418|binding|INFO|Setting lport b7157ade-85e0-4802-8d6a-0dfb86921b3a down in Southbound
Dec 05 12:05:04 compute-0 ovn_controller[95610]: 2025-12-05T12:05:04Z|00419|binding|INFO|Removing iface tapb7157ade-85 ovn-installed in OVS
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.223 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.231 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:04:47 10.100.0.5'], port_security=['fa:16:3e:ea:04:47 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c6a957dd-2181-4e92-9e06-e1a15fe5c307', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b7157ade-85e0-4802-8d6a-0dfb86921b3a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.232 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b7157ade-85e0-4802-8d6a-0dfb86921b3a in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 unbound from our chassis
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.234 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.237 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.251 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4e05d2-f685-4fa7-b10f-b5cb31094e73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:04 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000030.scope: Deactivated successfully.
Dec 05 12:05:04 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000030.scope: Consumed 12.997s CPU time.
Dec 05 12:05:04 compute-0 systemd-machined[153543]: Machine qemu-53-instance-00000030 terminated.
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.284 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf16a78-bfb7-4d11-8158-fbe2d28fc834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.287 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[feda4205-7c55-4970-96fc-efa747157ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.316 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfd177f-ecb1-45b8-a560-8c373eded33f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.335 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[32c96826-58f9-4ee2-b8f2-ed61684251de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365754, 'reachable_time': 25183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224579, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.354 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a23252d7-efd4-4d94-a0b0-efc49bbea9ff]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365765, 'tstamp': 365765}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224580, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365769, 'tstamp': 365769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224580, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.356 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.357 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.362 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.362 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dd8ae79-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.362 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.363 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2dd8ae79-a0, col_values=(('external_ids', {'iface-id': '96c6c9a6-c871-4fab-9fdc-eedbdd230979'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.363 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.417 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.422 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.469 187212 INFO nova.virt.libvirt.driver [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Instance destroyed successfully.
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.470 187212 DEBUG nova.objects.instance [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'resources' on Instance uuid c6a957dd-2181-4e92-9e06-e1a15fe5c307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.485 187212 DEBUG nova.virt.libvirt.vif [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-2',id=48,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-12-05T12:04:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:42Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=c6a957dd-2181-4e92-9e06-e1a15fe5c307,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.485 187212 DEBUG nova.network.os_vif_util [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.486 187212 DEBUG nova.network.os_vif_util [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.486 187212 DEBUG os_vif [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.488 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.488 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7157ade-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.490 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.493 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.559 187212 INFO os_vif [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85')
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.560 187212 INFO nova.virt.libvirt.driver [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Deleting instance files /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307_del
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.560 187212 INFO nova.virt.libvirt.driver [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Deletion of /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307_del complete
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.623 187212 INFO nova.compute.manager [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Took 0.43 seconds to destroy the instance on the hypervisor.
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.623 187212 DEBUG oslo.service.loopingcall [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.624 187212 DEBUG nova.compute.manager [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:05:04 compute-0 nova_compute[187208]: 2025-12-05 12:05:04.624 187212 DEBUG nova.network.neutron [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.078 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Successfully created port: db2c3297-b6c8-4933-9328-102d81d6faa3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.125 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.126 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.126 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.127 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.127 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.128 187212 INFO nova.compute.manager [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Terminating instance
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.130 187212 DEBUG nova.compute.manager [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:05:05 compute-0 kernel: tap83fb1d43-a4 (unregistering): left promiscuous mode
Dec 05 12:05:05 compute-0 NetworkManager[55691]: <info>  [1764936305.1531] device (tap83fb1d43-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00420|binding|INFO|Releasing lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 from this chassis (sb_readonly=0)
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00421|binding|INFO|Setting lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 down in Southbound
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.158 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00422|binding|INFO|Removing iface tap83fb1d43-a4 ovn-installed in OVS
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00423|binding|INFO|Releasing lport 96c6c9a6-c871-4fab-9fdc-eedbdd230979 from this chassis (sb_readonly=0)
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.170 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:de:18 10.100.0.14'], port_security=['fa:16:3e:41:de:18 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '97020786-7ba5-4c8b-8a2c-838c0f663bb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=83fb1d43-a495-47f4-ad3a-569fd7c02c76) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.171 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 83fb1d43-a495-47f4-ad3a-569fd7c02c76 in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 unbound from our chassis
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.173 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.175 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f0bd38aa-6b7a-4055-b3ef-81352fea8033]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.176 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 namespace which is not needed anymore
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [NOTICE]   (223483) : haproxy version is 2.8.14-c23fe91
Dec 05 12:05:05 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [NOTICE]   (223483) : path to executable is /usr/sbin/haproxy
Dec 05 12:05:05 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [ALERT]    (223483) : Current worker (223485) exited with code 143 (Terminated)
Dec 05 12:05:05 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [WARNING]  (223483) : All workers exited. Exiting... (0)
Dec 05 12:05:05 compute-0 systemd[1]: libpod-787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da.scope: Deactivated successfully.
Dec 05 12:05:05 compute-0 conmon[223479]: conmon 787283fd7dd42037005d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da.scope/container/memory.events
Dec 05 12:05:05 compute-0 podman[224620]: 2025-12-05 12:05:05.347812652 +0000 UTC m=+0.072424953 container died 787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:05:05 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000031.scope: Deactivated successfully.
Dec 05 12:05:05 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000031.scope: Consumed 13.619s CPU time.
Dec 05 12:05:05 compute-0 systemd-machined[153543]: Machine qemu-52-instance-00000031 terminated.
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.379 187212 DEBUG nova.network.neutron [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da-userdata-shm.mount: Deactivated successfully.
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.381 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-dc61896ea332a9ba7458de04edb53befc42cb9ab21a97cb2cedca198eb888fcf-merged.mount: Deactivated successfully.
Dec 05 12:05:05 compute-0 podman[224620]: 2025-12-05 12:05:05.404410399 +0000 UTC m=+0.129022680 container cleanup 787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:05:05 compute-0 systemd[1]: libpod-conmon-787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da.scope: Deactivated successfully.
Dec 05 12:05:05 compute-0 podman[224647]: 2025-12-05 12:05:05.472548698 +0000 UTC m=+0.046938951 container remove 787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.478 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ec34fe-1b73-415e-b5b9-7c0513be2015]: (4, ('Fri Dec  5 12:05:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 (787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da)\n787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da\nFri Dec  5 12:05:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 (787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da)\n787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.480 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[45232b24-036c-4720-b01b-69f61edc9558]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.481 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.482 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 kernel: tap2dd8ae79-a0: left promiscuous mode
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.498 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.501 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0e141c-1c0b-4ea5-8812-0c670ca75a4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.516 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[df67ccac-9883-4221-a984-22902db6eda1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.517 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed39343-9497-489f-b776-cc900d2c12c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.533 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f19f61-848b-4db6-af0d-8b2ef63a86f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365741, 'reachable_time': 18090, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224667, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d2dd8ae79\x2da0f0\x2d469c\x2d86de\x2da9a5d5b69f75.mount: Deactivated successfully.
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.536 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.537 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c3bd02f1-4c5c-41ea-9b4d-d2f0c618b36c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 kernel: tap83fb1d43-a4: entered promiscuous mode
Dec 05 12:05:05 compute-0 NetworkManager[55691]: <info>  [1764936305.5499] manager: (tap83fb1d43-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Dec 05 12:05:05 compute-0 kernel: tap83fb1d43-a4 (unregistering): left promiscuous mode
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.551 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00424|binding|INFO|Claiming lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 for this chassis.
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00425|binding|INFO|83fb1d43-a495-47f4-ad3a-569fd7c02c76: Claiming fa:16:3e:41:de:18 10.100.0.14
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.608 187212 INFO nova.virt.libvirt.driver [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Instance destroyed successfully.
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.608 187212 DEBUG nova.objects.instance [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'resources' on Instance uuid 97020786-7ba5-4c8b-8a2c-838c0f663bb4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00426|if_status|INFO|Dropped 2 log messages in last 12 seconds (most recently, 12 seconds ago) due to excessive rate
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00427|if_status|INFO|Not setting lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 down as sb is readonly
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.611 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.615 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00428|binding|INFO|Releasing lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 from this chassis (sb_readonly=0)
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.630 187212 INFO nova.compute.manager [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Took 1.74 seconds to deallocate network for instance.
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.631 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:de:18 10.100.0.14'], port_security=['fa:16:3e:41:de:18 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '97020786-7ba5-4c8b-8a2c-838c0f663bb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=83fb1d43-a495-47f4-ad3a-569fd7c02c76) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.633 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 83fb1d43-a495-47f4-ad3a-569fd7c02c76 in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 bound to our chassis
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.635 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.638 187212 DEBUG nova.virt.libvirt.vif [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-3',id=49,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-12-05T12:04:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:42Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=97020786-7ba5-4c8b-8a2c-838c0f663bb4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.639 187212 DEBUG nova.network.os_vif_util [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.639 187212 DEBUG nova.network.os_vif_util [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.640 187212 DEBUG os_vif [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.641 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83fb1d43-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.642 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:de:18 10.100.0.14'], port_security=['fa:16:3e:41:de:18 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '97020786-7ba5-4c8b-8a2c-838c0f663bb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=83fb1d43-a495-47f4-ad3a-569fd7c02c76) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.644 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.646 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.647 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[90205ae5-2662-42fc-b20b-f300ab68e75f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.648 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2dd8ae79-a1 in ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.648 187212 INFO os_vif [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4')
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.648 187212 INFO nova.virt.libvirt.driver [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Deleting instance files /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4_del
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.649 187212 INFO nova.virt.libvirt.driver [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Deletion of /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4_del complete
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.650 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2dd8ae79-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.650 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8361833-bb3e-4194-b1e9-d407e13f7bd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.651 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[638471e4-fea4-4787-95ab-67363fbba53d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.664 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[d5eebad3-d02d-4470-93bf-0c9c3ad8b69a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.690 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72b22825-6350-42a9-af2e-26f5e545642a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.729 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f81e0bc4-8f62-4415-9d76-09d1daff633e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.737 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f316c24-d920-40b5-804d-f9a81450050a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 NetworkManager[55691]: <info>  [1764936305.7381] manager: (tap2dd8ae79-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/173)
Dec 05 12:05:05 compute-0 systemd-udevd[224611]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.741 187212 INFO nova.compute.manager [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Took 0.61 seconds to destroy the instance on the hypervisor.
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.741 187212 DEBUG oslo.service.loopingcall [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.742 187212 DEBUG nova.compute.manager [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.742 187212 DEBUG nova.network.neutron [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.769 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6715e4db-1708-465e-9b61-1ef590526be0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.773 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b25083d7-65b4-432c-b804-d3c321eb2d62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 NetworkManager[55691]: <info>  [1764936305.7998] device (tap2dd8ae79-a0): carrier: link connected
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.807 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9f56a63f-2003-4542-84fa-21964003bdd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.827 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2e72bf-450b-48da-a3c8-d78cd91a81b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368636, 'reachable_time': 36434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224712, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.843 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e17db419-cc4e-4f16-a726-fd31eb54e938]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:dbed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368636, 'tstamp': 368636}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224713, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.861 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6ced642e-c5e8-4979-97bb-bf4f397b7532]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368636, 'reachable_time': 36434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224714, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.889 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[20d79d5e-ac85-4228-b085-c10efc036462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.912 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.912 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.950 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[758e17a1-1ae3-439d-ac19-4162485db29d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.952 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.952 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.952 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dd8ae79-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.954 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 NetworkManager[55691]: <info>  [1764936305.9549] manager: (tap2dd8ae79-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Dec 05 12:05:05 compute-0 kernel: tap2dd8ae79-a0: entered promiscuous mode
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.958 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2dd8ae79-a0, col_values=(('external_ids', {'iface-id': '96c6c9a6-c871-4fab-9fdc-eedbdd230979'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.959 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 ovn_controller[95610]: 2025-12-05T12:05:05Z|00429|binding|INFO|Releasing lport 96c6c9a6-c871-4fab-9fdc-eedbdd230979 from this chassis (sb_readonly=0)
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.967 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-unplugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.967 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] No waiting events found dispatching network-vif-unplugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-unplugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] No waiting events found dispatching network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 WARNING nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received unexpected event network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a for instance with vm_state active and task_state deleting.
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.972 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.973 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e00b754e-8e98-4ad7-82a4-3c83203a9a0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.973 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.pid.haproxy
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:05:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.974 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'env', 'PROCESS_TAG=haproxy-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:05:05 compute-0 nova_compute[187208]: 2025-12-05 12:05:05.975 187212 DEBUG nova.network.neutron [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.020 187212 INFO nova.compute.manager [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Took 1.40 seconds to deallocate network for instance.
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.045 187212 DEBUG nova.compute.provider_tree [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.066 187212 DEBUG nova.scheduler.client.report [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.106 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.107 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.109 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.148 187212 INFO nova.scheduler.client.report [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Deleted allocations for instance 21873f07-a1da-4158-a5b2-1d44d547874e
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.242 187212 DEBUG nova.compute.provider_tree [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.271 187212 DEBUG nova.scheduler.client.report [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.296 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:06 compute-0 podman[224745]: 2025-12-05 12:05:06.33599251 +0000 UTC m=+0.049860754 container create b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.372 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:06 compute-0 systemd[1]: Started libpod-conmon-b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c.scope.
Dec 05 12:05:06 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:05:06 compute-0 podman[224745]: 2025-12-05 12:05:06.310124727 +0000 UTC m=+0.023992991 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:05:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dceecec4caf6be812f54c3c703b049ec952977d6772c98d729786d6ad75fed0e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.410 187212 INFO nova.scheduler.client.report [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Deleted allocations for instance c6a957dd-2181-4e92-9e06-e1a15fe5c307
Dec 05 12:05:06 compute-0 podman[224745]: 2025-12-05 12:05:06.42399177 +0000 UTC m=+0.137860034 container init b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:05:06 compute-0 podman[224745]: 2025-12-05 12:05:06.429842898 +0000 UTC m=+0.143711142 container start b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:05:06 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [NOTICE]   (224764) : New worker (224766) forked
Dec 05 12:05:06 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [NOTICE]   (224764) : Loading success.
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.486 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 83fb1d43-a495-47f4-ad3a-569fd7c02c76 in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 unbound from our chassis
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.488 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.489 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[817cd543-33c9-4262-8ff4-5cee08905a94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.489 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 namespace which is not needed anymore
Dec 05 12:05:06 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [NOTICE]   (224764) : haproxy version is 2.8.14-c23fe91
Dec 05 12:05:06 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [NOTICE]   (224764) : path to executable is /usr/sbin/haproxy
Dec 05 12:05:06 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [WARNING]  (224764) : Exiting Master process...
Dec 05 12:05:06 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [WARNING]  (224764) : Exiting Master process...
Dec 05 12:05:06 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [ALERT]    (224764) : Current worker (224766) exited with code 143 (Terminated)
Dec 05 12:05:06 compute-0 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [WARNING]  (224764) : All workers exited. Exiting... (0)
Dec 05 12:05:06 compute-0 systemd[1]: libpod-b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c.scope: Deactivated successfully.
Dec 05 12:05:06 compute-0 podman[224791]: 2025-12-05 12:05:06.616289148 +0000 UTC m=+0.042851082 container died b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:05:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c-userdata-shm.mount: Deactivated successfully.
Dec 05 12:05:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-dceecec4caf6be812f54c3c703b049ec952977d6772c98d729786d6ad75fed0e-merged.mount: Deactivated successfully.
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.644 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.646 187212 DEBUG nova.network.neutron [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:06 compute-0 podman[224791]: 2025-12-05 12:05:06.649245536 +0000 UTC m=+0.075807460 container cleanup b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.664 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Successfully updated port: db2c3297-b6c8-4933-9328-102d81d6faa3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:05:06 compute-0 systemd[1]: libpod-conmon-b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c.scope: Deactivated successfully.
Dec 05 12:05:06 compute-0 podman[224821]: 2025-12-05 12:05:06.724048026 +0000 UTC m=+0.054487477 container remove b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.729 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7a28d7-ba7b-4e73-9bf4-b9f4236a9663]: (4, ('Fri Dec  5 12:05:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 (b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c)\nb8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c\nFri Dec  5 12:05:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 (b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c)\nb8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.730 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a78010-feaa-4d40-80f0-0bff5dcd08bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.731 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.733 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:06 compute-0 kernel: tap2dd8ae79-a0: left promiscuous mode
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.745 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.747 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b706710b-e16a-462b-867d-9c18024190b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.763 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[611f9dfa-0137-41d5-a522-cf3c59a637c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.764 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[145761e0-a244-4c26-acfd-3368ffb8fe1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.780 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6085efae-e50f-418b-82de-bac89391b6d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368628, 'reachable_time': 29011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224837, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d2dd8ae79\x2da0f0\x2d469c\x2d86de\x2da9a5d5b69f75.mount: Deactivated successfully.
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.782 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:05:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.782 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0f90b6-fdcc-490b-89ab-89e6dee24ad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.974 187212 INFO nova.compute.manager [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Took 1.23 seconds to deallocate network for instance.
Dec 05 12:05:06 compute-0 nova_compute[187208]: 2025-12-05 12:05:06.980 187212 DEBUG nova.compute.manager [req-4f0cf994-749d-46ea-9cc9-6c6b66f13b9b req-5197b3f9-2d88-4b36-85dc-43c6e7ad9a26 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received event network-vif-deleted-c4a66ea2-9b1b-486a-a750-17072882c42e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.027 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.027 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.027 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.084 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.085 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.167 187212 DEBUG nova.compute.provider_tree [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.183 187212 DEBUG nova.scheduler.client.report [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.209 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.233 187212 INFO nova.scheduler.client.report [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Deleted allocations for instance 97020786-7ba5-4c8b-8a2c-838c0f663bb4
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.311 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.342 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:07 compute-0 nova_compute[187208]: 2025-12-05 12:05:07.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.486 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936293.4859166, 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.487 187212 INFO nova.compute.manager [-] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] VM Stopped (Lifecycle Event)
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.511 187212 DEBUG nova.compute.manager [None req-d92b68d0-a353-4b2c-9a71-1a2f0f28f0f7 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.513 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-deleted-b7157ade-85e0-4802-8d6a-0dfb86921b3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received event network-vif-unplugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] No waiting events found dispatching network-vif-unplugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 WARNING nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received unexpected event network-vif-unplugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 for instance with vm_state deleted and task_state None.
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received event network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.516 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] No waiting events found dispatching network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.516 187212 WARNING nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received unexpected event network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 for instance with vm_state deleted and task_state None.
Dec 05 12:05:08 compute-0 nova_compute[187208]: 2025-12-05 12:05:08.516 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received event network-vif-deleted-83fb1d43-a495-47f4-ad3a-569fd7c02c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.110 187212 DEBUG nova.compute.manager [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received event network-changed-db2c3297-b6c8-4933-9328-102d81d6faa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.110 187212 DEBUG nova.compute.manager [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Refreshing instance network info cache due to event network-changed-db2c3297-b6c8-4933-9328-102d81d6faa3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.110 187212 DEBUG oslo_concurrency.lockutils [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.457 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Updating instance_info_cache with network_info: [{"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.474 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.474 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance network_info: |[{"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.474 187212 DEBUG oslo_concurrency.lockutils [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.475 187212 DEBUG nova.network.neutron [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Refreshing network info cache for port db2c3297-b6c8-4933-9328-102d81d6faa3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.477 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start _get_guest_xml network_info=[{"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.482 187212 WARNING nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.489 187212 DEBUG nova.virt.libvirt.host [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.490 187212 DEBUG nova.virt.libvirt.host [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.501 187212 DEBUG nova.virt.libvirt.host [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.503 187212 DEBUG nova.virt.libvirt.host [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.503 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.504 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.504 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.504 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.505 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.505 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.505 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.505 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.506 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.506 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.506 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.506 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.509 187212 DEBUG nova.virt.libvirt.vif [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-465631494',display_name='tempest-ImagesTestJSON-server-465631494',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-465631494',id=53,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-j61263vr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList
,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:03Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=ed7b6780-872e-41ef-a0c7-c48d0d6d13fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.510 187212 DEBUG nova.network.os_vif_util [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.510 187212 DEBUG nova.network.os_vif_util [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.511 187212 DEBUG nova.objects.instance [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid ed7b6780-872e-41ef-a0c7-c48d0d6d13fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.528 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <uuid>ed7b6780-872e-41ef-a0c7-c48d0d6d13fd</uuid>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <name>instance-00000035</name>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <nova:name>tempest-ImagesTestJSON-server-465631494</nova:name>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:05:09</nova:creationTime>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:05:09 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:05:09 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:05:09 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:05:09 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:05:09 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:05:09 compute-0 nova_compute[187208]:         <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec 05 12:05:09 compute-0 nova_compute[187208]:         <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:05:09 compute-0 nova_compute[187208]:         <nova:port uuid="db2c3297-b6c8-4933-9328-102d81d6faa3">
Dec 05 12:05:09 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <system>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <entry name="serial">ed7b6780-872e-41ef-a0c7-c48d0d6d13fd</entry>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <entry name="uuid">ed7b6780-872e-41ef-a0c7-c48d0d6d13fd</entry>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     </system>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <os>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   </os>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <features>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   </features>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.config"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:66:5d:24"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <target dev="tapdb2c3297-b6"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/console.log" append="off"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <video>
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     </video>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:05:09 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:05:09 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:05:09 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:05:09 compute-0 nova_compute[187208]: </domain>
Dec 05 12:05:09 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.529 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Preparing to wait for external event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.529 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.529 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.529 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.530 187212 DEBUG nova.virt.libvirt.vif [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-465631494',display_name='tempest-ImagesTestJSON-server-465631494',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-465631494',id=53,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-j61263vr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:03Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=ed7b6780-872e-41ef-a0c7-c48d0d6d13fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.530 187212 DEBUG nova.network.os_vif_util [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.531 187212 DEBUG nova.network.os_vif_util [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.531 187212 DEBUG os_vif [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.532 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.532 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.534 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.534 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb2c3297-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.534 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdb2c3297-b6, col_values=(('external_ids', {'iface-id': 'db2c3297-b6c8-4933-9328-102d81d6faa3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:5d:24', 'vm-uuid': 'ed7b6780-872e-41ef-a0c7-c48d0d6d13fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.536 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:09 compute-0 NetworkManager[55691]: <info>  [1764936309.5372] manager: (tapdb2c3297-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.543 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.544 187212 INFO os_vif [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6')
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.621 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.622 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.622 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:66:5d:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:05:09 compute-0 nova_compute[187208]: 2025-12-05 12:05:09.622 187212 INFO nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Using config drive
Dec 05 12:05:10 compute-0 nova_compute[187208]: 2025-12-05 12:05:10.031 187212 INFO nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Creating config drive at /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.config
Dec 05 12:05:10 compute-0 nova_compute[187208]: 2025-12-05 12:05:10.040 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8pwfs35a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:10 compute-0 nova_compute[187208]: 2025-12-05 12:05:10.185 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8pwfs35a" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:10 compute-0 kernel: tapdb2c3297-b6: entered promiscuous mode
Dec 05 12:05:10 compute-0 NetworkManager[55691]: <info>  [1764936310.2767] manager: (tapdb2c3297-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/176)
Dec 05 12:05:10 compute-0 nova_compute[187208]: 2025-12-05 12:05:10.277 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:10 compute-0 ovn_controller[95610]: 2025-12-05T12:05:10Z|00430|binding|INFO|Claiming lport db2c3297-b6c8-4933-9328-102d81d6faa3 for this chassis.
Dec 05 12:05:10 compute-0 ovn_controller[95610]: 2025-12-05T12:05:10Z|00431|binding|INFO|db2c3297-b6c8-4933-9328-102d81d6faa3: Claiming fa:16:3e:66:5d:24 10.100.0.5
Dec 05 12:05:10 compute-0 nova_compute[187208]: 2025-12-05 12:05:10.282 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:10 compute-0 systemd-udevd[224864]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:05:10 compute-0 systemd-machined[153543]: New machine qemu-57-instance-00000035.
Dec 05 12:05:10 compute-0 NetworkManager[55691]: <info>  [1764936310.3294] device (tapdb2c3297-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:05:10 compute-0 NetworkManager[55691]: <info>  [1764936310.3309] device (tapdb2c3297-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:05:10 compute-0 nova_compute[187208]: 2025-12-05 12:05:10.332 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:10 compute-0 ovn_controller[95610]: 2025-12-05T12:05:10Z|00432|binding|INFO|Setting lport db2c3297-b6c8-4933-9328-102d81d6faa3 ovn-installed in OVS
Dec 05 12:05:10 compute-0 nova_compute[187208]: 2025-12-05 12:05:10.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:10 compute-0 systemd[1]: Started Virtual Machine qemu-57-instance-00000035.
Dec 05 12:05:10 compute-0 ovn_controller[95610]: 2025-12-05T12:05:10Z|00433|binding|INFO|Setting lport db2c3297-b6c8-4933-9328-102d81d6faa3 up in Southbound
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.473 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:5d:24 10.100.0.5'], port_security=['fa:16:3e:66:5d:24 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ed7b6780-872e-41ef-a0c7-c48d0d6d13fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=db2c3297-b6c8-4933-9328-102d81d6faa3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.475 104471 INFO neutron.agent.ovn.metadata.agent [-] Port db2c3297-b6c8-4933-9328-102d81d6faa3 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.476 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.489 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a12a5d3f-45e5-4252-be33-f33ad9c3c29c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.490 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.495 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.495 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1a3c63-a997-4087-8946-64bda44ea9b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.496 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9d969bfc-46b2-4a31-8e9a-65a1cf853837]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.509 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[dc122b3b-c566-4f08-9833-6580cb025b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.527 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e268cdeb-c045-4c1b-b587-3dcca5aa5bcf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.564 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[16b2fce8-550f-439f-b504-ff77f1d520c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 NetworkManager[55691]: <info>  [1764936310.5787] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/177)
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.578 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7fea2d0c-8402-43db-9866-fa4622cdb27c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 systemd-udevd[224867]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.620 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e30562fb-e075-442f-b018-934761928dce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.623 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1efed765-a75f-4a95-8496-75c499cc4198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 NetworkManager[55691]: <info>  [1764936310.6451] device (tap41b3b495-c0): carrier: link connected
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.651 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f47435-81dd-488f-bdab-2444d0d16bce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.671 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e83e9fa8-4d7e-4652-9493-7cf05e928c5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369120, 'reachable_time': 42678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224901, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.692 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0aebad91-6948-414b-8019-7a2e9480f365]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369120, 'tstamp': 369120}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224902, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.715 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b61cda51-83dd-433c-9a04-a783d847d271]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369120, 'reachable_time': 42678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224903, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.748 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[64271236-41d0-4832-8220-12b92e43e257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.805 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5b6db54a-2b69-43f6-a408-bda1a706ce93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.808 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.808 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.808 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:10 compute-0 kernel: tap41b3b495-c0: entered promiscuous mode
Dec 05 12:05:10 compute-0 nova_compute[187208]: 2025-12-05 12:05:10.810 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:10 compute-0 NetworkManager[55691]: <info>  [1764936310.8129] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.813 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:10 compute-0 ovn_controller[95610]: 2025-12-05T12:05:10Z|00434|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.816 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.820 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[229d862d-be6f-4cb6-b6a8-82e2be878f76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.822 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:05:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.822 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:05:10 compute-0 nova_compute[187208]: 2025-12-05 12:05:10.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:11 compute-0 nova_compute[187208]: 2025-12-05 12:05:11.027 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936311.0264604, ed7b6780-872e-41ef-a0c7-c48d0d6d13fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:11 compute-0 nova_compute[187208]: 2025-12-05 12:05:11.029 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] VM Started (Lifecycle Event)
Dec 05 12:05:11 compute-0 podman[224943]: 2025-12-05 12:05:11.26848067 +0000 UTC m=+0.050141462 container create d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:05:11 compute-0 podman[224935]: 2025-12-05 12:05:11.269799068 +0000 UTC m=+0.055480596 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:05:11 compute-0 systemd[1]: Started libpod-conmon-d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc.scope.
Dec 05 12:05:11 compute-0 podman[224943]: 2025-12-05 12:05:11.241489514 +0000 UTC m=+0.023150336 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:05:11 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:05:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3400c5b4535476a6d33ef667b47e2b8030edbf230d73171381716b0df7dfcda6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:05:11 compute-0 podman[224943]: 2025-12-05 12:05:11.379948235 +0000 UTC m=+0.161609117 container init d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:05:11 compute-0 podman[224943]: 2025-12-05 12:05:11.386219015 +0000 UTC m=+0.167879827 container start d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 05 12:05:11 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [NOTICE]   (224986) : New worker (224988) forked
Dec 05 12:05:11 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [NOTICE]   (224986) : Loading success.
Dec 05 12:05:11 compute-0 nova_compute[187208]: 2025-12-05 12:05:11.469 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:11 compute-0 nova_compute[187208]: 2025-12-05 12:05:11.475 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936311.0283916, ed7b6780-872e-41ef-a0c7-c48d0d6d13fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:11 compute-0 nova_compute[187208]: 2025-12-05 12:05:11.476 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] VM Paused (Lifecycle Event)
Dec 05 12:05:11 compute-0 nova_compute[187208]: 2025-12-05 12:05:11.501 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:11 compute-0 nova_compute[187208]: 2025-12-05 12:05:11.505 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:11 compute-0 nova_compute[187208]: 2025-12-05 12:05:11.527 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.594 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.830 187212 DEBUG nova.compute.manager [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.830 187212 DEBUG oslo_concurrency.lockutils [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.830 187212 DEBUG oslo_concurrency.lockutils [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.831 187212 DEBUG oslo_concurrency.lockutils [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.831 187212 DEBUG nova.compute.manager [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Processing event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.831 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.835 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936312.8350568, ed7b6780-872e-41ef-a0c7-c48d0d6d13fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.835 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] VM Resumed (Lifecycle Event)
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.837 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.839 187212 INFO nova.virt.libvirt.driver [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance spawned successfully.
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.839 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.869 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.875 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.878 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.878 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.879 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.879 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.879 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.880 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.915 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.962 187212 INFO nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Took 9.48 seconds to spawn the instance on the hypervisor.
Dec 05 12:05:12 compute-0 nova_compute[187208]: 2025-12-05 12:05:12.962 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:13 compute-0 nova_compute[187208]: 2025-12-05 12:05:13.053 187212 INFO nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Took 10.04 seconds to build instance.
Dec 05 12:05:13 compute-0 nova_compute[187208]: 2025-12-05 12:05:13.077 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:13 compute-0 nova_compute[187208]: 2025-12-05 12:05:13.635 187212 DEBUG nova.network.neutron [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Updated VIF entry in instance network info cache for port db2c3297-b6c8-4933-9328-102d81d6faa3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:05:13 compute-0 nova_compute[187208]: 2025-12-05 12:05:13.636 187212 DEBUG nova.network.neutron [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Updating instance_info_cache with network_info: [{"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:13 compute-0 nova_compute[187208]: 2025-12-05 12:05:13.653 187212 DEBUG oslo_concurrency.lockutils [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:14 compute-0 nova_compute[187208]: 2025-12-05 12:05:14.275 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936299.274574, e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:14 compute-0 nova_compute[187208]: 2025-12-05 12:05:14.275 187212 INFO nova.compute.manager [-] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] VM Stopped (Lifecycle Event)
Dec 05 12:05:14 compute-0 nova_compute[187208]: 2025-12-05 12:05:14.300 187212 DEBUG nova.compute.manager [None req-d092ffa6-6746-410d-8a42-528440b51758 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:14 compute-0 nova_compute[187208]: 2025-12-05 12:05:14.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:14 compute-0 ovn_controller[95610]: 2025-12-05T12:05:14Z|00435|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec 05 12:05:14 compute-0 nova_compute[187208]: 2025-12-05 12:05:14.955 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.002 187212 DEBUG nova.compute.manager [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.003 187212 DEBUG oslo_concurrency.lockutils [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.003 187212 DEBUG oslo_concurrency.lockutils [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.004 187212 DEBUG oslo_concurrency.lockutils [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.004 187212 DEBUG nova.compute.manager [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] No waiting events found dispatching network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.005 187212 WARNING nova.compute.manager [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received unexpected event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 for instance with vm_state active and task_state None.
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.096 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936300.0949879, 21873f07-a1da-4158-a5b2-1d44d547874e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.097 187212 INFO nova.compute.manager [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] VM Stopped (Lifecycle Event)
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.143 187212 DEBUG nova.compute.manager [None req-7d932572-a523-4c35-bd22-7a9240366515 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.206 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "f50947f2-f8d0-4d6b-bca4-b5412a206503" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.206 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.207 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "f50947f2-f8d0-4d6b-bca4-b5412a206503-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.207 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.207 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.208 187212 INFO nova.compute.manager [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Terminating instance
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.209 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "refresh_cache-f50947f2-f8d0-4d6b-bca4-b5412a206503" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.209 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquired lock "refresh_cache-f50947f2-f8d0-4d6b-bca4-b5412a206503" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.209 187212 DEBUG nova.network.neutron [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.449 187212 DEBUG nova.network.neutron [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.808 187212 DEBUG nova.compute.manager [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.851 187212 INFO nova.compute.manager [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] instance snapshotting
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.897 187212 DEBUG nova.network.neutron [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.913 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Releasing lock "refresh_cache-f50947f2-f8d0-4d6b-bca4-b5412a206503" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:15 compute-0 nova_compute[187208]: 2025-12-05 12:05:15.913 187212 DEBUG nova.compute.manager [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:05:15 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Dec 05 12:05:15 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000002e.scope: Consumed 15.377s CPU time.
Dec 05 12:05:15 compute-0 systemd-machined[153543]: Machine qemu-50-instance-0000002e terminated.
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.082 187212 INFO nova.virt.libvirt.driver [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Beginning live snapshot process
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.169 187212 INFO nova.virt.libvirt.driver [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Instance destroyed successfully.
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.170 187212 DEBUG nova.objects.instance [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'resources' on Instance uuid f50947f2-f8d0-4d6b-bca4-b5412a206503 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.215 187212 INFO nova.virt.libvirt.driver [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Deleting instance files /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503_del
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.216 187212 INFO nova.virt.libvirt.driver [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Deletion of /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503_del complete
Dec 05 12:05:16 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.400 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.434 187212 INFO nova.compute.manager [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Took 0.52 seconds to destroy the instance on the hypervisor.
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.435 187212 DEBUG oslo.service.loopingcall [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.435 187212 DEBUG nova.compute.manager [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.436 187212 DEBUG nova.network.neutron [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.491 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json -f qcow2" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.492 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.548 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json -f qcow2" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.563 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.621 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.622 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.655 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5.delta 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.656 187212 INFO nova.virt.libvirt.driver [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.708 187212 DEBUG nova.virt.libvirt.guest [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.711 187212 INFO nova.virt.libvirt.driver [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.751 187212 DEBUG nova.privsep.utils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.752 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5.delta /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.960 187212 DEBUG nova.network.neutron [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.975 187212 DEBUG nova.network.neutron [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.989 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5.delta /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5" returned: 0 in 0.237s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.990 187212 INFO nova.virt.libvirt.driver [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Snapshot extracted, beginning image upload
Dec 05 12:05:16 compute-0 nova_compute[187208]: 2025-12-05 12:05:16.994 187212 INFO nova.compute.manager [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Took 0.56 seconds to deallocate network for instance.
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.052 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.052 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.155 187212 DEBUG nova.compute.provider_tree [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.171 187212 DEBUG nova.scheduler.client.report [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.195 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.213 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936302.2112913, 00262d23-bf60-44d9-a775-63ba32adaf96 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.214 187212 INFO nova.compute.manager [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] VM Stopped (Lifecycle Event)
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.233 187212 INFO nova.scheduler.client.report [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Deleted allocations for instance f50947f2-f8d0-4d6b-bca4-b5412a206503
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.238 187212 WARNING nova.compute.manager [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Image not found during snapshot: nova.exception.ImageNotFound: Image f0688666-c4f9-4480-9537-e60553567a2b could not be found.
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.241 187212 DEBUG nova.compute.manager [None req-f3ae35f3-12db-4735-a2ac-a5598e179a02 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.299 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:17 compute-0 nova_compute[187208]: 2025-12-05 12:05:17.596 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.224 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "004672c5-70e2-4940-bc9c-8971d94cc037" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.225 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.225 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "004672c5-70e2-4940-bc9c-8971d94cc037-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.226 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.226 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.227 187212 INFO nova.compute.manager [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Terminating instance
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.227 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "refresh_cache-004672c5-70e2-4940-bc9c-8971d94cc037" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.228 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquired lock "refresh_cache-004672c5-70e2-4940-bc9c-8971d94cc037" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.228 187212 DEBUG nova.network.neutron [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:05:18 compute-0 podman[225033]: 2025-12-05 12:05:18.229194878 +0000 UTC m=+0.079376933 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 12:05:18 compute-0 nova_compute[187208]: 2025-12-05 12:05:18.891 187212 DEBUG nova.network.neutron [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.468 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936304.4675956, c6a957dd-2181-4e92-9e06-e1a15fe5c307 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.469 187212 INFO nova.compute.manager [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] VM Stopped (Lifecycle Event)
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.487 187212 DEBUG nova.compute.manager [None req-4875ea78-5ecf-46e6-b3c7-62f83fdf0a85 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.541 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.800 187212 DEBUG nova.network.neutron [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.818 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Releasing lock "refresh_cache-004672c5-70e2-4940-bc9c-8971d94cc037" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.819 187212 DEBUG nova.compute.manager [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.943 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.944 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.944 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.945 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.945 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.946 187212 INFO nova.compute.manager [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Terminating instance
Dec 05 12:05:19 compute-0 nova_compute[187208]: 2025-12-05 12:05:19.947 187212 DEBUG nova.compute.manager [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:05:19 compute-0 kernel: tapdb2c3297-b6 (unregistering): left promiscuous mode
Dec 05 12:05:19 compute-0 NetworkManager[55691]: <info>  [1764936319.9722] device (tapdb2c3297-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:05:20 compute-0 ovn_controller[95610]: 2025-12-05T12:05:20Z|00436|binding|INFO|Releasing lport db2c3297-b6c8-4933-9328-102d81d6faa3 from this chassis (sb_readonly=0)
Dec 05 12:05:20 compute-0 ovn_controller[95610]: 2025-12-05T12:05:20Z|00437|binding|INFO|Setting lport db2c3297-b6c8-4933-9328-102d81d6faa3 down in Southbound
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:20 compute-0 ovn_controller[95610]: 2025-12-05T12:05:20Z|00438|binding|INFO|Removing iface tapdb2c3297-b6 ovn-installed in OVS
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.018 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.025 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:5d:24 10.100.0.5'], port_security=['fa:16:3e:66:5d:24 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ed7b6780-872e-41ef-a0c7-c48d0d6d13fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=db2c3297-b6c8-4933-9328-102d81d6faa3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.027 104471 INFO neutron.agent.ovn.metadata.agent [-] Port db2c3297-b6c8-4933-9328-102d81d6faa3 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.028 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.028 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:20 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.030 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c98e2b3-77c0-4532-9f82-c59f33b712a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.031 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore
Dec 05 12:05:20 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000002d.scope: Consumed 14.328s CPU time.
Dec 05 12:05:20 compute-0 systemd-machined[153543]: Machine qemu-47-instance-0000002d terminated.
Dec 05 12:05:20 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000035.scope: Deactivated successfully.
Dec 05 12:05:20 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000035.scope: Consumed 7.983s CPU time.
Dec 05 12:05:20 compute-0 systemd-machined[153543]: Machine qemu-57-instance-00000035 terminated.
Dec 05 12:05:20 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [NOTICE]   (224986) : haproxy version is 2.8.14-c23fe91
Dec 05 12:05:20 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [NOTICE]   (224986) : path to executable is /usr/sbin/haproxy
Dec 05 12:05:20 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [WARNING]  (224986) : Exiting Master process...
Dec 05 12:05:20 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [WARNING]  (224986) : Exiting Master process...
Dec 05 12:05:20 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [ALERT]    (224986) : Current worker (224988) exited with code 143 (Terminated)
Dec 05 12:05:20 compute-0 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [WARNING]  (224986) : All workers exited. Exiting... (0)
Dec 05 12:05:20 compute-0 systemd[1]: libpod-d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc.scope: Deactivated successfully.
Dec 05 12:05:20 compute-0 podman[225078]: 2025-12-05 12:05:20.178564679 +0000 UTC m=+0.051979545 container died d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.219 187212 INFO nova.virt.libvirt.driver [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance destroyed successfully.
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.220 187212 DEBUG nova.objects.instance [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid ed7b6780-872e-41ef-a0c7-c48d0d6d13fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc-userdata-shm.mount: Deactivated successfully.
Dec 05 12:05:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-3400c5b4535476a6d33ef667b47e2b8030edbf230d73171381716b0df7dfcda6-merged.mount: Deactivated successfully.
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.234 187212 DEBUG nova.virt.libvirt.vif [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-465631494',display_name='tempest-ImagesTestJSON-server-465631494',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-465631494',id=53,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-j61263vr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:05:17Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=ed7b6780-872e-41ef-a0c7-c48d0d6d13fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.235 187212 DEBUG nova.network.os_vif_util [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:20 compute-0 podman[225078]: 2025-12-05 12:05:20.236132204 +0000 UTC m=+0.109547070 container cleanup d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.236 187212 DEBUG nova.network.os_vif_util [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.236 187212 DEBUG os_vif [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.240 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb2c3297-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.242 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.244 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:20 compute-0 systemd[1]: libpod-conmon-d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc.scope: Deactivated successfully.
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.247 187212 INFO os_vif [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6')
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.247 187212 INFO nova.virt.libvirt.driver [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Deleting instance files /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd_del
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.248 187212 INFO nova.virt.libvirt.driver [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Deletion of /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd_del complete
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.279 187212 INFO nova.virt.libvirt.driver [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance destroyed successfully.
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.280 187212 DEBUG nova.objects.instance [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'resources' on Instance uuid 004672c5-70e2-4940-bc9c-8971d94cc037 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.311 187212 INFO nova.virt.libvirt.driver [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Deleting instance files /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037_del
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.312 187212 INFO nova.virt.libvirt.driver [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Deletion of /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037_del complete
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.320 187212 INFO nova.compute.manager [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Took 0.37 seconds to destroy the instance on the hypervisor.
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.321 187212 DEBUG oslo.service.loopingcall [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.321 187212 DEBUG nova.compute.manager [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.321 187212 DEBUG nova.network.neutron [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:05:20 compute-0 podman[225125]: 2025-12-05 12:05:20.335785579 +0000 UTC m=+0.074686128 container remove d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.341 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[70fba4e0-1792-4ec6-99fc-76e24c20babe]: (4, ('Fri Dec  5 12:05:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc)\nd7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc\nFri Dec  5 12:05:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc)\nd7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.343 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bec9ecb1-8ec8-44a2-93be-7d322917d560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.345 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.347 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:20 compute-0 kernel: tap41b3b495-c0: left promiscuous mode
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.361 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.365 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[450f11f3-60ec-46cf-b9a7-4d4828e653d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.368 187212 INFO nova.compute.manager [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Took 0.55 seconds to destroy the instance on the hypervisor.
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.369 187212 DEBUG oslo.service.loopingcall [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.369 187212 DEBUG nova.compute.manager [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.369 187212 DEBUG nova.network.neutron [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.381 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3599aa-920e-45f3-822f-3cb5806b1c2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.383 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[45973862-0c5b-4f52-9aa5-284367ff499b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.400 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[654cb02d-0d98-4bff-913a-403dec658e18]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369112, 'reachable_time': 20273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225145, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:20 compute-0 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.405 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:05:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.405 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[59525c24-db28-4949-9df6-983a67a38410]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.538 187212 DEBUG nova.network.neutron [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.552 187212 DEBUG nova.network.neutron [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.567 187212 INFO nova.compute.manager [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Took 0.20 seconds to deallocate network for instance.
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.605 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936305.6047473, 97020786-7ba5-4c8b-8a2c-838c0f663bb4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.606 187212 INFO nova.compute.manager [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] VM Stopped (Lifecycle Event)
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.647 187212 DEBUG nova.compute.manager [None req-c06b482d-58d5-4d25-a7cb-b2919a6e66c0 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.657 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.657 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.724 187212 DEBUG nova.compute.provider_tree [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.744 187212 DEBUG nova.scheduler.client.report [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.766 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.795 187212 INFO nova.scheduler.client.report [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Deleted allocations for instance 004672c5-70e2-4940-bc9c-8971d94cc037
Dec 05 12:05:20 compute-0 nova_compute[187208]: 2025-12-05 12:05:20.860 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.509 187212 DEBUG nova.network.neutron [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.525 187212 INFO nova.compute.manager [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Took 2.20 seconds to deallocate network for instance.
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.571 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.572 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.598 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.642 187212 DEBUG nova.compute.provider_tree [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.652 187212 DEBUG nova.compute.manager [req-de87d5d7-dd4e-427b-b89e-d9e9f9be1400 req-02c953b2-433d-4031-b307-19a506231703 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received event network-vif-deleted-db2c3297-b6c8-4933-9328-102d81d6faa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.656 187212 DEBUG nova.scheduler.client.report [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.673 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.699 187212 INFO nova.scheduler.client.report [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance ed7b6780-872e-41ef-a0c7-c48d0d6d13fd
Dec 05 12:05:22 compute-0 nova_compute[187208]: 2025-12-05 12:05:22.757 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:25 compute-0 podman[225146]: 2025-12-05 12:05:25.224832207 +0000 UTC m=+0.073008330 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7)
Dec 05 12:05:25 compute-0 podman[225147]: 2025-12-05 12:05:25.243334679 +0000 UTC m=+0.086172579 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:05:25 compute-0 nova_compute[187208]: 2025-12-05 12:05:25.243 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:27 compute-0 nova_compute[187208]: 2025-12-05 12:05:27.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:30 compute-0 nova_compute[187208]: 2025-12-05 12:05:30.182 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:30 compute-0 nova_compute[187208]: 2025-12-05 12:05:30.245 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:30 compute-0 podman[225188]: 2025-12-05 12:05:30.253937115 +0000 UTC m=+0.052607033 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:05:30 compute-0 podman[225189]: 2025-12-05 12:05:30.276800383 +0000 UTC m=+0.073543436 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:05:30 compute-0 nova_compute[187208]: 2025-12-05 12:05:30.840 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:30 compute-0 nova_compute[187208]: 2025-12-05 12:05:30.841 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:30 compute-0 nova_compute[187208]: 2025-12-05 12:05:30.857 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:05:30 compute-0 nova_compute[187208]: 2025-12-05 12:05:30.981 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:30 compute-0 nova_compute[187208]: 2025-12-05 12:05:30.982 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:30 compute-0 nova_compute[187208]: 2025-12-05 12:05:30.988 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:05:30 compute-0 nova_compute[187208]: 2025-12-05 12:05:30.989 187212 INFO nova.compute.claims [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.167 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936316.1663022, f50947f2-f8d0-4d6b-bca4-b5412a206503 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.167 187212 INFO nova.compute.manager [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] VM Stopped (Lifecycle Event)
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.180 187212 DEBUG nova.compute.provider_tree [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.306 187212 DEBUG nova.compute.manager [None req-7301aa01-26a2-4daf-bc9f-57f43ebe6a4b - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.307 187212 DEBUG nova.scheduler.client.report [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.341 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.342 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.487 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.488 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.533 187212 INFO nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.556 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.676 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.677 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.677 187212 INFO nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating image(s)
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.678 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.678 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.679 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.694 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.764 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.765 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.765 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.775 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.828 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.829 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.851 187212 DEBUG nova.policy [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.863 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.864 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.865 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.924 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.925 187212 DEBUG nova.virt.disk.api [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Checking if we can resize image /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.925 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.979 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.980 187212 DEBUG nova.virt.disk.api [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Cannot resize image /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.980 187212 DEBUG nova.objects.instance [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.997 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.997 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Ensure instance console log exists: /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.998 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.998 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:31 compute-0 nova_compute[187208]: 2025-12-05 12:05:31.998 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:32 compute-0 nova_compute[187208]: 2025-12-05 12:05:32.603 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.049 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.049 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.065 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.121 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.122 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.127 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.128 187212 INFO nova.compute.claims [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.231 187212 DEBUG nova.compute.provider_tree [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.242 187212 DEBUG nova.scheduler.client.report [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.258 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.259 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.298 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.298 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.315 187212 INFO nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.332 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.413 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.414 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.414 187212 INFO nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Creating image(s)
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.415 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.415 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.416 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.427 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.485 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.486 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.487 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.498 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.560 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.561 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.595 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.596 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.597 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.663 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.665 187212 DEBUG nova.virt.disk.api [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Checking if we can resize image /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.665 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.720 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.721 187212 DEBUG nova.virt.disk.api [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Cannot resize image /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.721 187212 DEBUG nova.objects.instance [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'migration_context' on Instance uuid 472c7e2c-bdad-4230-904b-6937ceb872d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.736 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.737 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Ensure instance console log exists: /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.737 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.737 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.738 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:33 compute-0 nova_compute[187208]: 2025-12-05 12:05:33.973 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Successfully created port: 2e9efd6c-740c-405b-b9f0-bd46434070a7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:05:34 compute-0 nova_compute[187208]: 2025-12-05 12:05:34.043 187212 DEBUG nova.policy [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.117 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.118 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.138 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:05:35 compute-0 podman[225266]: 2025-12-05 12:05:35.203198928 +0000 UTC m=+0.055537277 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.212 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Successfully created port: 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.215 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936320.2146416, ed7b6780-872e-41ef-a0c7-c48d0d6d13fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.215 187212 INFO nova.compute.manager [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] VM Stopped (Lifecycle Event)
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.223 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.223 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.230 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.230 187212 INFO nova.compute.claims [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.235 187212 DEBUG nova.compute.manager [None req-99fe8518-2502-4122-8c88-1c5984204fe7 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.246 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.279 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936320.2777467, 004672c5-70e2-4940-bc9c-8971d94cc037 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.279 187212 INFO nova.compute.manager [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] VM Stopped (Lifecycle Event)
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.304 187212 DEBUG nova.compute.manager [None req-abe09de8-7e1e-4102-a1f1-f812a840ab91 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.377 187212 DEBUG nova.compute.provider_tree [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.379 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Successfully updated port: 2e9efd6c-740c-405b-b9f0-bd46434070a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.397 187212 DEBUG nova.scheduler.client.report [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.400 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.400 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.400 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.422 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.423 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.596 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.597 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.617 187212 INFO nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.631 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.640 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.732 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.734 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.734 187212 INFO nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Creating image(s)
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.735 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.735 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.736 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.753 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.820 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.821 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.821 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.833 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.890 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.891 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.924 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.926 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.927 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.986 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.988 187212 DEBUG nova.virt.disk.api [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Checking if we can resize image /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:05:35 compute-0 nova_compute[187208]: 2025-12-05 12:05:35.989 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.048 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.049 187212 DEBUG nova.virt.disk.api [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Cannot resize image /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.050 187212 DEBUG nova.objects.instance [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'migration_context' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.062 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.063 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Ensure instance console log exists: /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.063 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.064 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.064 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.117 187212 DEBUG nova.policy [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.331 187212 DEBUG nova.compute.manager [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.332 187212 DEBUG nova.compute.manager [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing instance network info cache due to event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:05:36 compute-0 nova_compute[187208]: 2025-12-05 12:05:36.333 187212 DEBUG oslo_concurrency.lockutils [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.084 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.084 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.084 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.085 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.419 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Successfully updated port: 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.432 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.432 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.432 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.507 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.536 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.537 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance network_info: |[{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.537 187212 DEBUG oslo_concurrency.lockutils [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.538 187212 DEBUG nova.network.neutron [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.540 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start _get_guest_xml network_info=[{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.547 187212 WARNING nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.551 187212 DEBUG nova.virt.libvirt.host [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.552 187212 DEBUG nova.virt.libvirt.host [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.555 187212 DEBUG nova.virt.libvirt.host [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.555 187212 DEBUG nova.virt.libvirt.host [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.555 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.556 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.556 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.556 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.556 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.561 187212 DEBUG nova.virt.libvirt.vif [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCLIKVsEL0lmma4upWYe8NiCB7ZJxacCmu4vu1RJu3M5/Fu5S7w/HUSIKvvOTrl/9nUJ4pE5tXIAyPQxQDsptmV5i8IinhFeAgIm0GlEBvfbCuuhpWud8F+u8GsIwgaqpQ==',key_name='tempest-keypair-776546213',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.561 187212 DEBUG nova.network.os_vif_util [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.562 187212 DEBUG nova.network.os_vif_util [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.563 187212 DEBUG nova.objects.instance [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.576 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <uuid>24358eea-14fb-4863-a6c4-aadcdb495f54</uuid>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <name>instance-00000036</name>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestOtherB-server-1629320086</nova:name>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:05:37</nova:creationTime>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:05:37 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:05:37 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:05:37 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:05:37 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:05:37 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:05:37 compute-0 nova_compute[187208]:         <nova:user uuid="4ad1281afc874c0ca55d908d3a6e05a8">tempest-ServerActionsTestOtherB-1759520420-project-member</nova:user>
Dec 05 12:05:37 compute-0 nova_compute[187208]:         <nova:project uuid="58cbd93e463049988ccd6d013893e7d6">tempest-ServerActionsTestOtherB-1759520420</nova:project>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:05:37 compute-0 nova_compute[187208]:         <nova:port uuid="2e9efd6c-740c-405b-b9f0-bd46434070a7">
Dec 05 12:05:37 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <system>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <entry name="serial">24358eea-14fb-4863-a6c4-aadcdb495f54</entry>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <entry name="uuid">24358eea-14fb-4863-a6c4-aadcdb495f54</entry>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     </system>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <os>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   </os>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <features>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   </features>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:ab:5e:ef"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <target dev="tap2e9efd6c-74"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/console.log" append="off"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <video>
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     </video>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:05:37 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:05:37 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:05:37 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:05:37 compute-0 nova_compute[187208]: </domain>
Dec 05 12:05:37 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.577 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Preparing to wait for external event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.578 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.578 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.578 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.579 187212 DEBUG nova.virt.libvirt.vif [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCLIKVsEL0lmma4upWYe8NiCB7ZJxacCmu4vu1RJu3M5/Fu5S7w/HUSIKvvOTrl/9nUJ4pE5tXIAyPQxQDsptmV5i8IinhFeAgIm0GlEBvfbCuuhpWud8F+u8GsIwgaqpQ==',key_name='tempest-keypair-776546213',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.579 187212 DEBUG nova.network.os_vif_util [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.579 187212 DEBUG nova.network.os_vif_util [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.580 187212 DEBUG os_vif [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.580 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.581 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.581 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.584 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.584 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e9efd6c-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.584 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2e9efd6c-74, col_values=(('external_ids', {'iface-id': '2e9efd6c-740c-405b-b9f0-bd46434070a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:5e:ef', 'vm-uuid': '24358eea-14fb-4863-a6c4-aadcdb495f54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.586 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:37 compute-0 NetworkManager[55691]: <info>  [1764936337.5871] manager: (tap2e9efd6c-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.588 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.591 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.592 187212 INFO os_vif [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74')
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.603 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.651 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.651 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.651 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No VIF found with MAC fa:16:3e:ab:5e:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.652 187212 INFO nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Using config drive
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.686 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Successfully created port: 549318e9-e629-4e2c-8cbb-3cd263c2bc34 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:05:37 compute-0 nova_compute[187208]: 2025-12-05 12:05:37.826 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.069 187212 INFO nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating config drive at /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.075 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp60k8d0z5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.153 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.153 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.181 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.206 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp60k8d0z5" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:38 compute-0 kernel: tap2e9efd6c-74: entered promiscuous mode
Dec 05 12:05:38 compute-0 NetworkManager[55691]: <info>  [1764936338.2886] manager: (tap2e9efd6c-74): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Dec 05 12:05:38 compute-0 ovn_controller[95610]: 2025-12-05T12:05:38Z|00439|binding|INFO|Claiming lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 for this chassis.
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.290 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:38 compute-0 ovn_controller[95610]: 2025-12-05T12:05:38Z|00440|binding|INFO|2e9efd6c-740c-405b-b9f0-bd46434070a7: Claiming fa:16:3e:ab:5e:ef 10.100.0.5
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.290 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.290 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.300 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.300 187212 INFO nova.compute.claims [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.303 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.305 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.307 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:05:38 compute-0 systemd-udevd[225319]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.319 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[646ed345-13e7-4612-8c40-140cc801e53b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.321 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5c17e5c-21 in ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.323 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5c17e5c-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.324 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7115a235-62ab-4616-815e-8151d0ded9f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.324 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc01977-88b9-482a-905c-0d3d55b1a0cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 NetworkManager[55691]: <info>  [1764936338.3365] device (tap2e9efd6c-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:05:38 compute-0 NetworkManager[55691]: <info>  [1764936338.3377] device (tap2e9efd6c-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.337 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8d01b5bc-88d4-4242-9429-5095d79bcd41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 systemd-machined[153543]: New machine qemu-58-instance-00000036.
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.349 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.352 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3fafc3ef-2ec3-44e0-9333-b418125cf30c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 ovn_controller[95610]: 2025-12-05T12:05:38Z|00441|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 ovn-installed in OVS
Dec 05 12:05:38 compute-0 ovn_controller[95610]: 2025-12-05T12:05:38Z|00442|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 up in Southbound
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:38 compute-0 systemd[1]: Started Virtual Machine qemu-58-instance-00000036.
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.384 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d77cf07f-cff2-41e3-bf75-50638c2b7d8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 NetworkManager[55691]: <info>  [1764936338.3946] manager: (tapb5c17e5c-20): new Veth device (/org/freedesktop/NetworkManager/Devices/181)
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.394 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0331eb-2012-4c8f-b668-2ff0e378b4f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.427 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[729aad2f-cb1a-4cf5-b189-17a17c840b7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.429 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f0bd35-7b2c-489f-b02e-9ae79034ca2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 12:05:38 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 12:05:38 compute-0 NetworkManager[55691]: <info>  [1764936338.4505] device (tapb5c17e5c-20): carrier: link connected
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.456 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[114f970c-ff98-45a3-b51f-d6bb44fdc42b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.460 187212 DEBUG nova.compute.provider_tree [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.471 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6990bb94-e38c-47d3-8b1c-ee8af83b5d4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225357, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.475 187212 DEBUG nova.scheduler.client.report [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.485 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ecde5040-746c-48c5-8d96-3b39ac491f74]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:429f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371901, 'tstamp': 371901}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225358, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.499 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[474a3ebf-e833-4b25-aac4-8e43d1b92af9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225359, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.501 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.502 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.528 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[390ee6c2-a1d9-4da0-91c1-46fa7809f028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.548 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.549 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.569 187212 INFO nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.590 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.590 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[42b2f51f-3aa7-4284-af67-45a2c749ea1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.592 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.592 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.593 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.595 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:38 compute-0 kernel: tapb5c17e5c-20: entered promiscuous mode
Dec 05 12:05:38 compute-0 NetworkManager[55691]: <info>  [1764936338.5958] manager: (tapb5c17e5c-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.600 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:38 compute-0 ovn_controller[95610]: 2025-12-05T12:05:38Z|00443|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.602 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5c17e5c-2b6c-48d3-9992-ac34070e3363.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5c17e5c-2b6c-48d3-9992-ac34070e3363.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.603 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[639d3043-f916-47cc-bf0e-2b0c3f71dbaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.604 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/b5c17e5c-2b6c-48d3-9992-ac34070e3363.pid.haproxy
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:05:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.605 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'env', 'PROCESS_TAG=haproxy-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5c17e5c-2b6c-48d3-9992-ac34070e3363.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.712 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.714 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.715 187212 INFO nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Creating image(s)
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.715 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.715 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.716 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.729 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.788 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.789 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.790 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.802 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.859 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.861 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.894 187212 DEBUG nova.policy [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.901 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.902 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.902 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.968 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.969 187212 DEBUG nova.virt.disk.api [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Checking if we can resize image /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:05:38 compute-0 nova_compute[187208]: 2025-12-05 12:05:38.969 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:38 compute-0 podman[225399]: 2025-12-05 12:05:38.972457568 +0000 UTC m=+0.044974044 container create 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 12:05:39 compute-0 systemd[1]: Started libpod-conmon-65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5.scope.
Dec 05 12:05:39 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.036 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.037 187212 DEBUG nova.virt.disk.api [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Cannot resize image /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.038 187212 DEBUG nova.objects.instance [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'migration_context' on Instance uuid 8888dd78-1c78-4065-8536-9a1096bdf57b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93f32fdd3ea0552c523abfc1a627c1ddf05c35a6f969e26671c37410720e74dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:05:39 compute-0 podman[225399]: 2025-12-05 12:05:38.947001746 +0000 UTC m=+0.019518262 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.053 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.053 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Ensure instance console log exists: /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.054 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.055 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.055 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:39 compute-0 podman[225399]: 2025-12-05 12:05:39.068387585 +0000 UTC m=+0.140904101 container init 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:05:39 compute-0 podman[225399]: 2025-12-05 12:05:39.074441859 +0000 UTC m=+0.146958365 container start 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.086 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:39 compute-0 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [NOTICE]   (225423) : New worker (225425) forked
Dec 05 12:05:39 compute-0 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [NOTICE]   (225423) : Loading success.
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.626 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936339.6255472, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:39 compute-0 nova_compute[187208]: 2025-12-05 12:05:39.627 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Started (Lifecycle Event)
Dec 05 12:05:40 compute-0 nova_compute[187208]: 2025-12-05 12:05:40.075 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:40 compute-0 nova_compute[187208]: 2025-12-05 12:05:40.079 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936339.6265035, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:40 compute-0 nova_compute[187208]: 2025-12-05 12:05:40.079 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Paused (Lifecycle Event)
Dec 05 12:05:40 compute-0 nova_compute[187208]: 2025-12-05 12:05:40.097 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:40 compute-0 nova_compute[187208]: 2025-12-05 12:05:40.101 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:40 compute-0 nova_compute[187208]: 2025-12-05 12:05:40.122 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:40 compute-0 nova_compute[187208]: 2025-12-05 12:05:40.242 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Successfully created port: c5cb68aa-e5c2-48b0-b9c4-e0542120e065 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.114 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.137 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.138 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance network_info: |[{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.141 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start _get_guest_xml network_info=[{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.147 187212 WARNING nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.153 187212 DEBUG nova.virt.libvirt.host [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.154 187212 DEBUG nova.virt.libvirt.host [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.157 187212 DEBUG nova.virt.libvirt.host [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.158 187212 DEBUG nova.virt.libvirt.host [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.158 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.158 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.159 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.159 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.159 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.161 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.161 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.165 187212 DEBUG nova.virt.libvirt.vif [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-292918791',display_name='tempest-FloatingIPsAssociationTestJSON-server-292918791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-292918791',id=55,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-c3vnhg04',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_user_name
='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:33Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=472c7e2c-bdad-4230-904b-6937ceb872d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.165 187212 DEBUG nova.network.os_vif_util [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.166 187212 DEBUG nova.network.os_vif_util [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.166 187212 DEBUG nova.objects.instance [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'pci_devices' on Instance uuid 472c7e2c-bdad-4230-904b-6937ceb872d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.181 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <uuid>472c7e2c-bdad-4230-904b-6937ceb872d2</uuid>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <name>instance-00000037</name>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <nova:name>tempest-FloatingIPsAssociationTestJSON-server-292918791</nova:name>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:05:41</nova:creationTime>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:05:41 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:05:41 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:05:41 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:05:41 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:05:41 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:05:41 compute-0 nova_compute[187208]:         <nova:user uuid="8cf2534e7c394130b675e44ed567401b">tempest-FloatingIPsAssociationTestJSON-883508882-project-member</nova:user>
Dec 05 12:05:41 compute-0 nova_compute[187208]:         <nova:project uuid="85037de7275442698e604ee3f6283cbc">tempest-FloatingIPsAssociationTestJSON-883508882</nova:project>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:05:41 compute-0 nova_compute[187208]:         <nova:port uuid="9357c6a6-eb6f-4ab9-bfd6-486765004ac5">
Dec 05 12:05:41 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <system>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <entry name="serial">472c7e2c-bdad-4230-904b-6937ceb872d2</entry>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <entry name="uuid">472c7e2c-bdad-4230-904b-6937ceb872d2</entry>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     </system>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <os>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   </os>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <features>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   </features>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:08:e8:08"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <target dev="tap9357c6a6-eb"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/console.log" append="off"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <video>
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     </video>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:05:41 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:05:41 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:05:41 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:05:41 compute-0 nova_compute[187208]: </domain>
Dec 05 12:05:41 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.183 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Preparing to wait for external event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.183 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.183 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.184 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.185 187212 DEBUG nova.virt.libvirt.vif [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-292918791',display_name='tempest-FloatingIPsAssociationTestJSON-server-292918791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-292918791',id=55,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-c3vnhg04',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner
_user_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:33Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=472c7e2c-bdad-4230-904b-6937ceb872d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.185 187212 DEBUG nova.network.os_vif_util [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.186 187212 DEBUG nova.network.os_vif_util [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.186 187212 DEBUG os_vif [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.187 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.188 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.188 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.191 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.192 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9357c6a6-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.192 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9357c6a6-eb, col_values=(('external_ids', {'iface-id': '9357c6a6-eb6f-4ab9-bfd6-486765004ac5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:e8:08', 'vm-uuid': '472c7e2c-bdad-4230-904b-6937ceb872d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:41 compute-0 NetworkManager[55691]: <info>  [1764936341.1951] manager: (tap9357c6a6-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.195 187212 DEBUG nova.network.neutron [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated VIF entry in instance network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.196 187212 DEBUG nova.network.neutron [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.199 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.204 187212 INFO os_vif [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb')
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.219 187212 DEBUG oslo_concurrency.lockutils [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:41.233 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.233 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:41.235 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:05:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:41.236 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.266 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.267 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.267 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No VIF found with MAC fa:16:3e:08:e8:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.267 187212 INFO nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Using config drive
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.298 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Successfully updated port: 549318e9-e629-4e2c-8cbb-3cd263c2bc34 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.302 187212 DEBUG nova.compute.manager [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.302 187212 DEBUG nova.compute.manager [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.302 187212 DEBUG oslo_concurrency.lockutils [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.303 187212 DEBUG oslo_concurrency.lockutils [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.303 187212 DEBUG nova.network.neutron [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.321 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.321 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquired lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:41 compute-0 nova_compute[187208]: 2025-12-05 12:05:41.321 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.040 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.054 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.059 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.133 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.134 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.135 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.135 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.204 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:42 compute-0 podman[225444]: 2025-12-05 12:05:42.208260742 +0000 UTC m=+0.060635114 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.266 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.267 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.323 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.329 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.399 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.400 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.424 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Successfully updated port: c5cb68aa-e5c2-48b0-b9c4-e0542120e065 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.459 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.464 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000037, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config'
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.468 187212 INFO nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Creating config drive at /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.473 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqe8e12u7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.600 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqe8e12u7" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.606 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.622 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.622 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquired lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.623 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:05:42 compute-0 kernel: tap9357c6a6-eb: entered promiscuous mode
Dec 05 12:05:42 compute-0 NetworkManager[55691]: <info>  [1764936342.6671] manager: (tap9357c6a6-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Dec 05 12:05:42 compute-0 ovn_controller[95610]: 2025-12-05T12:05:42Z|00444|binding|INFO|Claiming lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 for this chassis.
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.669 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:42 compute-0 ovn_controller[95610]: 2025-12-05T12:05:42Z|00445|binding|INFO|9357c6a6-eb6f-4ab9-bfd6-486765004ac5: Claiming fa:16:3e:08:e8:08 10.100.0.14
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.673 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.682 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:e8:08 10.100.0.14'], port_security=['fa:16:3e:08:e8:08 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f4c4888-4b32-4259-8441-31af091e0c7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85037de7275442698e604ee3f6283cbc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed3fff5f-a24a-492e-ba85-8f010d446cfc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac2e7e6b-9342-46f8-a910-5de5a261f0a9, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9357c6a6-eb6f-4ab9-bfd6-486765004ac5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.683 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 in datapath 0f4c4888-4b32-4259-8441-31af091e0c7d bound to our chassis
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.684 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f4c4888-4b32-4259-8441-31af091e0c7d
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.694 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[44607439-04ab-40da-8182-0a822c12dd74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.695 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0f4c4888-41 in ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:05:42 compute-0 systemd-udevd[225498]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.697 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0f4c4888-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.697 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eda87d39-d940-4b9b-bd28-74db3366a334]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.700 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b8cbbf-fce2-4832-94a2-f9c011fdc411]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 NetworkManager[55691]: <info>  [1764936342.7117] device (tap9357c6a6-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:05:42 compute-0 NetworkManager[55691]: <info>  [1764936342.7130] device (tap9357c6a6-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.712 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[614786d7-edeb-4ac4-a189-f184d35d8fdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 systemd-machined[153543]: New machine qemu-59-instance-00000037.
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.722 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:42 compute-0 ovn_controller[95610]: 2025-12-05T12:05:42Z|00446|binding|INFO|Setting lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 ovn-installed in OVS
Dec 05 12:05:42 compute-0 ovn_controller[95610]: 2025-12-05T12:05:42Z|00447|binding|INFO|Setting lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 up in Southbound
Dec 05 12:05:42 compute-0 systemd[1]: Started Virtual Machine qemu-59-instance-00000037.
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.728 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.728 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[97563db9-4daa-4faa-8375-05cafdcc2adb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.742 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.743 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5600MB free_disk=73.25677871704102GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.744 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.744 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.757 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[08f8ceee-a629-45f1-b07b-e131cda7484d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 systemd-udevd[225503]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:05:42 compute-0 NetworkManager[55691]: <info>  [1764936342.7639] manager: (tap0f4c4888-40): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.763 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[17e8c18e-7079-457c-8c90-f131f5cc13df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.791 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[babb764e-a620-4ee9-8273-df70ef7ecb84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.794 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf58116-1f55-4b49-950e-1350b7e8ca25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 NetworkManager[55691]: <info>  [1764936342.8146] device (tap0f4c4888-40): carrier: link connected
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.819 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[22d93e31-f2bf-477a-a049-2a1f52ec51af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.837 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b4fe484b-b450-4823-b67e-e54346ae2797]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f4c4888-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:45:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372337, 'reachable_time': 25070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225532, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.852 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[85b622f7-8cb6-4d85-8896-4bf1f26c576a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:4563'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372337, 'tstamp': 372337}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225533, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.867 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a09b858-131b-471f-b228-e326d7e007eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f4c4888-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:45:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372337, 'reachable_time': 25070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225534, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.895 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c11d78-cf57-48a4-b9bb-4273a582c580]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.952 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1503caec-a647-4de4-8da2-2a19f8ff0720]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.954 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f4c4888-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.955 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.955 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f4c4888-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:42 compute-0 NetworkManager[55691]: <info>  [1764936342.9582] manager: (tap0f4c4888-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:42 compute-0 kernel: tap0f4c4888-40: entered promiscuous mode
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.961 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f4c4888-40, col_values=(('external_ids', {'iface-id': 'b2e28c8a-557d-459b-807e-dd1f5be0a608'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:42 compute-0 ovn_controller[95610]: 2025-12-05T12:05:42Z|00448|binding|INFO|Releasing lport b2e28c8a-557d-459b-807e-dd1f5be0a608 from this chassis (sb_readonly=0)
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.964 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.965 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0f4c4888-4b32-4259-8441-31af091e0c7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0f4c4888-4b32-4259-8441-31af091e0c7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.975 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[39249698-c960-4310-8ea9-160772819d6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.976 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-0f4c4888-4b32-4259-8441-31af091e0c7d
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/0f4c4888-4b32-4259-8441-31af091e0c7d.pid.haproxy
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 0f4c4888-4b32-4259-8441-31af091e0c7d
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:05:42 compute-0 nova_compute[187208]: 2025-12-05 12:05:42.977 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.979 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'env', 'PROCESS_TAG=haproxy-0f4c4888-4b32-4259-8441-31af091e0c7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0f4c4888-4b32-4259-8441-31af091e0c7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.102 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.192 187212 DEBUG nova.compute.manager [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-changed-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.192 187212 DEBUG nova.compute.manager [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Refreshing instance network info cache due to event network-changed-c5cb68aa-e5c2-48b0-b9c4-e0542120e065. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.192 187212 DEBUG oslo_concurrency.lockutils [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.261 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.261 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 472c7e2c-bdad-4230-904b-6937ceb872d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.262 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.262 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 8888dd78-1c78-4065-8536-9a1096bdf57b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.272 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936343.2724597, 472c7e2c-bdad-4230-904b-6937ceb872d2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.273 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] VM Started (Lifecycle Event)
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.289 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b81bb939-d14f-4a72-b7fe-95fc5d8810a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.289 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.290 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.297 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.302 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936343.2733757, 472c7e2c-bdad-4230-904b-6937ceb872d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.302 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] VM Paused (Lifecycle Event)
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.324 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:43 compute-0 podman[225573]: 2025-12-05 12:05:43.329834134 +0000 UTC m=+0.048125854 container create 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.333 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.333 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.339 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.363 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.364 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:05:43 compute-0 systemd[1]: Started libpod-conmon-96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302.scope.
Dec 05 12:05:43 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:05:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/346d0c910feecbfb16f9369239d1a7161a45d173e7171bca2bd39f251f209cca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:05:43 compute-0 podman[225573]: 2025-12-05 12:05:43.307111761 +0000 UTC m=+0.025403521 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:05:43 compute-0 podman[225573]: 2025-12-05 12:05:43.414979542 +0000 UTC m=+0.133271302 container init 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:05:43 compute-0 podman[225573]: 2025-12-05 12:05:43.419882313 +0000 UTC m=+0.138174033 container start 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.431 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:43 compute-0 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [NOTICE]   (225592) : New worker (225594) forked
Dec 05 12:05:43 compute-0 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [NOTICE]   (225592) : Loading success.
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.443 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.448 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.473 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.474 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.474 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.480 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.480 187212 INFO nova.compute.claims [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.608 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.609 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.609 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.610 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.610 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Processing event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.610 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.611 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.611 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.611 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.612 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.612 187212 WARNING nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state building and task_state spawning.
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.612 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-changed-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.612 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Refreshing instance network info cache due to event network-changed-549318e9-e629-4e2c-8cbb-3cd263c2bc34. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.613 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.616 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.621 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936343.6204307, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.621 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Resumed (Lifecycle Event)
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.624 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.628 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance spawned successfully.
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.629 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.645 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.659 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.660 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.660 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.660 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.661 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.661 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.670 187212 DEBUG nova.compute.provider_tree [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.693 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.695 187212 DEBUG nova.scheduler.client.report [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.734 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.735 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.742 187212 INFO nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 12.07 seconds to spawn the instance on the hypervisor.
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.743 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.810 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.810 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.820 187212 INFO nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 12.87 seconds to build instance.
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.835 187212 INFO nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.840 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.999s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.852 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.950 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.952 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.952 187212 INFO nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Creating image(s)
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.953 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.953 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.954 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:43 compute-0 nova_compute[187208]: 2025-12-05 12:05:43.971 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.036 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updating instance_info_cache with network_info: [{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.041 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.042 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.043 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.055 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.076 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Releasing lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.077 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance network_info: |[{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.078 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.079 187212 DEBUG nova.network.neutron [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Refreshing network info cache for port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.082 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start _get_guest_xml network_info=[{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.086 187212 WARNING nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.091 187212 DEBUG nova.virt.libvirt.host [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.092 187212 DEBUG nova.virt.libvirt.host [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.100 187212 DEBUG nova.virt.libvirt.host [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.101 187212 DEBUG nova.virt.libvirt.host [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.102 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.102 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.103 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.103 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.103 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.103 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.104 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.104 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.104 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.105 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.105 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.105 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.109 187212 DEBUG nova.virt.libvirt.vif [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-L
istServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:35Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.109 187212 DEBUG nova.network.os_vif_util [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.110 187212 DEBUG nova.network.os_vif_util [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.111 187212 DEBUG nova.objects.instance [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'pci_devices' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.115 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.116 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.136 187212 DEBUG nova.policy [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.141 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <uuid>cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</uuid>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <name>instance-00000038</name>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-1365452817</nova:name>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:05:44</nova:creationTime>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:05:44 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:05:44 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:05:44 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:05:44 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:05:44 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:05:44 compute-0 nova_compute[187208]:         <nova:user uuid="4f8149b8192e411a9131b103b25862b6">tempest-ListServerFiltersTestJSON-711798252-project-member</nova:user>
Dec 05 12:05:44 compute-0 nova_compute[187208]:         <nova:project uuid="e8f613c8797e432d96e43223fb7c476d">tempest-ListServerFiltersTestJSON-711798252</nova:project>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:05:44 compute-0 nova_compute[187208]:         <nova:port uuid="549318e9-e629-4e2c-8cbb-3cd263c2bc34">
Dec 05 12:05:44 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <system>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <entry name="serial">cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</entry>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <entry name="uuid">cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</entry>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     </system>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <os>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   </os>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <features>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   </features>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:9b:d7:ed"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <target dev="tap549318e9-e6"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/console.log" append="off"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <video>
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     </video>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:05:44 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:05:44 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:05:44 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:05:44 compute-0 nova_compute[187208]: </domain>
Dec 05 12:05:44 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.142 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Preparing to wait for external event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.143 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.143 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.143 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.144 187212 DEBUG nova.virt.libvirt.vif [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name=
'tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:35Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.144 187212 DEBUG nova.network.os_vif_util [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.145 187212 DEBUG nova.network.os_vif_util [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.145 187212 DEBUG os_vif [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.146 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.146 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.146 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.149 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.149 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap549318e9-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.149 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap549318e9-e6, col_values=(('external_ids', {'iface-id': '549318e9-e629-4e2c-8cbb-3cd263c2bc34', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:d7:ed', 'vm-uuid': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.151 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.152 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:44 compute-0 NetworkManager[55691]: <info>  [1764936344.1524] manager: (tap549318e9-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.152 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.170 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.174 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.175 187212 INFO os_vif [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6')
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.216 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.217 187212 DEBUG nova.virt.disk.api [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Checking if we can resize image /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.217 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.263 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.264 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.264 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No VIF found with MAC fa:16:3e:9b:d7:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.265 187212 INFO nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Using config drive
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.291 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.292 187212 DEBUG nova.virt.disk.api [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Cannot resize image /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.293 187212 DEBUG nova.objects.instance [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'migration_context' on Instance uuid b81bb939-d14f-4a72-b7fe-95fc5d8810a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.306 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.306 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Ensure instance console log exists: /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.306 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.307 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.307 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.475 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.732 187212 DEBUG nova.network.neutron [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.733 187212 DEBUG nova.network.neutron [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.749 187212 DEBUG oslo_concurrency.lockutils [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.963 187212 INFO nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Creating config drive at /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config
Dec 05 12:05:44 compute-0 nova_compute[187208]: 2025-12-05 12:05:44.967 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmbmcyo8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.094 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmbmcyo8" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:45 compute-0 NetworkManager[55691]: <info>  [1764936345.1535] manager: (tap549318e9-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Dec 05 12:05:45 compute-0 kernel: tap549318e9-e6: entered promiscuous mode
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.156 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 NetworkManager[55691]: <info>  [1764936345.1690] device (tap549318e9-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:05:45 compute-0 NetworkManager[55691]: <info>  [1764936345.1701] device (tap549318e9-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:05:45 compute-0 ovn_controller[95610]: 2025-12-05T12:05:45Z|00449|binding|INFO|Claiming lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 for this chassis.
Dec 05 12:05:45 compute-0 ovn_controller[95610]: 2025-12-05T12:05:45Z|00450|binding|INFO|549318e9-e629-4e2c-8cbb-3cd263c2bc34: Claiming fa:16:3e:9b:d7:ed 10.100.0.9
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.173 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.174 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.176 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.195 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[550a2b16-f310-4e90-9e26-abd43a4f4f54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.196 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4a2d11fe-a1 in ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.198 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4a2d11fe-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.198 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa92e7be-9178-408b-92db-7f19fa404d78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.200 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92a7bb49-8d73-47a7-a0db-be6e4ab1e32b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.207 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 ovn_controller[95610]: 2025-12-05T12:05:45Z|00451|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 ovn-installed in OVS
Dec 05 12:05:45 compute-0 ovn_controller[95610]: 2025-12-05T12:05:45Z|00452|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 up in Southbound
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.214 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[1214fad4-df87-4eb1-a2e7-21c703e59a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 systemd-machined[153543]: New machine qemu-60-instance-00000038.
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.228 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Updating instance_info_cache with network_info: [{"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:45 compute-0 systemd[1]: Started Virtual Machine qemu-60-instance-00000038.
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.235 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4ed83f89-63b9-444e-b3a3-c48c210e59d6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.252 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Releasing lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.252 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance network_info: |[{"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.253 187212 DEBUG oslo_concurrency.lockutils [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.253 187212 DEBUG nova.network.neutron [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Refreshing network info cache for port c5cb68aa-e5c2-48b0-b9c4-e0542120e065 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.258 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start _get_guest_xml network_info=[{"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': '6e277715-617f-4e35-89c7-208beae9fd5c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.267 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a9fecbcc-0c07-49dd-9af0-1eaeae99f6ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.273 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3d1bef-860f-4a91-9046-11cdb6dae9a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 NetworkManager[55691]: <info>  [1764936345.2774] manager: (tap4a2d11fe-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/189)
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.280 187212 WARNING nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.291 187212 DEBUG nova.virt.libvirt.host [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.292 187212 DEBUG nova.virt.libvirt.host [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.296 187212 DEBUG nova.virt.libvirt.host [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.297 187212 DEBUG nova.virt.libvirt.host [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.297 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.298 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.298 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.298 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.298 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.300 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.300 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.304 187212 DEBUG nova.virt.libvirt.vif [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-2001854085',display_name='tempest-ListServerFiltersTestJSON-instance-2001854085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-2001854085',id=57,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-ubyu8olf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:38Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=8888dd78-1c78-4065-8536-9a1096bdf57b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.305 187212 DEBUG nova.network.os_vif_util [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.306 187212 DEBUG nova.network.os_vif_util [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.306 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca72d05-486a-4969-8f51-78aada80563d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.307 187212 DEBUG nova.objects.instance [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'pci_devices' on Instance uuid 8888dd78-1c78-4065-8536-9a1096bdf57b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.309 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef962ea-f7b9-459a-9d36-ba8732dcda33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.322 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <uuid>8888dd78-1c78-4065-8536-9a1096bdf57b</uuid>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <name>instance-00000039</name>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-2001854085</nova:name>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:05:45</nova:creationTime>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:05:45 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:05:45 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:05:45 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:05:45 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:05:45 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:05:45 compute-0 nova_compute[187208]:         <nova:user uuid="4f8149b8192e411a9131b103b25862b6">tempest-ListServerFiltersTestJSON-711798252-project-member</nova:user>
Dec 05 12:05:45 compute-0 nova_compute[187208]:         <nova:project uuid="e8f613c8797e432d96e43223fb7c476d">tempest-ListServerFiltersTestJSON-711798252</nova:project>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:05:45 compute-0 nova_compute[187208]:         <nova:port uuid="c5cb68aa-e5c2-48b0-b9c4-e0542120e065">
Dec 05 12:05:45 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <system>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <entry name="serial">8888dd78-1c78-4065-8536-9a1096bdf57b</entry>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <entry name="uuid">8888dd78-1c78-4065-8536-9a1096bdf57b</entry>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     </system>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <os>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   </os>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <features>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   </features>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.config"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:8a:a8:16"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <target dev="tapc5cb68aa-e5"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/console.log" append="off"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <video>
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     </video>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:05:45 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:05:45 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:05:45 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:05:45 compute-0 nova_compute[187208]: </domain>
Dec 05 12:05:45 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.324 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Preparing to wait for external event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.324 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.325 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.325 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.326 187212 DEBUG nova.virt.libvirt.vif [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-2001854085',display_name='tempest-ListServerFiltersTestJSON-instance-2001854085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-2001854085',id=57,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-ubyu8olf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name=
'tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:38Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=8888dd78-1c78-4065-8536-9a1096bdf57b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.326 187212 DEBUG nova.network.os_vif_util [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.327 187212 DEBUG nova.network.os_vif_util [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.327 187212 DEBUG os_vif [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.328 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.328 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.328 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.331 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5cb68aa-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.332 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc5cb68aa-e5, col_values=(('external_ids', {'iface-id': 'c5cb68aa-e5c2-48b0-b9c4-e0542120e065', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:a8:16', 'vm-uuid': '8888dd78-1c78-4065-8536-9a1096bdf57b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 NetworkManager[55691]: <info>  [1764936345.3346] manager: (tapc5cb68aa-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:05:45 compute-0 NetworkManager[55691]: <info>  [1764936345.3410] device (tap4a2d11fe-a0): carrier: link connected
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.343 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.344 187212 INFO os_vif [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5')
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.347 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7e51955f-08ae-4ba1-8fde-97da5301b660]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.379 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a5267b2a-ca4d-4c8c-8a6d-a6d0603526d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225658, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.396 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c7cb2f57-127a-4912-b6ef-780c0398530b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:9456'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372590, 'tstamp': 372590}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225659, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.421 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43ce2aed-a120-403f-89af-2375816f71ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225660, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.448 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1486da8e-4c44-48df-8d0e-2bdd1bf96624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.495 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9e54bdfa-965f-4cb4-92c9-e158a6105f70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.496 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.496 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.497 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:45 compute-0 NetworkManager[55691]: <info>  [1764936345.4996] manager: (tap4a2d11fe-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.499 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 kernel: tap4a2d11fe-a0: entered promiscuous mode
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.503 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.504 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.505 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 ovn_controller[95610]: 2025-12-05T12:05:45Z|00453|binding|INFO|Releasing lport 27f6a3c0-dd69-4255-8d00-850605f3016e from this chassis (sb_readonly=0)
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.521 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.522 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4a2d11fe-a91d-4cf5-bde7-283f0aa52f63.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4a2d11fe-a91d-4cf5-bde7-283f0aa52f63.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.523 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c3fdf038-604b-47c2-ba2f-cc52814316c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.523 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/4a2d11fe-a91d-4cf5-bde7-283f0aa52f63.pid.haproxy
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:05:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.525 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'env', 'PROCESS_TAG=haproxy-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4a2d11fe-a91d-4cf5-bde7-283f0aa52f63.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.608 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.609 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.609 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No VIF found with MAC fa:16:3e:8a:a8:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.610 187212 INFO nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Using config drive
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.647 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936345.6473682, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:45 compute-0 nova_compute[187208]: 2025-12-05 12:05:45.648 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Started (Lifecycle Event)
Dec 05 12:05:45 compute-0 podman[225704]: 2025-12-05 12:05:45.878788603 +0000 UTC m=+0.047584219 container create 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:05:45 compute-0 systemd[1]: Started libpod-conmon-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b.scope.
Dec 05 12:05:45 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:05:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0675ec35d20a48a0477b2dc90980940bf1de49a39a39196259a264090cabf69/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:05:45 compute-0 podman[225704]: 2025-12-05 12:05:45.853241718 +0000 UTC m=+0.022037354 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:05:45 compute-0 podman[225704]: 2025-12-05 12:05:45.950846364 +0000 UTC m=+0.119641990 container init 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 05 12:05:45 compute-0 podman[225704]: 2025-12-05 12:05:45.955655383 +0000 UTC m=+0.124450999 container start 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:05:45 compute-0 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [NOTICE]   (225723) : New worker (225725) forked
Dec 05 12:05:45 compute-0 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [NOTICE]   (225723) : Loading success.
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.099 187212 DEBUG nova.compute.manager [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.100 187212 DEBUG oslo_concurrency.lockutils [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.100 187212 DEBUG oslo_concurrency.lockutils [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.100 187212 DEBUG oslo_concurrency.lockutils [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.100 187212 DEBUG nova.compute.manager [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Processing event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.101 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.106 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.109 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.112 187212 INFO nova.virt.libvirt.driver [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance spawned successfully.
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.113 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936345.647611, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.113 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Paused (Lifecycle Event)
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.114 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.144 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.148 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.198 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.198 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936346.1057503, 472c7e2c-bdad-4230-904b-6937ceb872d2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.198 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] VM Resumed (Lifecycle Event)
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.206 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.208 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.209 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.209 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.209 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.210 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.390 187212 DEBUG nova.compute.manager [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.391 187212 DEBUG oslo_concurrency.lockutils [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.391 187212 DEBUG oslo_concurrency.lockutils [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.391 187212 DEBUG oslo_concurrency.lockutils [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.392 187212 DEBUG nova.compute.manager [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Processing event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.393 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.412 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.419 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.423 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance spawned successfully.
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.424 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.427 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.519 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.520 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.521 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.521 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.522 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.523 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.529 187212 INFO nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Took 13.12 seconds to spawn the instance on the hypervisor.
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.529 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.533 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.533 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936346.3967624, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.533 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Resumed (Lifecycle Event)
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.547 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Successfully created port: 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.661 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.908 187212 INFO nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Took 13.80 seconds to build instance.
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.960 187212 INFO nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Took 11.23 seconds to spawn the instance on the hypervisor.
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.960 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:46 compute-0 nova_compute[187208]: 2025-12-05 12:05:46.961 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.134 187212 INFO nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Creating config drive at /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.config
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.140 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo7qit4xa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.258 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.268 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo7qit4xa" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.326 187212 INFO nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Took 12.12 seconds to build instance.
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.340 187212 DEBUG nova.network.neutron [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updated VIF entry in instance network info cache for port 549318e9-e629-4e2c-8cbb-3cd263c2bc34. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.340 187212 DEBUG nova.network.neutron [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updating instance_info_cache with network_info: [{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:47 compute-0 NetworkManager[55691]: <info>  [1764936347.3599] manager: (tapc5cb68aa-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Dec 05 12:05:47 compute-0 kernel: tapc5cb68aa-e5: entered promiscuous mode
Dec 05 12:05:47 compute-0 ovn_controller[95610]: 2025-12-05T12:05:47Z|00454|binding|INFO|Claiming lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 for this chassis.
Dec 05 12:05:47 compute-0 ovn_controller[95610]: 2025-12-05T12:05:47Z|00455|binding|INFO|c5cb68aa-e5c2-48b0-b9c4-e0542120e065: Claiming fa:16:3e:8a:a8:16 10.100.0.13
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.366 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.386 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:47 compute-0 ovn_controller[95610]: 2025-12-05T12:05:47Z|00456|binding|INFO|Setting lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 ovn-installed in OVS
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.400 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:47 compute-0 systemd-udevd[225753]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:05:47 compute-0 systemd-machined[153543]: New machine qemu-61-instance-00000039.
Dec 05 12:05:47 compute-0 NetworkManager[55691]: <info>  [1764936347.4188] device (tapc5cb68aa-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:05:47 compute-0 NetworkManager[55691]: <info>  [1764936347.4196] device (tapc5cb68aa-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:05:47 compute-0 systemd[1]: Started Virtual Machine qemu-61-instance-00000039.
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.434 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:a8:16 10.100.0.13'], port_security=['fa:16:3e:8a:a8:16 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c5cb68aa-e5c2-48b0-b9c4-e0542120e065) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.435 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c5cb68aa-e5c2-48b0-b9c4-e0542120e065 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis
Dec 05 12:05:47 compute-0 ovn_controller[95610]: 2025-12-05T12:05:47Z|00457|binding|INFO|Setting lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 up in Southbound
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.437 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:05:47 compute-0 NetworkManager[55691]: <info>  [1764936347.4479] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.447 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:47 compute-0 NetworkManager[55691]: <info>  [1764936347.4487] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.452 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.453 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.453 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.453 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b26359f0-bb73-4963-b318-b296cc54f55e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.453 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.454 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.454 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] No waiting events found dispatching network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.455 187212 WARNING nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received unexpected event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 for instance with vm_state building and task_state spawning.
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.456 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.484 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[67a22f9b-5a4e-4a90-ac61-69de9cd64684]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.487 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a071a209-75fa-4fd9-85de-86fc5a693deb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.512 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4128689e-8845-4dcc-911a-f7001b6e3779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.532 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd9cf6d-f351-4cea-9d3b-8701d1965570]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225768, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.547 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e371ca39-2be8-478f-a1ee-1ae6880baa2f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225769, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225769, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.549 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.551 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.553 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.553 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.606 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.626 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:47 compute-0 ovn_controller[95610]: 2025-12-05T12:05:47Z|00458|binding|INFO|Releasing lport b2e28c8a-557d-459b-807e-dd1f5be0a608 from this chassis (sb_readonly=0)
Dec 05 12:05:47 compute-0 ovn_controller[95610]: 2025-12-05T12:05:47Z|00459|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:05:47 compute-0 ovn_controller[95610]: 2025-12-05T12:05:47Z|00460|binding|INFO|Releasing lport 27f6a3c0-dd69-4255-8d00-850605f3016e from this chassis (sb_readonly=0)
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.749 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936347.7358027, 8888dd78-1c78-4065-8536-9a1096bdf57b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.749 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] VM Started (Lifecycle Event)
Dec 05 12:05:47 compute-0 nova_compute[187208]: 2025-12-05 12:05:47.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.050 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.055 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936347.7361302, 8888dd78-1c78-4065-8536-9a1096bdf57b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.056 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] VM Paused (Lifecycle Event)
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.093 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.097 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.110 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.171 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.505 187212 DEBUG nova.network.neutron [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Updated VIF entry in instance network info cache for port c5cb68aa-e5c2-48b0-b9c4-e0542120e065. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.506 187212 DEBUG nova.network.neutron [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Updating instance_info_cache with network_info: [{"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.532 187212 DEBUG oslo_concurrency.lockutils [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.727 187212 DEBUG nova.compute.manager [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.728 187212 DEBUG oslo_concurrency.lockutils [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.729 187212 DEBUG oslo_concurrency.lockutils [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.729 187212 DEBUG oslo_concurrency.lockutils [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.729 187212 DEBUG nova.compute.manager [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:48 compute-0 nova_compute[187208]: 2025-12-05 12:05:48.730 187212 WARNING nova.compute.manager [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state active and task_state None.
Dec 05 12:05:49 compute-0 podman[225777]: 2025-12-05 12:05:49.258004159 +0000 UTC m=+0.106285906 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec 05 12:05:49 compute-0 nova_compute[187208]: 2025-12-05 12:05:49.421 187212 DEBUG nova.compute.manager [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:49 compute-0 nova_compute[187208]: 2025-12-05 12:05:49.423 187212 DEBUG nova.compute.manager [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing instance network info cache due to event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:05:49 compute-0 nova_compute[187208]: 2025-12-05 12:05:49.423 187212 DEBUG oslo_concurrency.lockutils [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:49 compute-0 nova_compute[187208]: 2025-12-05 12:05:49.423 187212 DEBUG oslo_concurrency.lockutils [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:49 compute-0 nova_compute[187208]: 2025-12-05 12:05:49.424 187212 DEBUG nova.network.neutron [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:05:50 compute-0 nova_compute[187208]: 2025-12-05 12:05:50.264 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Successfully updated port: 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:05:50 compute-0 nova_compute[187208]: 2025-12-05 12:05:50.333 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:50 compute-0 nova_compute[187208]: 2025-12-05 12:05:50.334 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquired lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:50 compute-0 nova_compute[187208]: 2025-12-05 12:05:50.334 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:05:50 compute-0 nova_compute[187208]: 2025-12-05 12:05:50.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:50 compute-0 nova_compute[187208]: 2025-12-05 12:05:50.657 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.270 187212 DEBUG nova.compute.manager [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-changed-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.271 187212 DEBUG nova.compute.manager [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Refreshing instance network info cache due to event network-changed-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.272 187212 DEBUG oslo_concurrency.lockutils [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.527 187212 DEBUG nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.528 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.529 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.530 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.531 187212 DEBUG nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Processing event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.532 187212 DEBUG nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.533 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.533 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.534 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.535 187212 DEBUG nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] No waiting events found dispatching network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.536 187212 WARNING nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received unexpected event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 for instance with vm_state building and task_state spawning.
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.537 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.555 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936351.5424495, 8888dd78-1c78-4065-8536-9a1096bdf57b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.558 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] VM Resumed (Lifecycle Event)
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.563 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.578 187212 INFO nova.virt.libvirt.driver [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance spawned successfully.
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.579 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.583 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.587 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.601 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.602 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.603 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.603 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.604 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.605 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.611 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.702 187212 INFO nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Took 12.99 seconds to spawn the instance on the hypervisor.
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.702 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.703 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.780 187212 INFO nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Took 13.52 seconds to build instance.
Dec 05 12:05:51 compute-0 nova_compute[187208]: 2025-12-05 12:05:51.802 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.177 187212 DEBUG nova.network.neutron [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated VIF entry in instance network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.178 187212 DEBUG nova.network.neutron [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.199 187212 DEBUG oslo_concurrency.lockutils [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.346 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Updating instance_info_cache with network_info: [{"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.369 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Releasing lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.370 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance network_info: |[{"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.372 187212 DEBUG oslo_concurrency.lockutils [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.372 187212 DEBUG nova.network.neutron [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Refreshing network info cache for port 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.376 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start _get_guest_xml network_info=[{"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.381 187212 WARNING nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.389 187212 DEBUG nova.virt.libvirt.host [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.390 187212 DEBUG nova.virt.libvirt.host [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.393 187212 DEBUG nova.virt.libvirt.host [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.394 187212 DEBUG nova.virt.libvirt.host [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.395 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.395 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='09233d41-3279-4f39-ac6e-a21662b4f176',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.396 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.397 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.397 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.398 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.398 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.399 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.399 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.400 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.400 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.400 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.405 187212 DEBUG nova.virt.libvirt.vif [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1462907521',display_name='tempest-ListServerFiltersTestJSON-instance-1462907521',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1462907521',id=58,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-bzpoia2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-L
istServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:43Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=b81bb939-d14f-4a72-b7fe-95fc5d8810a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.406 187212 DEBUG nova.network.os_vif_util [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.407 187212 DEBUG nova.network.os_vif_util [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.408 187212 DEBUG nova.objects.instance [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'pci_devices' on Instance uuid b81bb939-d14f-4a72-b7fe-95fc5d8810a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.422 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <uuid>b81bb939-d14f-4a72-b7fe-95fc5d8810a1</uuid>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <name>instance-0000003a</name>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <memory>196608</memory>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-1462907521</nova:name>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:05:52</nova:creationTime>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <nova:flavor name="m1.micro">
Dec 05 12:05:52 compute-0 nova_compute[187208]:         <nova:memory>192</nova:memory>
Dec 05 12:05:52 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:05:52 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:05:52 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:05:52 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:05:52 compute-0 nova_compute[187208]:         <nova:user uuid="4f8149b8192e411a9131b103b25862b6">tempest-ListServerFiltersTestJSON-711798252-project-member</nova:user>
Dec 05 12:05:52 compute-0 nova_compute[187208]:         <nova:project uuid="e8f613c8797e432d96e43223fb7c476d">tempest-ListServerFiltersTestJSON-711798252</nova:project>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:05:52 compute-0 nova_compute[187208]:         <nova:port uuid="5683f8a8-691c-43f3-a88f-eb0c30ccb3c5">
Dec 05 12:05:52 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <system>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <entry name="serial">b81bb939-d14f-4a72-b7fe-95fc5d8810a1</entry>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <entry name="uuid">b81bb939-d14f-4a72-b7fe-95fc5d8810a1</entry>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     </system>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <os>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   </os>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <features>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   </features>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.config"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:d3:3c:38"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <target dev="tap5683f8a8-69"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/console.log" append="off"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <video>
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     </video>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:05:52 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:05:52 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:05:52 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:05:52 compute-0 nova_compute[187208]: </domain>
Dec 05 12:05:52 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.431 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Preparing to wait for external event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.431 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.432 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.432 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.433 187212 DEBUG nova.virt.libvirt.vif [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1462907521',display_name='tempest-ListServerFiltersTestJSON-instance-1462907521',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1462907521',id=58,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-bzpoia2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:43Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=b81bb939-d14f-4a72-b7fe-95fc5d8810a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.433 187212 DEBUG nova.network.os_vif_util [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.434 187212 DEBUG nova.network.os_vif_util [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.436 187212 DEBUG os_vif [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.437 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.438 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.438 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.446 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5683f8a8-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.446 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5683f8a8-69, col_values=(('external_ids', {'iface-id': '5683f8a8-691c-43f3-a88f-eb0c30ccb3c5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:3c:38', 'vm-uuid': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.484 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:52 compute-0 NetworkManager[55691]: <info>  [1764936352.4868] manager: (tap5683f8a8-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.489 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.494 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.496 187212 INFO os_vif [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69')
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.573 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.573 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.574 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No VIF found with MAC fa:16:3e:d3:3c:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.574 187212 INFO nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Using config drive
Dec 05 12:05:52 compute-0 nova_compute[187208]: 2025-12-05 12:05:52.610 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:53 compute-0 nova_compute[187208]: 2025-12-05 12:05:53.564 187212 INFO nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Creating config drive at /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.config
Dec 05 12:05:53 compute-0 nova_compute[187208]: 2025-12-05 12:05:53.569 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpojzq8yb5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:53 compute-0 nova_compute[187208]: 2025-12-05 12:05:53.697 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpojzq8yb5" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:53 compute-0 NetworkManager[55691]: <info>  [1764936353.7598] manager: (tap5683f8a8-69): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Dec 05 12:05:53 compute-0 kernel: tap5683f8a8-69: entered promiscuous mode
Dec 05 12:05:53 compute-0 nova_compute[187208]: 2025-12-05 12:05:53.779 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:53 compute-0 ovn_controller[95610]: 2025-12-05T12:05:53Z|00461|binding|INFO|Claiming lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 for this chassis.
Dec 05 12:05:53 compute-0 ovn_controller[95610]: 2025-12-05T12:05:53Z|00462|binding|INFO|5683f8a8-691c-43f3-a88f-eb0c30ccb3c5: Claiming fa:16:3e:d3:3c:38 10.100.0.11
Dec 05 12:05:53 compute-0 nova_compute[187208]: 2025-12-05 12:05:53.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:53 compute-0 ovn_controller[95610]: 2025-12-05T12:05:53Z|00463|binding|INFO|Setting lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 ovn-installed in OVS
Dec 05 12:05:53 compute-0 systemd-udevd[225817]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:05:53 compute-0 nova_compute[187208]: 2025-12-05 12:05:53.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:53 compute-0 systemd-machined[153543]: New machine qemu-62-instance-0000003a.
Dec 05 12:05:53 compute-0 NetworkManager[55691]: <info>  [1764936353.8556] device (tap5683f8a8-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:05:53 compute-0 systemd[1]: Started Virtual Machine qemu-62-instance-0000003a.
Dec 05 12:05:53 compute-0 NetworkManager[55691]: <info>  [1764936353.8568] device (tap5683f8a8-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.030 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:3c:38 10.100.0.11'], port_security=['fa:16:3e:d3:3c:38 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.032 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.034 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:05:54 compute-0 ovn_controller[95610]: 2025-12-05T12:05:54Z|00464|binding|INFO|Setting lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 up in Southbound
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.057 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9919ce7e-053e-446f-aaaf-65126e43c3a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.092 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[657ae797-c2a1-40b8-801e-a4a953664df9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.096 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8813d7f2-8d5f-4cb4-a82e-2fffa0f5c38e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.132 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bbcbc48e-57f0-498c-833c-b984c7e1fd90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.156 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936354.1561198, b81bb939-d14f-4a72-b7fe-95fc5d8810a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.157 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] VM Started (Lifecycle Event)
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.157 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3df95e9c-9c82-405c-a0ed-947c7a23749d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225839, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.175 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[607b2c59-304a-46ba-82e1-a7a836bc01d2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225840, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225840, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.178 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.183 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.183 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.184 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:05:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.184 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.233 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.237 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936354.1593728, b81bb939-d14f-4a72-b7fe-95fc5d8810a1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.238 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] VM Paused (Lifecycle Event)
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.254 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.257 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:05:54 compute-0 nova_compute[187208]: 2025-12-05 12:05:54.276 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:05:55 compute-0 nova_compute[187208]: 2025-12-05 12:05:55.229 187212 DEBUG nova.network.neutron [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Updated VIF entry in instance network info cache for port 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:05:55 compute-0 nova_compute[187208]: 2025-12-05 12:05:55.229 187212 DEBUG nova.network.neutron [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Updating instance_info_cache with network_info: [{"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:05:55 compute-0 nova_compute[187208]: 2025-12-05 12:05:55.257 187212 DEBUG oslo_concurrency.lockutils [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.041 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.042 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.059 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.127 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.127 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.136 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.141 187212 INFO nova.compute.claims [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:05:56 compute-0 podman[225855]: 2025-12-05 12:05:56.230824195 +0000 UTC m=+0.074566885 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 05 12:05:56 compute-0 podman[225854]: 2025-12-05 12:05:56.23379073 +0000 UTC m=+0.079884307 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.314 187212 DEBUG nova.compute.provider_tree [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.328 187212 DEBUG nova.scheduler.client.report [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.351 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.352 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.395 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.396 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.414 187212 INFO nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.428 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.502 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:56 compute-0 ovn_controller[95610]: 2025-12-05T12:05:56Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:5e:ef 10.100.0.5
Dec 05 12:05:56 compute-0 ovn_controller[95610]: 2025-12-05T12:05:56Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:5e:ef 10.100.0.5
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.543 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.545 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.545 187212 INFO nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Creating image(s)
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.546 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.546 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.547 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.560 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.621 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.622 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.623 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.635 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.693 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.694 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.731 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.733 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.733 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.757 187212 DEBUG nova.policy [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.793 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.794 187212 DEBUG nova.virt.disk.api [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Checking if we can resize image /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.795 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.862 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.864 187212 DEBUG nova.virt.disk.api [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Cannot resize image /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.864 187212 DEBUG nova.objects.instance [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'migration_context' on Instance uuid 297d72ef-6b79-45b3-813b-52b5144b522e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.879 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.880 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Ensure instance console log exists: /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.880 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.881 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.881 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:05:56 compute-0 nova_compute[187208]: 2025-12-05 12:05:56.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:57 compute-0 nova_compute[187208]: 2025-12-05 12:05:57.486 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:57 compute-0 nova_compute[187208]: 2025-12-05 12:05:57.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:57 compute-0 ovn_controller[95610]: 2025-12-05T12:05:57Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:08:e8:08 10.100.0.14
Dec 05 12:05:57 compute-0 ovn_controller[95610]: 2025-12-05T12:05:57Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:08:e8:08 10.100.0.14
Dec 05 12:05:58 compute-0 nova_compute[187208]: 2025-12-05 12:05:58.299 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Successfully created port: 821e6243-8d28-4c8c-874c-f1e69c7d3bed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:05:58 compute-0 ovn_controller[95610]: 2025-12-05T12:05:58Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:d7:ed 10.100.0.9
Dec 05 12:05:58 compute-0 ovn_controller[95610]: 2025-12-05T12:05:58Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:d7:ed 10.100.0.9
Dec 05 12:05:58 compute-0 nova_compute[187208]: 2025-12-05 12:05:58.724 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:05:58 compute-0 nova_compute[187208]: 2025-12-05 12:05:58.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:00 compute-0 nova_compute[187208]: 2025-12-05 12:06:00.210 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Successfully updated port: 821e6243-8d28-4c8c-874c-f1e69c7d3bed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:06:00 compute-0 nova_compute[187208]: 2025-12-05 12:06:00.264 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:00 compute-0 nova_compute[187208]: 2025-12-05 12:06:00.265 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquired lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:00 compute-0 nova_compute[187208]: 2025-12-05 12:06:00.265 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:06:00 compute-0 nova_compute[187208]: 2025-12-05 12:06:00.474 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:06:00 compute-0 nova_compute[187208]: 2025-12-05 12:06:00.647 187212 DEBUG nova.compute.manager [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:00 compute-0 nova_compute[187208]: 2025-12-05 12:06:00.647 187212 DEBUG nova.compute.manager [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing instance network info cache due to event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:00 compute-0 nova_compute[187208]: 2025-12-05 12:06:00.648 187212 DEBUG oslo_concurrency.lockutils [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:01 compute-0 podman[225936]: 2025-12-05 12:06:01.234902033 +0000 UTC m=+0.071990230 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:06:01 compute-0 podman[225937]: 2025-12-05 12:06:01.27200003 +0000 UTC m=+0.109998164 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.623 187212 DEBUG nova.compute.manager [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.624 187212 DEBUG oslo_concurrency.lockutils [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.625 187212 DEBUG oslo_concurrency.lockutils [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.625 187212 DEBUG oslo_concurrency.lockutils [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.625 187212 DEBUG nova.compute.manager [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Processing event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.626 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.633 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936361.6320565, b81bb939-d14f-4a72-b7fe-95fc5d8810a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.633 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] VM Resumed (Lifecycle Event)
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.636 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.641 187212 INFO nova.virt.libvirt.driver [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance spawned successfully.
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.642 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.663 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.669 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.674 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.675 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.676 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.676 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.676 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.677 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.715 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.719 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Releasing lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.719 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance network_info: |[{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.720 187212 DEBUG oslo_concurrency.lockutils [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.720 187212 DEBUG nova.network.neutron [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.723 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start _get_guest_xml network_info=[{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.727 187212 WARNING nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.733 187212 DEBUG nova.virt.libvirt.host [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.734 187212 DEBUG nova.virt.libvirt.host [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.740 187212 DEBUG nova.virt.libvirt.host [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.740 187212 DEBUG nova.virt.libvirt.host [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.741 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.741 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.741 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.742 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.742 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.742 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.742 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.743 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.743 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.743 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.743 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.744 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.748 187212 DEBUG nova.virt.libvirt.vif [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-2111676304',display_name='tempest-FloatingIPsAssociationTestJSON-server-2111676304',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-2111676304',id=59,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-3sf4jdpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_user_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:56Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=297d72ef-6b79-45b3-813b-52b5144b522e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.748 187212 DEBUG nova.network.os_vif_util [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.749 187212 DEBUG nova.network.os_vif_util [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.750 187212 DEBUG nova.objects.instance [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'pci_devices' on Instance uuid 297d72ef-6b79-45b3-813b-52b5144b522e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.764 187212 INFO nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Took 17.81 seconds to spawn the instance on the hypervisor.
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.765 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.770 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <uuid>297d72ef-6b79-45b3-813b-52b5144b522e</uuid>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <name>instance-0000003b</name>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <nova:name>tempest-FloatingIPsAssociationTestJSON-server-2111676304</nova:name>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:06:01</nova:creationTime>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:06:01 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:06:01 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:06:01 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:06:01 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:06:01 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:06:01 compute-0 nova_compute[187208]:         <nova:user uuid="8cf2534e7c394130b675e44ed567401b">tempest-FloatingIPsAssociationTestJSON-883508882-project-member</nova:user>
Dec 05 12:06:01 compute-0 nova_compute[187208]:         <nova:project uuid="85037de7275442698e604ee3f6283cbc">tempest-FloatingIPsAssociationTestJSON-883508882</nova:project>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:06:01 compute-0 nova_compute[187208]:         <nova:port uuid="821e6243-8d28-4c8c-874c-f1e69c7d3bed">
Dec 05 12:06:01 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <system>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <entry name="serial">297d72ef-6b79-45b3-813b-52b5144b522e</entry>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <entry name="uuid">297d72ef-6b79-45b3-813b-52b5144b522e</entry>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     </system>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <os>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   </os>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <features>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   </features>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.config"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:a6:47:26"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <target dev="tap821e6243-8d"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/console.log" append="off"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <video>
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     </video>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:06:01 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:06:01 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:06:01 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:06:01 compute-0 nova_compute[187208]: </domain>
Dec 05 12:06:01 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.772 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Preparing to wait for external event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.773 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.775 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.775 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.776 187212 DEBUG nova.virt.libvirt.vif [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-2111676304',display_name='tempest-FloatingIPsAssociationTestJSON-server-2111676304',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-2111676304',id=59,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-3sf4jdpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_user_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:56Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=297d72ef-6b79-45b3-813b-52b5144b522e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.776 187212 DEBUG nova.network.os_vif_util [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.777 187212 DEBUG nova.network.os_vif_util [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.777 187212 DEBUG os_vif [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.778 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.779 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.779 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.785 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.785 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap821e6243-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.786 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap821e6243-8d, col_values=(('external_ids', {'iface-id': '821e6243-8d28-4c8c-874c-f1e69c7d3bed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a6:47:26', 'vm-uuid': '297d72ef-6b79-45b3-813b-52b5144b522e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:01 compute-0 NetworkManager[55691]: <info>  [1764936361.7888] manager: (tap821e6243-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.788 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.791 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.799 187212 INFO os_vif [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d')
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.840 187212 INFO nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Took 18.42 seconds to build instance.
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.861 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.865 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.865 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.866 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No VIF found with MAC fa:16:3e:a6:47:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:06:01 compute-0 nova_compute[187208]: 2025-12-05 12:06:01.866 187212 INFO nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Using config drive
Dec 05 12:06:02 compute-0 nova_compute[187208]: 2025-12-05 12:06:02.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:02 compute-0 nova_compute[187208]: 2025-12-05 12:06:02.632 187212 INFO nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Creating config drive at /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.config
Dec 05 12:06:02 compute-0 nova_compute[187208]: 2025-12-05 12:06:02.638 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpudr9a0j1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:02 compute-0 nova_compute[187208]: 2025-12-05 12:06:02.765 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpudr9a0j1" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:02 compute-0 NetworkManager[55691]: <info>  [1764936362.8237] manager: (tap821e6243-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Dec 05 12:06:02 compute-0 kernel: tap821e6243-8d: entered promiscuous mode
Dec 05 12:06:02 compute-0 ovn_controller[95610]: 2025-12-05T12:06:02Z|00465|binding|INFO|Claiming lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed for this chassis.
Dec 05 12:06:02 compute-0 ovn_controller[95610]: 2025-12-05T12:06:02Z|00466|binding|INFO|821e6243-8d28-4c8c-874c-f1e69c7d3bed: Claiming fa:16:3e:a6:47:26 10.100.0.9
Dec 05 12:06:02 compute-0 nova_compute[187208]: 2025-12-05 12:06:02.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.844 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:47:26 10.100.0.9'], port_security=['fa:16:3e:a6:47:26 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f4c4888-4b32-4259-8441-31af091e0c7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85037de7275442698e604ee3f6283cbc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed3fff5f-a24a-492e-ba85-8f010d446cfc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac2e7e6b-9342-46f8-a910-5de5a261f0a9, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=821e6243-8d28-4c8c-874c-f1e69c7d3bed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.846 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 821e6243-8d28-4c8c-874c-f1e69c7d3bed in datapath 0f4c4888-4b32-4259-8441-31af091e0c7d bound to our chassis
Dec 05 12:06:02 compute-0 ovn_controller[95610]: 2025-12-05T12:06:02Z|00467|binding|INFO|Setting lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed ovn-installed in OVS
Dec 05 12:06:02 compute-0 ovn_controller[95610]: 2025-12-05T12:06:02Z|00468|binding|INFO|Setting lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed up in Southbound
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.850 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f4c4888-4b32-4259-8441-31af091e0c7d
Dec 05 12:06:02 compute-0 nova_compute[187208]: 2025-12-05 12:06:02.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:02 compute-0 nova_compute[187208]: 2025-12-05 12:06:02.853 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:02 compute-0 systemd-machined[153543]: New machine qemu-63-instance-0000003b.
Dec 05 12:06:02 compute-0 systemd-udevd[226016]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.873 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[44d4931b-6ec8-4c67-b89b-6f4b47276039]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:02 compute-0 NetworkManager[55691]: <info>  [1764936362.8803] device (tap821e6243-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:06:02 compute-0 NetworkManager[55691]: <info>  [1764936362.8815] device (tap821e6243-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:06:02 compute-0 systemd[1]: Started Virtual Machine qemu-63-instance-0000003b.
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.916 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3e451133-b566-431e-8a79-b3aefc2f652e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.920 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9d134e1f-cf08-4461-9f55-de6f482bead6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.946 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a2be27-8442-4baa-bf8b-bc1c34f8eb87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.963 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[adac84a7-f8a8-4931-b53b-9e9010d0e24a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f4c4888-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:45:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372337, 'reachable_time': 25070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226035, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.984 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af027261-317c-42f0-869c-606e62e07f88]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0f4c4888-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372348, 'tstamp': 372348}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226036, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0f4c4888-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372350, 'tstamp': 372350}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226036, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.987 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f4c4888-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:02 compute-0 nova_compute[187208]: 2025-12-05 12:06:02.988 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:02 compute-0 nova_compute[187208]: 2025-12-05 12:06:02.990 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.991 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f4c4888-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.991 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.992 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f4c4888-40, col_values=(('external_ids', {'iface-id': 'b2e28c8a-557d-459b-807e-dd1f5be0a608'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.992 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:03.012 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:03.013 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.554 187212 DEBUG nova.network.neutron [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updated VIF entry in instance network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.555 187212 DEBUG nova.network.neutron [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.578 187212 DEBUG oslo_concurrency.lockutils [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.688 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936363.6872625, 297d72ef-6b79-45b3-813b-52b5144b522e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.688 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] VM Started (Lifecycle Event)
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.700 187212 DEBUG nova.compute.manager [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.701 187212 DEBUG oslo_concurrency.lockutils [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.701 187212 DEBUG oslo_concurrency.lockutils [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.701 187212 DEBUG oslo_concurrency.lockutils [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.701 187212 DEBUG nova.compute.manager [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] No waiting events found dispatching network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.702 187212 WARNING nova.compute.manager [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received unexpected event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 for instance with vm_state active and task_state None.
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.712 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.718 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936363.6874504, 297d72ef-6b79-45b3-813b-52b5144b522e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.718 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] VM Paused (Lifecycle Event)
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.739 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.743 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:03 compute-0 nova_compute[187208]: 2025-12-05 12:06:03.764 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:04 compute-0 ovn_controller[95610]: 2025-12-05T12:06:04Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8a:a8:16 10.100.0.13
Dec 05 12:06:04 compute-0 ovn_controller[95610]: 2025-12-05T12:06:04Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8a:a8:16 10.100.0.13
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.403 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.403 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.430 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.512 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.513 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.518 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.519 187212 INFO nova.compute.claims [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.776 187212 DEBUG nova.compute.provider_tree [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.799 187212 DEBUG nova.scheduler.client.report [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.821 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.822 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.877 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.878 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.913 187212 INFO nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:06:04 compute-0 nova_compute[187208]: 2025-12-05 12:06:04.934 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.032 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.035 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.036 187212 INFO nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Creating image(s)
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.037 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.037 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.038 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.055 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.130 187212 DEBUG nova.policy [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.134 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.135 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.135 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.146 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.220 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.221 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.255 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.256 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.256 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.317 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.318 187212 DEBUG nova.virt.disk.api [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Checking if we can resize image /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.318 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.377 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.378 187212 DEBUG nova.virt.disk.api [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Cannot resize image /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.378 187212 DEBUG nova.objects.instance [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'migration_context' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.399 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.400 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Ensure instance console log exists: /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.400 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.401 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:05 compute-0 nova_compute[187208]: 2025-12-05 12:06:05.401 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:06 compute-0 podman[226059]: 2025-12-05 12:06:06.232201257 +0000 UTC m=+0.076089859 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Dec 05 12:06:06 compute-0 nova_compute[187208]: 2025-12-05 12:06:06.512 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Successfully created port: 2064bfa7-125e-466c-9365-6c0ec6655113 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:06:06 compute-0 nova_compute[187208]: 2025-12-05 12:06:06.789 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:06 compute-0 nova_compute[187208]: 2025-12-05 12:06:06.812 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:06 compute-0 nova_compute[187208]: 2025-12-05 12:06:06.813 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:06 compute-0 nova_compute[187208]: 2025-12-05 12:06:06.833 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:06:06 compute-0 nova_compute[187208]: 2025-12-05 12:06:06.914 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:06 compute-0 nova_compute[187208]: 2025-12-05 12:06:06.915 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:06 compute-0 nova_compute[187208]: 2025-12-05 12:06:06.921 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:06:06 compute-0 nova_compute[187208]: 2025-12-05 12:06:06.922 187212 INFO nova.compute.claims [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.151 187212 DEBUG nova.compute.provider_tree [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.167 187212 DEBUG nova.scheduler.client.report [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.192 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.193 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.259 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.260 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.286 187212 INFO nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.311 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.432 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.434 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.435 187212 INFO nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Creating image(s)
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.436 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.436 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.437 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.461 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.522 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.523 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.523 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.535 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.592 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.599 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.600 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.627 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.648 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.649 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.649 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.709 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.710 187212 DEBUG nova.virt.disk.api [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Checking if we can resize image /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.711 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.766 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.767 187212 DEBUG nova.virt.disk.api [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Cannot resize image /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.767 187212 DEBUG nova.objects.instance [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'migration_context' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.789 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.789 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Ensure instance console log exists: /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.790 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.790 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:07 compute-0 nova_compute[187208]: 2025-12-05 12:06:07.790 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:08 compute-0 nova_compute[187208]: 2025-12-05 12:06:08.113 187212 DEBUG nova.policy [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:06:10 compute-0 nova_compute[187208]: 2025-12-05 12:06:10.074 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:10 compute-0 nova_compute[187208]: 2025-12-05 12:06:10.613 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Successfully updated port: 2064bfa7-125e-466c-9365-6c0ec6655113 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:06:10 compute-0 nova_compute[187208]: 2025-12-05 12:06:10.645 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:10 compute-0 nova_compute[187208]: 2025-12-05 12:06:10.646 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:10 compute-0 nova_compute[187208]: 2025-12-05 12:06:10.646 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.043 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.044 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.072 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.153 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.153 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.160 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.160 187212 INFO nova.compute.claims [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.207 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.400 187212 DEBUG nova.compute.provider_tree [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.419 187212 DEBUG nova.scheduler.client.report [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.442 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.443 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.493 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.494 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.513 187212 INFO nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.528 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.615 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.616 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.617 187212 INFO nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating image(s)
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.617 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.617 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.618 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.633 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.669 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Successfully created port: 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.698 187212 DEBUG nova.policy [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.723 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.724 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.724 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.734 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.792 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.795 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:11 compute-0 nova_compute[187208]: 2025-12-05 12:06:11.795 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.034 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk 1073741824" returned: 0 in 0.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.035 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.036 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.061 187212 DEBUG nova.compute.manager [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-changed-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.062 187212 DEBUG nova.compute.manager [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing instance network info cache due to event network-changed-2064bfa7-125e-466c-9365-6c0ec6655113. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.062 187212 DEBUG oslo_concurrency.lockutils [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.110 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.111 187212 DEBUG nova.virt.disk.api [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Checking if we can resize image /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.111 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.165 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.167 187212 DEBUG nova.virt.disk.api [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Cannot resize image /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.168 187212 DEBUG nova.objects.instance [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'migration_context' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.189 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.190 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Ensure instance console log exists: /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.191 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.191 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.192 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.619 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.655 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.681 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.682 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance network_info: |[{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.682 187212 DEBUG oslo_concurrency.lockutils [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.682 187212 DEBUG nova.network.neutron [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing network info cache for port 2064bfa7-125e-466c-9365-6c0ec6655113 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.685 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start _get_guest_xml network_info=[{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.689 187212 WARNING nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.726 187212 DEBUG nova.virt.libvirt.host [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.727 187212 DEBUG nova.virt.libvirt.host [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.734 187212 DEBUG nova.virt.libvirt.host [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.734 187212 DEBUG nova.virt.libvirt.host [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.734 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.735 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.735 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.735 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.735 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.736 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.736 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.736 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.736 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.737 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.737 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.737 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.740 187212 DEBUG nova.virt.libvirt.vif [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.741 187212 DEBUG nova.network.os_vif_util [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.742 187212 DEBUG nova.network.os_vif_util [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.743 187212 DEBUG nova.objects.instance [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_devices' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.759 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <uuid>25918fc4-05ec-4a16-b77f-ca1d352a2763</uuid>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <name>instance-0000003c</name>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:06:12</nova:creationTime>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:06:12 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:06:12 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:06:12 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:06:12 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:06:12 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:06:12 compute-0 nova_compute[187208]:         <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:06:12 compute-0 nova_compute[187208]:         <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:06:12 compute-0 nova_compute[187208]:         <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec 05 12:06:12 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <system>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <entry name="serial">25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <entry name="uuid">25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     </system>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <os>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   </os>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <features>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   </features>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:7b:68:b7"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <target dev="tap2064bfa7-12"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log" append="off"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <video>
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     </video>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:06:12 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:06:12 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:06:12 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:06:12 compute-0 nova_compute[187208]: </domain>
Dec 05 12:06:12 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.760 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Preparing to wait for external event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.761 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.761 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.761 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.762 187212 DEBUG nova.virt.libvirt.vif [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.762 187212 DEBUG nova.network.os_vif_util [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.763 187212 DEBUG nova.network.os_vif_util [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.763 187212 DEBUG os_vif [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.764 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.767 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2064bfa7-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.767 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2064bfa7-12, col_values=(('external_ids', {'iface-id': '2064bfa7-125e-466c-9365-6c0ec6655113', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:68:b7', 'vm-uuid': '25918fc4-05ec-4a16-b77f-ca1d352a2763'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:12 compute-0 NetworkManager[55691]: <info>  [1764936372.7701] manager: (tap2064bfa7-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.777 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.778 187212 INFO os_vif [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12')
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.850 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Successfully created port: ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:06:12 compute-0 podman[226118]: 2025-12-05 12:06:12.872888374 +0000 UTC m=+0.050346898 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.926 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.927 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.927 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:7b:68:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:06:12 compute-0 nova_compute[187208]: 2025-12-05 12:06:12.928 187212 INFO nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Using config drive
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.574 187212 DEBUG nova.compute.manager [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.616 187212 INFO nova.compute.manager [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] instance snapshotting
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.617 187212 DEBUG nova.objects.instance [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'flavor' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.812 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Successfully updated port: ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.829 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.830 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.831 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.851 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Beginning live snapshot process
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.857 187212 INFO nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Creating config drive at /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.867 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi1gdg89p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:13 compute-0 nova_compute[187208]: 2025-12-05 12:06:13.982 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.012 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi1gdg89p" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:14 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.041 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:14 compute-0 ovn_controller[95610]: 2025-12-05T12:06:14Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:3c:38 10.100.0.11
Dec 05 12:06:14 compute-0 kernel: tap2064bfa7-12: entered promiscuous mode
Dec 05 12:06:14 compute-0 NetworkManager[55691]: <info>  [1764936374.0723] manager: (tap2064bfa7-12): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Dec 05 12:06:14 compute-0 ovn_controller[95610]: 2025-12-05T12:06:14Z|00469|binding|INFO|Claiming lport 2064bfa7-125e-466c-9365-6c0ec6655113 for this chassis.
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.073 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:14 compute-0 ovn_controller[95610]: 2025-12-05T12:06:14Z|00470|binding|INFO|2064bfa7-125e-466c-9365-6c0ec6655113: Claiming fa:16:3e:7b:68:b7 10.100.0.12
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.089 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:68:b7 10.100.0.12'], port_security=['fa:16:3e:7b:68:b7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '38cb0acb-7ac3-4fef-baeb-661c59e2e07c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2064bfa7-125e-466c-9365-6c0ec6655113) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.090 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:14 compute-0 ovn_controller[95610]: 2025-12-05T12:06:14Z|00471|binding|INFO|Setting lport 2064bfa7-125e-466c-9365-6c0ec6655113 up in Southbound
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.091 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2064bfa7-125e-466c-9365-6c0ec6655113 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:06:14 compute-0 ovn_controller[95610]: 2025-12-05T12:06:14Z|00472|binding|INFO|Setting lport 2064bfa7-125e-466c-9365-6c0ec6655113 ovn-installed in OVS
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.093 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.093 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:14 compute-0 ovn_controller[95610]: 2025-12-05T12:06:14Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:3c:38 10.100.0.11
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[afc63d61-e925-48de-933f-4e4b9fded5e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.108 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbfed6fc-31 in ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.111 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbfed6fc-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.111 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6da0ed74-6add-46cd-8a87-81b5ad9d87ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.112 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[494f7270-3b7f-4bc9-81fc-8ff52fa93305]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 systemd-machined[153543]: New machine qemu-64-instance-0000003c.
Dec 05 12:06:14 compute-0 systemd-udevd[226178]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.129 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7636037e-dda2-4f30-833f-78418ed8f4b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 systemd[1]: Started Virtual Machine qemu-64-instance-0000003c.
Dec 05 12:06:14 compute-0 NetworkManager[55691]: <info>  [1764936374.1379] device (tap2064bfa7-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:06:14 compute-0 NetworkManager[55691]: <info>  [1764936374.1408] device (tap2064bfa7-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.148 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.148 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.155 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1398c713-9048-4492-886f-c0df2f6d4b0d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.190 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f563280b-c6fe-4a87-89b4-49aa86167290]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 systemd-udevd[226183]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:14 compute-0 NetworkManager[55691]: <info>  [1764936374.1963] manager: (tapfbfed6fc-30): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.195 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf72da1-39e2-4ffe-b580-16f3fb4ba1bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.228 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.240 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.229 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4ea2dfe0-40dc-4425-b174-162ca3996ecd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.255 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e1cce84e-508c-4543-8211-df7e947b13f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 NetworkManager[55691]: <info>  [1764936374.2774] device (tapfbfed6fc-30): carrier: link connected
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.287 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0f37e0ba-c8eb-4120-8b1d-887f150b5b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.304 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e995db3-12d6-41b9-b361-80a11ba02df3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375484, 'reachable_time': 41038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226215, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.306 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.307 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.319 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e9557ca7-50ae-4e0f-9b1e-068df9ca9a9d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe26:8872'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375484, 'tstamp': 375484}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226218, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.337 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3f339b-7a49-4c92-ac6e-ad6916b7e19c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375484, 'reachable_time': 41038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226220, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.359 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70.delta 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.360 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.368 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d83ee95-ffc6-4882-9a17-51867efb0c1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.416 187212 DEBUG nova.virt.libvirt.guest [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.437 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d76d088-a36a-4a01-8dda-ab47e6241b73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.438 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.438 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.439 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:14 compute-0 NetworkManager[55691]: <info>  [1764936374.4413] manager: (tapfbfed6fc-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Dec 05 12:06:14 compute-0 kernel: tapfbfed6fc-30: entered promiscuous mode
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.440 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.446 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.447 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:14 compute-0 ovn_controller[95610]: 2025-12-05T12:06:14Z|00473|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.462 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.463 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0a7e2ae5-9a7d-4c33-9670-70068e5468cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.464 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:06:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.465 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'env', 'PROCESS_TAG=haproxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.668 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936374.6677818, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.668 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] VM Started (Lifecycle Event)
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.689 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.693 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936374.6679525, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.694 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] VM Paused (Lifecycle Event)
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.715 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.719 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.737 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:14 compute-0 podman[226273]: 2025-12-05 12:06:14.805973748 +0000 UTC m=+0.049675749 container create b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:06:14 compute-0 systemd[1]: Started libpod-conmon-b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934.scope.
Dec 05 12:06:14 compute-0 podman[226273]: 2025-12-05 12:06:14.780139195 +0000 UTC m=+0.023841226 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:06:14 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:06:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39a6a101eec5b4484b9949c620581716165bc1e8dfc205f2cf46df83cc7fa1cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:06:14 compute-0 podman[226273]: 2025-12-05 12:06:14.911368998 +0000 UTC m=+0.155071019 container init b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 12:06:14 compute-0 podman[226273]: 2025-12-05 12:06:14.918964866 +0000 UTC m=+0.162666867 container start b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.920 187212 DEBUG nova.virt.libvirt.guest [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.925 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:06:14 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [NOTICE]   (226293) : New worker (226296) forked
Dec 05 12:06:14 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [NOTICE]   (226293) : Loading success.
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.969 187212 DEBUG nova.privsep.utils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:06:14 compute-0 nova_compute[187208]: 2025-12-05 12:06:14.970 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70.delta /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.081 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Successfully updated port: 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.103 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.104 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.104 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.394 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70.delta /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.400 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot extracted, beginning image upload
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.604 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.658 187212 DEBUG nova.network.neutron [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updated VIF entry in instance network info cache for port 2064bfa7-125e-466c-9365-6c0ec6655113. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.659 187212 DEBUG nova.network.neutron [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.682 187212 DEBUG oslo_concurrency.lockutils [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.738 187212 DEBUG nova.compute.manager [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.738 187212 DEBUG nova.compute.manager [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing instance network info cache due to event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:15 compute-0 nova_compute[187208]: 2025-12-05 12:06:15.738 187212 DEBUG oslo_concurrency.lockutils [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.070 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.783 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.784 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance network_info: |[{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.784 187212 DEBUG oslo_concurrency.lockutils [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.784 187212 DEBUG nova.network.neutron [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.787 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start _get_guest_xml network_info=[{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.791 187212 WARNING nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.797 187212 DEBUG nova.virt.libvirt.host [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.797 187212 DEBUG nova.virt.libvirt.host [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.804 187212 DEBUG nova.virt.libvirt.host [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.805 187212 DEBUG nova.virt.libvirt.host [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.805 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.805 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.806 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.806 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.806 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.807 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.807 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.807 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.807 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.808 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.808 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.808 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.813 187212 DEBUG nova.virt.libvirt.vif [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHKhL003clvQeWhyQnRnlaccZLUvEBLEhvImBOCB5geqDizgWJsGjayma/8q9qGL/NiGPTPxEZoxanWZnFRBuZklxJy5hDaSwVjbF4FtdnX9ysLeFgNsQAX0H4LK24ei2Q==',key_name='tempest-keypair-105541899',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.813 187212 DEBUG nova.network.os_vif_util [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.814 187212 DEBUG nova.network.os_vif_util [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.815 187212 DEBUG nova.objects.instance [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.832 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <uuid>5d70ac2d-111f-4e1b-ac26-3e02849b0458</uuid>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <name>instance-0000003e</name>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <nova:name>tempest-AttachVolumeShelveTestJSON-server-795100487</nova:name>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:06:16</nova:creationTime>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:06:16 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:06:16 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:06:16 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:06:16 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:06:16 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:06:16 compute-0 nova_compute[187208]:         <nova:user uuid="bc4332be3b424a5e996b61b244505cfc">tempest-AttachVolumeShelveTestJSON-1858452545-project-member</nova:user>
Dec 05 12:06:16 compute-0 nova_compute[187208]:         <nova:project uuid="6d62df5807554f499d26b5fc77ec8603">tempest-AttachVolumeShelveTestJSON-1858452545</nova:project>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:06:16 compute-0 nova_compute[187208]:         <nova:port uuid="ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b">
Dec 05 12:06:16 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <system>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <entry name="serial">5d70ac2d-111f-4e1b-ac26-3e02849b0458</entry>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <entry name="uuid">5d70ac2d-111f-4e1b-ac26-3e02849b0458</entry>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     </system>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <os>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   </os>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <features>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   </features>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:6a:c5:99"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <target dev="tapac02dd63-5a"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/console.log" append="off"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <video>
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     </video>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:06:16 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:06:16 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:06:16 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:06:16 compute-0 nova_compute[187208]: </domain>
Dec 05 12:06:16 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.834 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Preparing to wait for external event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.834 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.834 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.834 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.835 187212 DEBUG nova.virt.libvirt.vif [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHKhL003clvQeWhyQnRnlaccZLUvEBLEhvImBOCB5geqDizgWJsGjayma/8q9qGL/NiGPTPxEZoxanWZnFRBuZklxJy5hDaSwVjbF4FtdnX9ysLeFgNsQAX0H4LK24ei2Q==',key_name='tempest-keypair-105541899',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.835 187212 DEBUG nova.network.os_vif_util [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.836 187212 DEBUG nova.network.os_vif_util [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.836 187212 DEBUG os_vif [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.837 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.837 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.839 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.840 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac02dd63-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.840 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapac02dd63-5a, col_values=(('external_ids', {'iface-id': 'ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:c5:99', 'vm-uuid': '5d70ac2d-111f-4e1b-ac26-3e02849b0458'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.841 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:16 compute-0 NetworkManager[55691]: <info>  [1764936376.8429] manager: (tapac02dd63-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.849 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.854 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.855 187212 INFO os_vif [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a')
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.902 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.902 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.903 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No VIF found with MAC fa:16:3e:6a:c5:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:06:16 compute-0 nova_compute[187208]: 2025-12-05 12:06:16.903 187212 INFO nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Using config drive
Dec 05 12:06:17 compute-0 nova_compute[187208]: 2025-12-05 12:06:17.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:17 compute-0 nova_compute[187208]: 2025-12-05 12:06:17.856 187212 INFO nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating config drive at /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config
Dec 05 12:06:17 compute-0 nova_compute[187208]: 2025-12-05 12:06:17.865 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsu2ryloe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:17 compute-0 nova_compute[187208]: 2025-12-05 12:06:17.993 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsu2ryloe" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:18 compute-0 kernel: tapac02dd63-5a: entered promiscuous mode
Dec 05 12:06:18 compute-0 NetworkManager[55691]: <info>  [1764936378.0778] manager: (tapac02dd63-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 ovn_controller[95610]: 2025-12-05T12:06:18Z|00474|binding|INFO|Claiming lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for this chassis.
Dec 05 12:06:18 compute-0 ovn_controller[95610]: 2025-12-05T12:06:18Z|00475|binding|INFO|ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b: Claiming fa:16:3e:6a:c5:99 10.100.0.8
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.086 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c5:99 10.100.0.8'], port_security=['fa:16:3e:6a:c5:99 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d62df5807554f499d26b5fc77ec8603', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5a04f4af-e81b-4661-95ed-5737ffc98cae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a7d298f-265e-44c5-a73a-18dd9ed0b171, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.088 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b in datapath fc6ce614-d0f7-413f-bc3e-26f7271993d9 bound to our chassis
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.091 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec 05 12:06:18 compute-0 ovn_controller[95610]: 2025-12-05T12:06:18Z|00476|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b ovn-installed in OVS
Dec 05 12:06:18 compute-0 ovn_controller[95610]: 2025-12-05T12:06:18Z|00477|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b up in Southbound
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.095 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.098 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.104 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce3627a7-f3d6-4b18-9a7e-c9d02ee8b4fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.105 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfc6ce614-d1 in ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.107 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfc6ce614-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5de38a04-4fd3-4623-9712-2c1cfae14d42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.108 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0906a4b1-50d4-4679-9b10-d48f713a5b49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.121 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[10a0bd83-b779-4382-b34d-67695d5d4156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 systemd-udevd[226339]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:18 compute-0 systemd-machined[153543]: New machine qemu-65-instance-0000003e.
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.138 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[90554aae-5851-4cbf-b7c0-8d8d1f56eb2b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 systemd[1]: Started Virtual Machine qemu-65-instance-0000003e.
Dec 05 12:06:18 compute-0 NetworkManager[55691]: <info>  [1764936378.1447] device (tapac02dd63-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:06:18 compute-0 NetworkManager[55691]: <info>  [1764936378.1460] device (tapac02dd63-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.169 187212 DEBUG nova.compute.manager [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.169 187212 DEBUG nova.compute.manager [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing instance network info cache due to event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.169 187212 DEBUG oslo_concurrency.lockutils [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.167 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0120a36d-9ab0-4fbd-a173-66f1e64da165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.176 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f86e91ca-e99e-4a4f-a078-0ab5dab2b86d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 NetworkManager[55691]: <info>  [1764936378.1771] manager: (tapfc6ce614-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/205)
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.208 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[33788a94-f500-4c55-9bbe-930f00588e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.212 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc8d872-8986-43a5-af94-b50fccc13e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.231 187212 DEBUG nova.network.neutron [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updated VIF entry in instance network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.231 187212 DEBUG nova.network.neutron [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:18 compute-0 NetworkManager[55691]: <info>  [1764936378.2418] device (tapfc6ce614-d0): carrier: link connected
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.248 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce09589-08a1-48c1-9cfa-765b06e7b640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.253 187212 DEBUG oslo_concurrency.lockutils [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.267 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2fd0be-f9ee-4bd4-bdde-6ca4e9d0fb93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc6ce614-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:6b:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375880, 'reachable_time': 38901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226372, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.281 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a818749b-4ee2-4aeb-a069-e3702fcf1cfb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:6b90'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375880, 'tstamp': 375880}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226373, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.298 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ea07bec4-e61f-4978-8f04-c665338ce61c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc6ce614-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:6b:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375880, 'reachable_time': 38901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226374, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.315 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.332 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.332 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance network_info: |[{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.332 187212 DEBUG oslo_concurrency.lockutils [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.333 187212 DEBUG nova.network.neutron [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.335 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start _get_guest_xml network_info=[{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.336 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[13abc249-864d-4fb6-b8b5-bd5c2de15a5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.339 187212 WARNING nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.346 187212 DEBUG nova.virt.libvirt.host [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.347 187212 DEBUG nova.virt.libvirt.host [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.353 187212 DEBUG nova.virt.libvirt.host [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.353 187212 DEBUG nova.virt.libvirt.host [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.353 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.353 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.354 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.354 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.354 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.354 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.356 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.359 187212 DEBUG nova.virt.libvirt.vif [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-2105634627',display_name='tempest-AttachInterfacesUnderV243Test-server-2105634627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-2105634627',id=61,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNE7kQOo1iw7msO5U3UKQiYNUNOuR3489N27cA8/7AyK9hUMINDB4EKPtuAqKWiOpLa6/9d1/JcrFvBfelk3gje2Ue6XSif/X6uD8HtKgekiyZF9ENjW4HKYytyiU96vgQ==',key_name='tempest-keypair-865071651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5285f99befb24ac285be8e4fc1d18e69',ramdisk_id='',reservation_id='r-93zclce8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1358924829',owner_user_name='tempest-AttachInterfacesUnderV243Test-1358924829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6b73160d333a43ed94d4258262e3c2b5',uuid=bcdca3f9-3e24-4209-808c-8093b55e5c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.359 187212 DEBUG nova.network.os_vif_util [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converting VIF {"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.360 187212 DEBUG nova.network.os_vif_util [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.361 187212 DEBUG nova.objects.instance [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'pci_devices' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.377 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <uuid>bcdca3f9-3e24-4209-808c-8093b55e5c2d</uuid>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <name>instance-0000003d</name>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <nova:name>tempest-AttachInterfacesUnderV243Test-server-2105634627</nova:name>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:06:18</nova:creationTime>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:06:18 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:06:18 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:06:18 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:06:18 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:06:18 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:06:18 compute-0 nova_compute[187208]:         <nova:user uuid="6b73160d333a43ed94d4258262e3c2b5">tempest-AttachInterfacesUnderV243Test-1358924829-project-member</nova:user>
Dec 05 12:06:18 compute-0 nova_compute[187208]:         <nova:project uuid="5285f99befb24ac285be8e4fc1d18e69">tempest-AttachInterfacesUnderV243Test-1358924829</nova:project>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:06:18 compute-0 nova_compute[187208]:         <nova:port uuid="88c7b630-e84b-4a35-8c8f-f934e7cabaf6">
Dec 05 12:06:18 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <system>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <entry name="serial">bcdca3f9-3e24-4209-808c-8093b55e5c2d</entry>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <entry name="uuid">bcdca3f9-3e24-4209-808c-8093b55e5c2d</entry>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     </system>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <os>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   </os>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <features>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   </features>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.config"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:bb:19:b7"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <target dev="tap88c7b630-e8"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/console.log" append="off"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <video>
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     </video>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:06:18 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:06:18 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:06:18 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:06:18 compute-0 nova_compute[187208]: </domain>
Dec 05 12:06:18 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.377 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Preparing to wait for external event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.378 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.378 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.378 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.379 187212 DEBUG nova.virt.libvirt.vif [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-2105634627',display_name='tempest-AttachInterfacesUnderV243Test-server-2105634627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-2105634627',id=61,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNE7kQOo1iw7msO5U3UKQiYNUNOuR3489N27cA8/7AyK9hUMINDB4EKPtuAqKWiOpLa6/9d1/JcrFvBfelk3gje2Ue6XSif/X6uD8HtKgekiyZF9ENjW4HKYytyiU96vgQ==',key_name='tempest-keypair-865071651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5285f99befb24ac285be8e4fc1d18e69',ramdisk_id='',reservation_id='r-93zclce8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1358924829',owner_user_name='tempest-AttachInterfacesUnderV243Test-1358924829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6b73160d333a43ed94d4258262e3c2b5',uuid=bcdca3f9-3e24-4209-808c-8093b55e5c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.379 187212 DEBUG nova.network.os_vif_util [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converting VIF {"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.379 187212 DEBUG nova.network.os_vif_util [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.380 187212 DEBUG os_vif [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.380 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.380 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.380 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.382 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.382 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88c7b630-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.383 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap88c7b630-e8, col_values=(('external_ids', {'iface-id': '88c7b630-e84b-4a35-8c8f-f934e7cabaf6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:19:b7', 'vm-uuid': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.384 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 NetworkManager[55691]: <info>  [1764936378.3855] manager: (tap88c7b630-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.386 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.390 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.390 187212 INFO os_vif [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8')
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.396 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.397 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.413 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fae879e6-1f58-428f-abc8-d73bcde91af1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.415 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6ce614-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.416 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.416 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc6ce614-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.418 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:06:18 compute-0 NetworkManager[55691]: <info>  [1764936378.4199] manager: (tapfc6ce614-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Dec 05 12:06:18 compute-0 kernel: tapfc6ce614-d0: entered promiscuous mode
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.421 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.423 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.427 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc6ce614-d0, col_values=(('external_ids', {'iface-id': '1b193bb7-c39e-445c-9a2c-dd8ee58553b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:18 compute-0 ovn_controller[95610]: 2025-12-05T12:06:18Z|00478|binding|INFO|Releasing lport 1b193bb7-c39e-445c-9a2c-dd8ee58553b9 from this chassis (sb_readonly=0)
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.428 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.440 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.445 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.446 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.447 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[26fb0e5b-6fe5-4cba-b660-d2a4adfdcdd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.447 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:06:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.448 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'env', 'PROCESS_TAG=haproxy-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fc6ce614-d0f7-413f-bc3e-26f7271993d9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.470 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.471 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.471 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] No VIF found with MAC fa:16:3e:bb:19:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.471 187212 INFO nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Using config drive
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.501 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.501 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.518 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.518 187212 INFO nova.compute.claims [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.705 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936378.7054462, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.706 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Started (Lifecycle Event)
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.731 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.735 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936378.7056284, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.735 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Paused (Lifecycle Event)
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.737 187212 DEBUG nova.compute.provider_tree [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.759 187212 DEBUG nova.scheduler.client.report [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.763 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.766 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.785 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.786 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.789 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.836 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.836 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:06:18 compute-0 podman[226418]: 2025-12-05 12:06:18.838783623 +0000 UTC m=+0.056644740 container create 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.855 187212 INFO nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.872 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:06:18 compute-0 systemd[1]: Started libpod-conmon-161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c.scope.
Dec 05 12:06:18 compute-0 podman[226418]: 2025-12-05 12:06:18.803712325 +0000 UTC m=+0.021573472 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:06:18 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:06:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f29ee6b8791a73cdc634bfdec56c186b809e5d9ad3d5a603996e4c487e56a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:06:18 compute-0 podman[226418]: 2025-12-05 12:06:18.938363016 +0000 UTC m=+0.156224153 container init 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:06:18 compute-0 podman[226418]: 2025-12-05 12:06:18.944132271 +0000 UTC m=+0.161993388 container start 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 12:06:18 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [NOTICE]   (226438) : New worker (226440) forked
Dec 05 12:06:18 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [NOTICE]   (226438) : Loading success.
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.975 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.976 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.977 187212 INFO nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Creating image(s)
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.977 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.978 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.978 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:18 compute-0 nova_compute[187208]: 2025-12-05 12:06:18.991 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.047 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.048 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.049 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.061 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.084 187212 INFO nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Creating config drive at /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.config
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.092 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzxlj9h5z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.126 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot image upload complete
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.129 187212 INFO nova.compute.manager [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 5.48 seconds to snapshot the instance on the hypervisor.
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.133 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.133 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.172 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.173 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.173 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.200 187212 DEBUG nova.policy [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.225 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzxlj9h5z" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.231 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.233 187212 DEBUG nova.virt.disk.api [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Checking if we can resize image /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.233 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.297 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.298 187212 DEBUG nova.virt.disk.api [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Cannot resize image /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.299 187212 DEBUG nova.objects.instance [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'migration_context' on Instance uuid e9f9bf08-7688-4213-91ff-74f2271ec71d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:19 compute-0 kernel: tap88c7b630-e8: entered promiscuous mode
Dec 05 12:06:19 compute-0 NetworkManager[55691]: <info>  [1764936379.3116] manager: (tap88c7b630-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Dec 05 12:06:19 compute-0 ovn_controller[95610]: 2025-12-05T12:06:19Z|00479|binding|INFO|Claiming lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 for this chassis.
Dec 05 12:06:19 compute-0 ovn_controller[95610]: 2025-12-05T12:06:19Z|00480|binding|INFO|88c7b630-e84b-4a35-8c8f-f934e7cabaf6: Claiming fa:16:3e:bb:19:b7 10.100.0.7
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.314 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:19 compute-0 systemd-udevd[226357]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.316 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.316 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Ensure instance console log exists: /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.317 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.317 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.317 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.322 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:19:b7 10.100.0.7'], port_security=['fa:16:3e:bb:19:b7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0566af06-3837-49db-a95c-47b9857e4e90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5285f99befb24ac285be8e4fc1d18e69', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a5c5fedc-8874-4d17-85d6-f832393ee546', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b689627-4043-49f3-b45a-0160a35a0a18, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=88c7b630-e84b-4a35-8c8f-f934e7cabaf6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.323 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 in datapath 0566af06-3837-49db-a95c-47b9857e4e90 bound to our chassis
Dec 05 12:06:19 compute-0 ovn_controller[95610]: 2025-12-05T12:06:19Z|00481|binding|INFO|Setting lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 ovn-installed in OVS
Dec 05 12:06:19 compute-0 ovn_controller[95610]: 2025-12-05T12:06:19Z|00482|binding|INFO|Setting lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 up in Southbound
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.326 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0566af06-3837-49db-a95c-47b9857e4e90
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.326 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:19 compute-0 NetworkManager[55691]: <info>  [1764936379.3339] device (tap88c7b630-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:06:19 compute-0 NetworkManager[55691]: <info>  [1764936379.3349] device (tap88c7b630-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.337 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d5b403-9c27-4c11-9b1e-15f5b7da8c9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.337 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0566af06-31 in ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.339 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0566af06-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.339 187212 DEBUG nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.340 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e875b99e-e9d4-41c3-b15e-8aa07b6f91d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.340 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.340 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.340 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.341 187212 DEBUG nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Processing event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.341 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[077ddd6b-f96a-4477-84f6-a83390c150b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.341 187212 DEBUG nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.341 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.341 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.342 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.342 187212 DEBUG nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.342 187212 WARNING nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 for instance with vm_state building and task_state spawning.
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.343 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.344 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.353 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[f08ff02a-b9db-47ce-9d1e-5e3716e75599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.358 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936379.3576407, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.359 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] VM Resumed (Lifecycle Event)
Dec 05 12:06:19 compute-0 systemd-machined[153543]: New machine qemu-66-instance-0000003d.
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.365 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:06:19 compute-0 systemd[1]: Started Virtual Machine qemu-66-instance-0000003d.
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.369 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[53eca702-e6fc-47e0-ac1b-9ca4abed9026]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.372 187212 INFO nova.virt.libvirt.driver [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance spawned successfully.
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.373 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.389 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.396 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.401 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.401 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.401 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.401 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[16118044-a113-4b53-bbbb-3d7d47684d8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.403 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.403 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.403 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:19 compute-0 NetworkManager[55691]: <info>  [1764936379.4122] manager: (tap0566af06-30): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.409 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[da376776-7135-4a9f-a08c-e6f93f63d831]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.429 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.432 187212 DEBUG nova.compute.manager [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found 1 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.445 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[35060fb1-4e7e-4b86-8e89-2fe96f356020]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.450 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ada9f199-9492-49df-9bb9-027f549c1063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 podman[226473]: 2025-12-05 12:06:19.456171302 +0000 UTC m=+0.152883526 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.468 187212 INFO nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Took 14.43 seconds to spawn the instance on the hypervisor.
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.468 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:19 compute-0 NetworkManager[55691]: <info>  [1764936379.4789] device (tap0566af06-30): carrier: link connected
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.491 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[493ee2d1-06fe-40e3-8e2c-3c0c4a2e4047]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.508 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3615ea8b-3c17-4601-96e3-b5d16ec709b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0566af06-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:cb:64'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376004, 'reachable_time': 18575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226519, 'error': None, 'target': 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.523 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3234b35d-9c5f-48cd-996f-cfc62630dc06]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee2:cb64'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376004, 'tstamp': 376004}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226520, 'error': None, 'target': 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.537 187212 INFO nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Took 15.06 seconds to build instance.
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.543 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78a813ad-1a66-4202-97a6-84dfaa85abff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0566af06-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:cb:64'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376004, 'reachable_time': 18575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226521, 'error': None, 'target': 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.556 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.573 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b331aec8-4c66-4317-a102-89f1cfbc4f81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.637 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62dd7de7-022a-471c-a4c9-9f566a23909a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.639 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0566af06-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.639 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.639 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0566af06-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:19 compute-0 NetworkManager[55691]: <info>  [1764936379.6422] manager: (tap0566af06-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Dec 05 12:06:19 compute-0 kernel: tap0566af06-30: entered promiscuous mode
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.645 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.646 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0566af06-30, col_values=(('external_ids', {'iface-id': '08ca2eb6-40e5-4c40-8c65-26542a1d3b4d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.647 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:19 compute-0 ovn_controller[95610]: 2025-12-05T12:06:19Z|00483|binding|INFO|Releasing lport 08ca2eb6-40e5-4c40-8c65-26542a1d3b4d from this chassis (sb_readonly=0)
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.654 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936379.6542509, bcdca3f9-3e24-4209-808c-8093b55e5c2d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.655 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] VM Started (Lifecycle Event)
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.660 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.663 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.663 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0566af06-3837-49db-a95c-47b9857e4e90.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0566af06-3837-49db-a95c-47b9857e4e90.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.664 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aa396f47-89c3-4220-9838-345a494e8efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.665 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-0566af06-3837-49db-a95c-47b9857e4e90
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/0566af06-3837-49db-a95c-47b9857e4e90.pid.haproxy
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 0566af06-3837-49db-a95c-47b9857e4e90
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:06:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.666 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'env', 'PROCESS_TAG=haproxy-0566af06-3837-49db-a95c-47b9857e4e90', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0566af06-3837-49db-a95c-47b9857e4e90.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.672 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.677 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936379.6544235, bcdca3f9-3e24-4209-808c-8093b55e5c2d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.677 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] VM Paused (Lifecycle Event)
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.694 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.699 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:19 compute-0 nova_compute[187208]: 2025-12-05 12:06:19.725 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:20 compute-0 podman[226559]: 2025-12-05 12:06:20.076535486 +0000 UTC m=+0.069404756 container create abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:06:20 compute-0 systemd[1]: Started libpod-conmon-abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4.scope.
Dec 05 12:06:20 compute-0 podman[226559]: 2025-12-05 12:06:20.032758468 +0000 UTC m=+0.025627768 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:06:20 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:06:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b07d63045405ef887718f7278761beb53f95ff32a7c0a77fd47c0591ae50b1a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:06:20 compute-0 podman[226559]: 2025-12-05 12:06:20.159512642 +0000 UTC m=+0.152381942 container init abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:06:20 compute-0 podman[226559]: 2025-12-05 12:06:20.166527103 +0000 UTC m=+0.159396373 container start abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 05 12:06:20 compute-0 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [NOTICE]   (226578) : New worker (226580) forked
Dec 05 12:06:20 compute-0 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [NOTICE]   (226578) : Loading success.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.679 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.679 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.679 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Processing event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] No waiting events found dispatching network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 WARNING nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received unexpected event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed for instance with vm_state building and task_state spawning.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Processing event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.683 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.683 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.683 187212 WARNING nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state building and task_state spawning.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.684 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance event wait completed in 16 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.684 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.698 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936380.6972528, 297d72ef-6b79-45b3-813b-52b5144b522e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.699 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] VM Resumed (Lifecycle Event)
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.702 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.703 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.708 187212 INFO nova.virt.libvirt.driver [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance spawned successfully.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.709 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.711 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance spawned successfully.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.711 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.725 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.732 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.739 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.739 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.740 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.740 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.741 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.741 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.747 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.747 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.748 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.748 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.749 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.749 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.754 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.754 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936380.6976998, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.754 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Resumed (Lifecycle Event)
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.792 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.795 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.821 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.826 187212 INFO nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Took 9.21 seconds to spawn the instance on the hypervisor.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.827 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.830 187212 INFO nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Took 24.29 seconds to spawn the instance on the hypervisor.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.830 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.916 187212 INFO nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Took 9.78 seconds to build instance.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.921 187212 INFO nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Took 24.82 seconds to build instance.
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.939 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:20 compute-0 nova_compute[187208]: 2025-12-05 12:06:20.940 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.898s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:21 compute-0 nova_compute[187208]: 2025-12-05 12:06:21.158 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Successfully created port: 48b30c48-7858-408b-aeab-df46f6277546 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:06:21 compute-0 nova_compute[187208]: 2025-12-05 12:06:21.375 187212 DEBUG nova.network.neutron [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updated VIF entry in instance network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:21 compute-0 nova_compute[187208]: 2025-12-05 12:06:21.376 187212 DEBUG nova.network.neutron [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:21 compute-0 nova_compute[187208]: 2025-12-05 12:06:21.394 187212 DEBUG oslo_concurrency.lockutils [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:21 compute-0 nova_compute[187208]: 2025-12-05 12:06:21.558 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.329 187212 DEBUG nova.compute.manager [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.370 187212 INFO nova.compute.manager [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] instance snapshotting
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.372 187212 DEBUG nova.objects.instance [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'flavor' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.479 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Successfully updated port: 48b30c48-7858-408b-aeab-df46f6277546 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.493 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.494 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquired lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.494 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.624 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.635 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Beginning live snapshot process
Dec 05 12:06:22 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.767 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.827 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.829 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.884 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.902 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.966 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.970 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:22 compute-0 nova_compute[187208]: 2025-12-05 12:06:22.971 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.166 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07.delta 1073741824" returned: 0 in 0.195s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.168 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.224 187212 DEBUG nova.virt.libvirt.guest [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.311 187212 DEBUG nova.compute.manager [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-changed-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.312 187212 DEBUG nova.compute.manager [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing instance network info cache due to event network-changed-48b30c48-7858-408b-aeab-df46f6277546. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.313 187212 DEBUG oslo_concurrency.lockutils [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.385 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.727 187212 DEBUG nova.virt.libvirt.guest [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.731 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.770 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.774 187212 DEBUG nova.privsep.utils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.775 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07.delta /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.800 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Releasing lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.801 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance network_info: |[{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.802 187212 DEBUG oslo_concurrency.lockutils [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.802 187212 DEBUG nova.network.neutron [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing network info cache for port 48b30c48-7858-408b-aeab-df46f6277546 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.805 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start _get_guest_xml network_info=[{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.809 187212 WARNING nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.816 187212 DEBUG nova.virt.libvirt.host [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.817 187212 DEBUG nova.virt.libvirt.host [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.825 187212 DEBUG nova.virt.libvirt.host [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.826 187212 DEBUG nova.virt.libvirt.host [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.826 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.826 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.827 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.827 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.828 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.828 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.828 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.829 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.829 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.829 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.830 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.830 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.838 187212 DEBUG nova.virt.libvirt.vif [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1685847021',display_name='tempest-SecurityGroupsTestJSON-server-1685847021',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1685847021',id=63,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-52243xe8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:18Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=e9f9bf08-7688-4213-91ff-74f2271ec71d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.839 187212 DEBUG nova.network.os_vif_util [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.841 187212 DEBUG nova.network.os_vif_util [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:23 compute-0 nova_compute[187208]: 2025-12-05 12:06:23.842 187212 DEBUG nova.objects.instance [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'pci_devices' on Instance uuid e9f9bf08-7688-4213-91ff-74f2271ec71d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:24 compute-0 nova_compute[187208]: 2025-12-05 12:06:24.282 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07.delta /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:24 compute-0 nova_compute[187208]: 2025-12-05 12:06:24.287 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot extracted, beginning image upload
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.229 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <uuid>e9f9bf08-7688-4213-91ff-74f2271ec71d</uuid>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <name>instance-0000003f</name>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <nova:name>tempest-SecurityGroupsTestJSON-server-1685847021</nova:name>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:06:23</nova:creationTime>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:06:25 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:06:25 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:06:25 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:06:25 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:06:25 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:06:25 compute-0 nova_compute[187208]:         <nova:user uuid="8db061f8c48141d1ac1c3216db1cc7f8">tempest-SecurityGroupsTestJSON-549628149-project-member</nova:user>
Dec 05 12:06:25 compute-0 nova_compute[187208]:         <nova:project uuid="442a804e3368417d9de1636d533a25e0">tempest-SecurityGroupsTestJSON-549628149</nova:project>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:06:25 compute-0 nova_compute[187208]:         <nova:port uuid="48b30c48-7858-408b-aeab-df46f6277546">
Dec 05 12:06:25 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <system>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <entry name="serial">e9f9bf08-7688-4213-91ff-74f2271ec71d</entry>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <entry name="uuid">e9f9bf08-7688-4213-91ff-74f2271ec71d</entry>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     </system>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <os>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   </os>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <features>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   </features>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.config"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:62:bb:58"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <target dev="tap48b30c48-78"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/console.log" append="off"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <video>
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     </video>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:06:25 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:06:25 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:06:25 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:06:25 compute-0 nova_compute[187208]: </domain>
Dec 05 12:06:25 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.235 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Preparing to wait for external event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.235 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.236 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.236 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.237 187212 DEBUG nova.virt.libvirt.vif [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1685847021',display_name='tempest-SecurityGroupsTestJSON-server-1685847021',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1685847021',id=63,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-52243xe8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:18Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=e9f9bf08-7688-4213-91ff-74f2271ec71d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.237 187212 DEBUG nova.network.os_vif_util [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.238 187212 DEBUG nova.network.os_vif_util [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.239 187212 DEBUG os_vif [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.240 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.240 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.243 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.243 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48b30c48-78, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.244 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap48b30c48-78, col_values=(('external_ids', {'iface-id': '48b30c48-7858-408b-aeab-df46f6277546', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:bb:58', 'vm-uuid': 'e9f9bf08-7688-4213-91ff-74f2271ec71d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.245 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:25 compute-0 NetworkManager[55691]: <info>  [1764936385.2467] manager: (tap48b30c48-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.248 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.253 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.254 187212 INFO os_vif [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78')
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.309 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.309 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.310 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No VIF found with MAC fa:16:3e:62:bb:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.310 187212 INFO nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Using config drive
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.940 187212 INFO nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Creating config drive at /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.config
Dec 05 12:06:25 compute-0 nova_compute[187208]: 2025-12-05 12:06:25.945 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgs9wzbdu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.075 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgs9wzbdu" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:26 compute-0 kernel: tap48b30c48-78: entered promiscuous mode
Dec 05 12:06:26 compute-0 NetworkManager[55691]: <info>  [1764936386.1348] manager: (tap48b30c48-78): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Dec 05 12:06:26 compute-0 ovn_controller[95610]: 2025-12-05T12:06:26Z|00484|binding|INFO|Claiming lport 48b30c48-7858-408b-aeab-df46f6277546 for this chassis.
Dec 05 12:06:26 compute-0 ovn_controller[95610]: 2025-12-05T12:06:26Z|00485|binding|INFO|48b30c48-7858-408b-aeab-df46f6277546: Claiming fa:16:3e:62:bb:58 10.100.0.8
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.136 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.143 187212 DEBUG nova.compute.manager [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.144 187212 DEBUG oslo_concurrency.lockutils [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.144 187212 DEBUG oslo_concurrency.lockutils [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.144 187212 DEBUG oslo_concurrency.lockutils [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.144 187212 DEBUG nova.compute.manager [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Processing event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.145 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:06:26 compute-0 ovn_controller[95610]: 2025-12-05T12:06:26Z|00486|binding|INFO|Setting lport 48b30c48-7858-408b-aeab-df46f6277546 ovn-installed in OVS
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.152 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936386.1508393, bcdca3f9-3e24-4209-808c-8093b55e5c2d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.152 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] VM Resumed (Lifecycle Event)
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.155 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.157 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:06:26 compute-0 systemd-udevd[226638]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:26 compute-0 NetworkManager[55691]: <info>  [1764936386.1757] device (tap48b30c48-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:06:26 compute-0 NetworkManager[55691]: <info>  [1764936386.1763] device (tap48b30c48-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.179 187212 INFO nova.virt.libvirt.driver [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance spawned successfully.
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.180 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:06:26 compute-0 systemd-machined[153543]: New machine qemu-67-instance-0000003f.
Dec 05 12:06:26 compute-0 systemd[1]: Started Virtual Machine qemu-67-instance-0000003f.
Dec 05 12:06:26 compute-0 ovn_controller[95610]: 2025-12-05T12:06:26Z|00487|binding|INFO|Setting lport 48b30c48-7858-408b-aeab-df46f6277546 up in Southbound
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.207 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bb:58 10.100.0.8'], port_security=['fa:16:3e:62:bb:58 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=48b30c48-7858-408b-aeab-df46f6277546) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.209 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 48b30c48-7858-408b-aeab-df46f6277546 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 bound to our chassis
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.211 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.224 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0607a6-e73e-49c0-9036-61a1c33505cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.225 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd355bd0-51 in ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.228 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.228 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd355bd0-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.229 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a6316c00-0ee9-4900-a51f-95efaa27aaeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.233 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a38dce-7909-4086-9169-9f4a5268f56f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.234 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.240 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.240 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.241 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.241 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.242 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.242 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.246 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7538f2b1-64af-4931-a288-6f3aee96c7b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.271 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.284 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[81e4b2f4-defc-450b-8aa7-aaaceb16e479]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.317 187212 INFO nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Took 18.88 seconds to spawn the instance on the hypervisor.
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.318 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.330 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6627e29c-a548-4b4e-a737-be263b6a7349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.336 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[63754ac1-8382-4c73-ab7e-ecc38790081b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 NetworkManager[55691]: <info>  [1764936386.3394] manager: (tapdd355bd0-50): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.381 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2332bc-2b2a-4780-a58d-c9a3e76d9d22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 podman[226651]: 2025-12-05 12:06:26.385411255 +0000 UTC m=+0.107037479 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.386 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0034bd26-61e8-4dac-b558-4f8f4938800d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 podman[226652]: 2025-12-05 12:06:26.41306148 +0000 UTC m=+0.135572019 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:06:26 compute-0 NetworkManager[55691]: <info>  [1764936386.4149] device (tapdd355bd0-50): carrier: link connected
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.420 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4446d0e6-0f9e-403e-af44-807dd8b74f1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.443 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7da0cf96-9f7e-4824-9059-bfaec4896e90]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226709, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.457 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72050fbc-f1f6-48dc-8d21-7cb2d1945b65]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:3ad'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376697, 'tstamp': 376697}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226710, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.474 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5f63e2-b7fc-4e60-b29a-faf011b26c2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226711, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.508 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e20e0f-9a67-4745-9e4b-a2162289689f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.577 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7eaf48fa-616c-4a5f-9079-c9f58f8cb867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.584 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:26 compute-0 NetworkManager[55691]: <info>  [1764936386.5866] manager: (tapdd355bd0-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Dec 05 12:06:26 compute-0 kernel: tapdd355bd0-50: entered promiscuous mode
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.589 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:26 compute-0 ovn_controller[95610]: 2025-12-05T12:06:26Z|00488|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.592 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd355bd0-560e-4b18-a504-3a5134c930f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd355bd0-560e-4b18-a504-3a5134c930f4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.593 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92c58665-f8f7-427e-9df2-cd285ea698e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.594 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-dd355bd0-560e-4b18-a504-3a5134c930f4
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/dd355bd0-560e-4b18-a504-3a5134c930f4.pid.haproxy
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID dd355bd0-560e-4b18-a504-3a5134c930f4
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:06:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.594 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'env', 'PROCESS_TAG=haproxy-dd355bd0-560e-4b18-a504-3a5134c930f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd355bd0-560e-4b18-a504-3a5134c930f4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.585 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.609 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:26 compute-0 nova_compute[187208]: 2025-12-05 12:06:26.707 187212 INFO nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Took 19.82 seconds to build instance.
Dec 05 12:06:27 compute-0 nova_compute[187208]: 2025-12-05 12:06:27.045 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:27 compute-0 podman[226741]: 2025-12-05 12:06:27.071770616 +0000 UTC m=+0.120104943 container create 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:06:27 compute-0 podman[226741]: 2025-12-05 12:06:26.990574742 +0000 UTC m=+0.038909089 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:06:27 compute-0 systemd[1]: Started libpod-conmon-3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3.scope.
Dec 05 12:06:27 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:06:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243fd1c5de96a14826aa2f40632d2c7e7d72bd7fcfdbb36dbcf9215a94d0a31f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:06:27 compute-0 podman[226741]: 2025-12-05 12:06:27.174628093 +0000 UTC m=+0.222962440 container init 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:06:27 compute-0 podman[226741]: 2025-12-05 12:06:27.182816539 +0000 UTC m=+0.231150866 container start 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:06:27 compute-0 nova_compute[187208]: 2025-12-05 12:06:27.223 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936387.2226713, e9f9bf08-7688-4213-91ff-74f2271ec71d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:27 compute-0 nova_compute[187208]: 2025-12-05 12:06:27.223 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] VM Started (Lifecycle Event)
Dec 05 12:06:27 compute-0 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [NOTICE]   (226767) : New worker (226769) forked
Dec 05 12:06:27 compute-0 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [NOTICE]   (226767) : Loading success.
Dec 05 12:06:27 compute-0 nova_compute[187208]: 2025-12-05 12:06:27.626 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:28 compute-0 nova_compute[187208]: 2025-12-05 12:06:28.041 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:28 compute-0 nova_compute[187208]: 2025-12-05 12:06:28.045 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936387.2227778, e9f9bf08-7688-4213-91ff-74f2271ec71d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:28 compute-0 nova_compute[187208]: 2025-12-05 12:06:28.046 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] VM Paused (Lifecycle Event)
Dec 05 12:06:28 compute-0 nova_compute[187208]: 2025-12-05 12:06:28.413 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:28 compute-0 nova_compute[187208]: 2025-12-05 12:06:28.417 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:28 compute-0 nova_compute[187208]: 2025-12-05 12:06:28.698 187212 DEBUG nova.network.neutron [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updated VIF entry in instance network info cache for port 48b30c48-7858-408b-aeab-df46f6277546. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:28 compute-0 nova_compute[187208]: 2025-12-05 12:06:28.699 187212 DEBUG nova.network.neutron [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:28 compute-0 nova_compute[187208]: 2025-12-05 12:06:28.846 187212 DEBUG oslo_concurrency.lockutils [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:28 compute-0 nova_compute[187208]: 2025-12-05 12:06:28.848 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.213 187212 DEBUG nova.compute.manager [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-changed-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.214 187212 DEBUG nova.compute.manager [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing instance network info cache due to event network-changed-2064bfa7-125e-466c-9365-6c0ec6655113. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.215 187212 DEBUG oslo_concurrency.lockutils [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.215 187212 DEBUG oslo_concurrency.lockutils [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.215 187212 DEBUG nova.network.neutron [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing network info cache for port 2064bfa7-125e-466c-9365-6c0ec6655113 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.246 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.452 187212 DEBUG nova.compute.manager [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.453 187212 DEBUG oslo_concurrency.lockutils [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.454 187212 DEBUG oslo_concurrency.lockutils [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.454 187212 DEBUG oslo_concurrency.lockutils [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.454 187212 DEBUG nova.compute.manager [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] No waiting events found dispatching network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:30 compute-0 nova_compute[187208]: 2025-12-05 12:06:30.455 187212 WARNING nova.compute.manager [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received unexpected event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 for instance with vm_state active and task_state None.
Dec 05 12:06:31 compute-0 nova_compute[187208]: 2025-12-05 12:06:31.517 187212 DEBUG oslo_concurrency.lockutils [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:31 compute-0 nova_compute[187208]: 2025-12-05 12:06:31.518 187212 DEBUG oslo_concurrency.lockutils [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:31 compute-0 nova_compute[187208]: 2025-12-05 12:06:31.518 187212 DEBUG nova.compute.manager [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:31 compute-0 nova_compute[187208]: 2025-12-05 12:06:31.522 187212 DEBUG nova.compute.manager [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Dec 05 12:06:31 compute-0 nova_compute[187208]: 2025-12-05 12:06:31.523 187212 DEBUG nova.objects.instance [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'flavor' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:31 compute-0 nova_compute[187208]: 2025-12-05 12:06:31.548 187212 DEBUG nova.virt.libvirt.driver [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:06:32 compute-0 nova_compute[187208]: 2025-12-05 12:06:32.032 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot image upload complete
Dec 05 12:06:32 compute-0 nova_compute[187208]: 2025-12-05 12:06:32.033 187212 INFO nova.compute.manager [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 9.63 seconds to snapshot the instance on the hypervisor.
Dec 05 12:06:32 compute-0 nova_compute[187208]: 2025-12-05 12:06:32.136 187212 DEBUG nova.network.neutron [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updated VIF entry in instance network info cache for port 2064bfa7-125e-466c-9365-6c0ec6655113. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:32 compute-0 nova_compute[187208]: 2025-12-05 12:06:32.137 187212 DEBUG nova.network.neutron [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:32 compute-0 nova_compute[187208]: 2025-12-05 12:06:32.159 187212 DEBUG oslo_concurrency.lockutils [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:32 compute-0 podman[226799]: 2025-12-05 12:06:32.232038453 +0000 UTC m=+0.086829810 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:06:32 compute-0 podman[226800]: 2025-12-05 12:06:32.250044476 +0000 UTC m=+0.105079291 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:06:32 compute-0 nova_compute[187208]: 2025-12-05 12:06:32.289 187212 DEBUG nova.compute.manager [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found 2 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Dec 05 12:06:32 compute-0 nova_compute[187208]: 2025-12-05 12:06:32.628 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:32 compute-0 ovn_controller[95610]: 2025-12-05T12:06:32Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:68:b7 10.100.0.12
Dec 05 12:06:32 compute-0 ovn_controller[95610]: 2025-12-05T12:06:32Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:68:b7 10.100.0.12
Dec 05 12:06:32 compute-0 ovn_controller[95610]: 2025-12-05T12:06:32Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a6:47:26 10.100.0.9
Dec 05 12:06:32 compute-0 ovn_controller[95610]: 2025-12-05T12:06:32Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a6:47:26 10.100.0.9
Dec 05 12:06:33 compute-0 nova_compute[187208]: 2025-12-05 12:06:33.680 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:33 compute-0 nova_compute[187208]: 2025-12-05 12:06:33.680 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:33 compute-0 nova_compute[187208]: 2025-12-05 12:06:33.695 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:06:33 compute-0 kernel: tap549318e9-e6 (unregistering): left promiscuous mode
Dec 05 12:06:33 compute-0 NetworkManager[55691]: <info>  [1764936393.7578] device (tap549318e9-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:06:33 compute-0 nova_compute[187208]: 2025-12-05 12:06:33.769 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:33 compute-0 nova_compute[187208]: 2025-12-05 12:06:33.770 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:33 compute-0 nova_compute[187208]: 2025-12-05 12:06:33.778 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:06:33 compute-0 nova_compute[187208]: 2025-12-05 12:06:33.779 187212 INFO nova.compute.claims [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:06:33 compute-0 ovn_controller[95610]: 2025-12-05T12:06:33Z|00489|binding|INFO|Releasing lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 from this chassis (sb_readonly=0)
Dec 05 12:06:33 compute-0 ovn_controller[95610]: 2025-12-05T12:06:33Z|00490|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 down in Southbound
Dec 05 12:06:33 compute-0 ovn_controller[95610]: 2025-12-05T12:06:33Z|00491|binding|INFO|Removing iface tap549318e9-e6 ovn-installed in OVS
Dec 05 12:06:33 compute-0 nova_compute[187208]: 2025-12-05 12:06:33.956 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:33.965 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:33.966 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis
Dec 05 12:06:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:33.968 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:06:33 compute-0 nova_compute[187208]: 2025-12-05 12:06:33.969 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:33.989 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7097bb31-9163-4b4e-a712-b3dbd3f187f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:33 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000038.scope: Deactivated successfully.
Dec 05 12:06:33 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000038.scope: Consumed 14.503s CPU time.
Dec 05 12:06:33 compute-0 systemd-machined[153543]: Machine qemu-60-instance-00000038 terminated.
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.022 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[488531f2-73f9-4ef1-9582-5c7e92db95ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.025 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e0102e0e-0399-4f8d-9ce6-2ab4712fff5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.043 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.043 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.044 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.044 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Processing event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] No waiting events found dispatching network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 WARNING nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received unexpected event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 for instance with vm_state building and task_state spawning.
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.047 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.047 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.048 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.051 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936394.0514553, e9f9bf08-7688-4213-91ff-74f2271ec71d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.051 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] VM Resumed (Lifecycle Event)
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.054 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.055 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0832c860-da92-4076-a30e-953f7dc0df31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.071 187212 INFO nova.virt.libvirt.driver [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance spawned successfully.
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.072 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.088 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.092 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.100 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.100 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.100 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.101 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.101 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.092 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1d3aa2d8-80bf-4af8-a9ac-900fa6c37abd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226889, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.101 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.118 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad39cb6-dca0-450a-a268-d43085e329cf]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226890, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226890, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.120 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.121 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.125 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.125 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.126 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.126 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.126 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.130 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.132 187212 DEBUG nova.compute.provider_tree [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.155 187212 DEBUG nova.scheduler.client.report [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.168 187212 INFO nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Took 15.19 seconds to spawn the instance on the hypervisor.
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.168 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.177 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.407s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.177 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:06:34 compute-0 systemd-udevd[226881]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:34 compute-0 kernel: tap549318e9-e6: entered promiscuous mode
Dec 05 12:06:34 compute-0 NetworkManager[55691]: <info>  [1764936394.1856] manager: (tap549318e9-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.188 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00492|binding|INFO|Claiming lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 for this chassis.
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00493|binding|INFO|549318e9-e629-4e2c-8cbb-3cd263c2bc34: Claiming fa:16:3e:9b:d7:ed 10.100.0.9
Dec 05 12:06:34 compute-0 kernel: tap549318e9-e6 (unregistering): left promiscuous mode
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.201 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.203 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.207 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00494|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 ovn-installed in OVS
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00495|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 up in Southbound
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00496|binding|INFO|Releasing lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 from this chassis (sb_readonly=1)
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00497|if_status|INFO|Dropped 5 log messages in last 88 seconds (most recently, 88 seconds ago) due to excessive rate
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00498|if_status|INFO|Not setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 down as sb is readonly
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.220 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00499|binding|INFO|Removing iface tap549318e9-e6 ovn-installed in OVS
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.222 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c40efcd-56c3-45ca-996d-a856ae9cfbbe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00500|binding|INFO|Releasing lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 from this chassis (sb_readonly=0)
Dec 05 12:06:34 compute-0 ovn_controller[95610]: 2025-12-05T12:06:34Z|00501|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 down in Southbound
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.231 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.258 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c34f156f-5c4e-411e-ac2d-79b34898f457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.262 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0f1d7f5a-56ef-410c-bc00-f8374ba86bda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.279 187212 INFO nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Took 15.80 seconds to build instance.
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.280 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.281 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.304 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f90f95-3dc2-46f8-b43a-4a5b66afb232]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.306 187212 INFO nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.309 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.912s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.320 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2112ddc-f8fa-4e04-97c4-73a9045e409f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226907, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.327 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.356 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0397192e-0598-4948-a3b2-8534ebb15987]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226908, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226908, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.357 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.359 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.364 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.364 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.365 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.366 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.366 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.368 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.372 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.387 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5c8000-edd5-41b8-b663-68b0ca9618b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.418 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d25d2ddc-96ae-4cbb-b1cf-5599038892b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.423 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[52f6c2b0-d524-4ea7-a85b-f9105ad44d3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.440 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.441 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.442 187212 INFO nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Creating image(s)
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.442 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.443 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.443 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.451 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a04ccb74-fd5b-4ce6-9abe-711c99c0c225]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.463 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.481 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f9764a8c-0b5d-4128-aefc-d49f7112232d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226926, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.494 187212 DEBUG nova.policy [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '077bcce844cb42a197dcd6100549b7d3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dc1fd38e325f4a2caa75aeab79da75d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.497 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91157c58-c0d5-466c-9517-128c1394ca18]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226928, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226928, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.499 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.502 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.505 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.505 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.506 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.506 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.507 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.538 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.539 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.539 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.551 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.591 187212 DEBUG nova.compute.manager [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.591 187212 DEBUG nova.compute.manager [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing instance network info cache due to event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.593 187212 DEBUG oslo_concurrency.lockutils [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.593 187212 DEBUG oslo_concurrency.lockutils [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.593 187212 DEBUG nova.network.neutron [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.598 187212 INFO nova.virt.libvirt.driver [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance shutdown successfully after 3 seconds.
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.606 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance destroyed successfully.
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.607 187212 DEBUG nova.objects.instance [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'numa_topology' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.620 187212 DEBUG nova.compute.manager [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.631 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.632 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:34 compute-0 nova_compute[187208]: 2025-12-05 12:06:34.671 187212 DEBUG oslo_concurrency.lockutils [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.247 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.302 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk 1073741824" returned: 0 in 0.669s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.314 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.775s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.315 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:35.362 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0d7840929940419ee8ea88b703037ee31cfdee552fea31f4c91af9a9732801d7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.380 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.381 187212 DEBUG nova.virt.disk.api [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Checking if we can resize image /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.382 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.458 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.460 187212 DEBUG nova.virt.disk.api [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Cannot resize image /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.461 187212 DEBUG nova.objects.instance [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lazy-loading 'migration_context' on Instance uuid ed00d159-9d70-481e-93be-ea180fea04ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.477 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.477 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Ensure instance console log exists: /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.478 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.478 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.479 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:35 compute-0 nova_compute[187208]: 2025-12-05 12:06:35.640 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Successfully created port: d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.071 187212 DEBUG nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.110 187212 INFO nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] instance snapshotting
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.112 187212 DEBUG nova.objects.instance [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'flavor' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:36.215 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 05 Dec 2025 12:06:35 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-2365d6d9-9806-4799-9de8-d299053ad6df x-openstack-request-id: req-2365d6d9-9806-4799-9de8-d299053ad6df _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 05 12:06:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:36.215 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "09233d41-3279-4f39-ac6e-a21662b4f176", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}]}, {"id": "dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 05 12:06:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:36.216 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-2365d6d9-9806-4799-9de8-d299053ad6df request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 05 12:06:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:36.218 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0d7840929940419ee8ea88b703037ee31cfdee552fea31f4c91af9a9732801d7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 05 12:06:36 compute-0 ovn_controller[95610]: 2025-12-05T12:06:36Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:c5:99 10.100.0.8
Dec 05 12:06:36 compute-0 ovn_controller[95610]: 2025-12-05T12:06:36Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:c5:99 10.100.0.8
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.271 187212 DEBUG nova.network.neutron [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updated VIF entry in instance network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.272 187212 DEBUG nova.network.neutron [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.297 187212 DEBUG oslo_concurrency.lockutils [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.302 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.303 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.319 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.320 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.320 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing instance network info cache due to event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.321 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.321 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.321 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.398 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Beginning live snapshot process
Dec 05 12:06:36 compute-0 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.558 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.624 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.625 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.691 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.710 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.773 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:36 compute-0 nova_compute[187208]: 2025-12-05 12:06:36.775 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.008 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Successfully updated port: d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.031 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.032 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquired lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.032 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.050 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb.delta 1073741824" returned: 0 in 0.274s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.051 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Quiescing instance not available: QEMU guest agent is not enabled.
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.118 187212 DEBUG nova.virt.libvirt.guest [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:06:37 compute-0 podman[226963]: 2025-12-05 12:06:37.154205046 +0000 UTC m=+0.066366007 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.171 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.523 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updated VIF entry in instance network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.523 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.547 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.551 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.551 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.552 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.552 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.552 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.663 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Fri, 05 Dec 2025 12:06:36 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-ace03037-28fd-4f02-b3f3-7eb510a85e5c x-openstack-request-id: req-ace03037-28fd-4f02-b3f3-7eb510a85e5c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.664 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "09233d41-3279-4f39-ac6e-a21662b4f176", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.664 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176 used request id req-ace03037-28fd-4f02-b3f3-7eb510a85e5c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.666 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.667 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8f613c8797e432d96e43223fb7c476d', 'user_id': '4f8149b8192e411a9131b103b25862b6', 'hostId': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.671 187212 DEBUG nova.virt.libvirt.guest [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.671 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6d62df5807554f499d26b5fc77ec8603', 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'hostId': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.674 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'name': 'tempest-ListServerFiltersTestJSON-instance-1365452817', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'e8f613c8797e432d96e43223fb7c476d', 'user_id': '4f8149b8192e411a9131b103b25862b6', 'hostId': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.674 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.677 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000037', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '85037de7275442698e604ee3f6283cbc', 'user_id': '8cf2534e7c394130b675e44ed567401b', 'hostId': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.680 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98681240c47b41cba28d91e1c11fd71f', 'user_id': '242b773b0af24caf814e2a84178332d5', 'hostId': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.683 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '85037de7275442698e604ee3f6283cbc', 'user_id': '8cf2534e7c394130b675e44ed567401b', 'hostId': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.683 187212 DEBUG nova.compute.manager [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-changed-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.683 187212 DEBUG nova.compute.manager [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Refreshing instance network info cache due to event network-changed-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.684 187212 DEBUG oslo_concurrency.lockutils [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.685 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5285f99befb24ac285be8e4fc1d18e69', 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'hostId': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.687 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '442a804e3368417d9de1636d533a25e0', 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'hostId': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.766 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000036', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58cbd93e463049988ccd6d013893e7d6', 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'hostId': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.770 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000039', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8f613c8797e432d96e43223fb7c476d', 'user_id': '4f8149b8192e411a9131b103b25862b6', 'hostId': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.775 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b81bb939-d14f-4a72-b7fe-95fc5d8810a1 / tap5683f8a8-69 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.775 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.783 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5d70ac2d-111f-4e1b-ac26-3e02849b0458 / tapac02dd63-5a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.784 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.786 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.789 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 472c7e2c-bdad-4230-904b-6937ceb872d2 / tap9357c6a6-eb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.789 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.793 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 25918fc4-05ec-4a16-b77f-ca1d352a2763 / tap2064bfa7-12 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.793 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.797 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 297d72ef-6b79-45b3-813b-52b5144b522e / tap821e6243-8d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.798 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.802 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for bcdca3f9-3e24-4209-808c-8093b55e5c2d / tap88c7b630-e8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.802 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.806 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e9f9bf08-7688-4213-91ff-74f2271ec71d / tap48b30c48-78 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.806 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.813 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 24358eea-14fb-4863-a6c4-aadcdb495f54 / tap2e9efd6c-74 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.813 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.813 187212 DEBUG nova.privsep.utils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:06:37 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.814 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb.delta /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.817 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8888dd78-1c78-4065-8536-9a1096bdf57b / tapc5cb68aa-e5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.818 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c48d5a33-8e3b-4485-b17c-d4b9de0e41d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'd9e9415a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'e62dbf65ef2d06872dcfea3e9125814d7dc4b0161c8976b8ab08428d3bb41c42'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'd9ea8894-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': 'b29590fcfcbf4a8b2c8c6000c6397afdaca0e68ec518201e04a8a3f9674ccb52'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 
'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'd9eb4608-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': 'c2053d974d6485f604d6aa08fc4720aeaefbc2dd1902cc1f05759011dafa2cf8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 
1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'd9ebf0f8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '636e15e50014ebbf03b763aded8b952669ba6152a68d2ad723829b6571894c25'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'd9eca2fa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': 'edd71c0e5e7458a763903fa2dc7be7e661963774ad781ace042f4a8a4a3fa56e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'd9ed5038-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'c56d71772530d8474b93cbf25f3006ff8b270dc4590da2b4fae14bfb3031c4f4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'd9edeef8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '2b90f6ac461b51278d6ad7279a6922141a97ebb4b3a0b1de58cf98a2ea24ae96'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 
'd9eef10e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': '942e38d850feac17dee95db273004d1fcaf10dc0633b58a677e23d6297f730e8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'd9efaa0e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '23163a3b0e08bfff9d4f3062cbf3fba3dc603afcf80b031cd7d7e4587a1e6add'}]}, 'timestamp': '2025-12-05 12:06:37.818724', '_unique_id': '3b9e28d5f3fa449380ba59f944a17b02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.821 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.822 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.823 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.823 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.824 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.824 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.824 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.825 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.825 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.825 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdc3f1af-ce9a-4aa6-9c35-ea70b3f19ac5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'd9f031b8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '7e5beea0b3bd7e7bc42a3854e74b9472ff698a0d23b73cfb0e3e48e5f78a2482'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'd9f03fd2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '14563c3dc2c702fbcc20fe9ca5de3d4105bb760cf1a8bd8de1b899af1df88fee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'd9f07a92-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': 'a21a759062871747ae65544b60c1216e4a7ff1d1a9784cd453e9bb7a432734ac'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'd9f08b54-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': 'cd93397b619f9723cdd58c6c16aeae2e2ef1480dcbf524a2a2b2321aab30494e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'd9f09aea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': 'ca509850007a9355008d165dcbffd6dff2353fe84cf2c5266487550626214212'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-Att
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: achInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'd9f0a92c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'ba5d4f51d4bb7bce8f42ce64c780436d57acedf4127f636783cf43afcc3edade'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'd9f0b502-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'e59efda236c11c7febf6fb079e91c0cc047eaa6b5077333a242a5789a57bc707'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'd9f0c416-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': 
'e7ddf023b894eacbe3879484afabeb17e52ae402469ba8ab40005bbc5d70bbb2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'd9f0d2a8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'a4f23fd44ad5f0b6135e2a5efff0cb759f3c1b16d2d8609ef8cd4aedbe7fa80b'}]}, 'timestamp': '2025-12-05 12:06:37.826280', '_unique_id': '99842ab20c2546ce810a2309b61e4477'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.829 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.829 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.830 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.831 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.831 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.832 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.832 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.832 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.833 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.833 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.833 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f81d009b-26d8-4201-9eab-7faf8713bcb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'd9f1651a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '834d5d97067c22c33606ccadc307515b58e36aadcc2a1b55d11dc6aa0a346f5e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'd9f17604-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '8b73d82d3f31cea02cf632f5d1471aef7cd748433a8d06260c67e9c1cc0df3fd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'd9f1b402-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '9be6f0c67c24a56c64e0b7f9234ed8639b5c107e475dd21405e10b49e8316b0e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'd9f1c550-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '376e901cef0430b04385728518d642d76e46ade79307bf6a06976ba40942b4fd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'd9f1d0c2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '9b8356ef7fff6f92bc114212e110ac66c5052e718a91b7fd6c494e06aabdaf32'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'd9f1e08a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': '4b7a65c2d9589e393707231b9e4ae6e6b04c032382f424a6feae683db470e5b6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128,
'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'd9f1eeb8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '9fdd0c782c14a5246b1ed2d1e9396c6bf1d352432e17ca5579767b2a6c0b52cb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 
'd9f1f9d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': '4a0afffab8989a902de80b71bf23ff3fed62843d66864921c8569daec1ff28fc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'd9f209b6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '3ca7b82f625e62ecfbca5367494b426ff3be95ae3ac02892aa6f9ddd293ea127'}]}, 'timestamp': '2025-12-05 12:06:37.834240', '_unique_id': '572c8d5fcfcb40a0a46f7f8f4d5c84b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.837 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.849 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.850 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.863 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.863 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.865 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.878 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.879 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.892 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.893 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.905 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.905 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.918 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.allocation volume: 29237248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.918 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.931 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.931 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.943 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.944 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.959 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.959 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2efa5488-1e6c-4c2c-a349-a3205641f3f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9f4778c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': 'e2cbbd561a7a15deab89449919dffdc91a6ca5b41729325246f75858252264bb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 
'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9f4872c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': '63faed4943498561b63aaa4334f5fba59961f21d40f45d0617a0e32cb01c508e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9f68df6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': '78124a62242b797be892bee511d039eb0e6ec3ca8d5c70cc8cc04c321d8ff6cd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9f6a020-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': 'c333cf579407119fd160fa611775aa787594bfed8218aff62318baa460e4b005'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 
'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9f8eaec-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': '481fda08c6a5417983d80d3eb27811b55ce5f63a003cb2db39be4978174d079b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb':
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]:  1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9f8f6ea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': '7553d31d41a76d81f01aea0323cc698eea550a4d30aadab482ed3f9fa6353dc7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9fb0ade-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': '6b4567b728212a1c9d33752acbaad2a7988d83c8417c5d828127ca0bb07b8577'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': 
'2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9fb16dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': 'e0cde58f24eb6bb4c9d76442eb5e95bd3f7fd60d41a3d09a0775c79527b49006'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9fcfc5e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': '9d46b3c8b57eb1e51258ca2fc88f426ff3b7f14fb61a9a39632de1a4bca2f486'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9fd0744-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': 'e97ff2ae98a5e9ace515af9c02d9de3448ab1254350770c0d2756ab85cd72337'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29237248, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 
'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9fee96a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': '761ca7d16af80d82dcdb3338fb04617cf329f8a59cc04530405ac8c369517e60'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, '
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: message_id': 'd9fef3e2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': 'a423e91cf46cc30e9878c30fe76c1fe9860b8b3efba727807f8c6d781b7c08db'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da00e6c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': 'eec7066328e89ff3897294c2bbf391f557fa4a6ef735c8ed29d6e0c33381c250'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': 
{'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da00f192-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': '809ff3176ed24408bc7d75e73b9f187e0b94bac5fffac6f79f1b303475d815dc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da02d32c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': 'bc235102f6e298f75beca6371c7c538b42cec9f177a970438b7fa96cdd9ebeb2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da02e100-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': '655e7f840ace546bb2fb25c2ff2c78c6d0f836af73b4e91cec79e1f3b1e390d7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 
'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da053194-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.565594991, 'message_signature': '688f420812f6908041f8f666fdec7f0b16a3bf9466fb1b0f85017d160abb0263'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': 
'6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da053d1a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.565594
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 991, 'message_signature': '692b8d0ca9c10fb8532f298990bd433ed011ce7a79751839a757003c15e8fa42'}]}, 'timestamp': '2025-12-05 12:06:37.960045', '_unique_id': 'dfd8b1b37df3477d8a980dfe180dc0ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.962 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.packets volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.962 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.965 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.965 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.965 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.packets volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b3845cc-15a5-475a-b6af-b712e46793a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 20, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da05a9a8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'dd081b8beb7931cc913f649b1699bc82b86b9ce5babbba37f4c08ccdfcc0ee80'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 
'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da05b22c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '11ea683ae7af00f2ce46617f9da93877af76d89869588c2bc3cdffaa1680f2cf'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da05e7e2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '905837230251bd64434fa955172616a5650f4064549ff559152bab7fbd3b271d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da05f03e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '318c8d22631c02497d850ee0932a67f017fdc58facb23298980da12fb2f3cbe6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da05f822-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '51a2d9be47007469724f92a22140dd77caa2622cef623b4e06f5d0d2cd73020e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 1, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12-05T12:06:37.962521', 'resource
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: _metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da060290-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'fe1a14da30884607b00263c5c5767c11c631c27c7f425b461acacae767202569'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da060e16-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'ad534aa1c21e2a1dc61c963fab0729aef9214ad01b063d85fe03974ae719911d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da06176c-d1d2-11f0-8572-fa163e006c52', 
'monotonic_time': 3778.428325918, 'message_signature': '4c9c334d27913233966d18d2fd76873c3af59220ab11c082906e1219ea2157df'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 22, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da061fc8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'fb21937107a860f932e598d29f472494b89ed095b04cb286cc80b1a1e90c0b71'}]}, 'timestamp': '2025-12-05 12:06:37.965822', '_unique_id': 'd7148a875ffa460abb956b1ff0ea68e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.967 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 12:06:37 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.986 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/memory.usage volume: 44.96484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:37.999 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updating instance_info_cache with network_info: [{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.002 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/memory.usage volume: 40.46875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.003 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.018 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/memory.usage volume: 42.640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.020 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Releasing lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.021 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance network_info: |[{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.022 187212 DEBUG oslo_concurrency.lockutils [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.022 187212 DEBUG nova.network.neutron [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Refreshing network info cache for port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.025 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start _get_guest_xml network_info=[{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.043 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/memory.usage volume: 40.43359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.043 187212 WARNING nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.048 187212 DEBUG nova.virt.libvirt.host [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.049 187212 DEBUG nova.virt.libvirt.host [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is:  1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9f8f6e [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: message_id': 'd9fef3e2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.5272 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.054 187212 DEBUG nova.virt.libvirt.host [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.055 187212 DEBUG nova.virt.libvirt.host [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.056 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.056 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.057 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.057 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.057 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.057 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.058 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.058 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.058 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.058 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.059 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.059 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.063 187212 DEBUG nova.virt.libvirt.vif [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=64,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATHw79fzCFS1LAWHUiavQB3gUFaXpS81QU/Ce6wZ4HmvTj5LBGoan0DqDckMccItIq/MaTr8w95EnUae9L4Bz4KldjVTS0oi0uLUNfFAJiLjBukcvGPiZbx9R9d1EWHww==',key_name='tempest-keypair-599091465',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc1fd38e325f4a2caa75aeab79da75d3',ramdisk_id='',reservation_id='r-cei648o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-303309807',owner_user_name='tempest-ServersV294TestFqdnHostnames-303309807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='077bcce844cb42a197dcd6100549b7d3',uuid=ed00d159-9d70-481e-93be-ea180fea04ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.064 187212 DEBUG nova.network.os_vif_util [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converting VIF {"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.064 187212 DEBUG nova.network.os_vif_util [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.065 187212 DEBUG nova.objects.instance [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid ed00d159-9d70-481e-93be-ea180fea04ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.066 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/memory.usage volume: 40.4609375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.067 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.067 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.068 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.083 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.087 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <uuid>ed00d159-9d70-481e-93be-ea180fea04ba</uuid>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <name>instance-00000040</name>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <nova:name>guest-instance-1</nova:name>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:06:38</nova:creationTime>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:06:38 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:06:38 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:06:38 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:06:38 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:06:38 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:06:38 compute-0 nova_compute[187208]:         <nova:user uuid="077bcce844cb42a197dcd6100549b7d3">tempest-ServersV294TestFqdnHostnames-303309807-project-member</nova:user>
Dec 05 12:06:38 compute-0 nova_compute[187208]:         <nova:project uuid="dc1fd38e325f4a2caa75aeab79da75d3">tempest-ServersV294TestFqdnHostnames-303309807</nova:project>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:06:38 compute-0 nova_compute[187208]:         <nova:port uuid="d10caa85-dfcd-49ce-8ff7-2c2a68d1d731">
Dec 05 12:06:38 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <system>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <entry name="serial">ed00d159-9d70-481e-93be-ea180fea04ba</entry>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <entry name="uuid">ed00d159-9d70-481e-93be-ea180fea04ba</entry>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     </system>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <os>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   </os>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <features>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   </features>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.config"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:cc:8d:e9"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <target dev="tapd10caa85-df"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/console.log" append="off"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <video>
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     </video>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:06:38 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:06:38 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:06:38 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:06:38 compute-0 nova_compute[187208]: </domain>
Dec 05 12:06:38 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.088 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Preparing to wait for external event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.088 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.089 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.089 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.090 187212 DEBUG nova.virt.libvirt.vif [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=64,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATHw79fzCFS1LAWHUiavQB3gUFaXpS81QU/Ce6wZ4HmvTj5LBGoan0DqDckMccItIq/MaTr8w95EnUae9L4Bz4KldjVTS0oi0uLUNfFAJiLjBukcvGPiZbx9R9d1EWHww==',key_name='tempest-keypair-599091465',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc1fd38e325f4a2caa75aeab79da75d3',ramdisk_id='',reservation_id='r-cei648o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-303309807',owner_user_name='tempest-ServersV294TestFqdnHostnames-303309807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='077bcce844cb42a197dcd6100549b7d3',uuid=ed00d159-9d70-481e-93be-ea180fea04ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.090 187212 DEBUG nova.network.os_vif_util [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converting VIF {"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.091 187212 DEBUG nova.network.os_vif_util [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.091 187212 DEBUG os_vif [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.092 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.093 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.099 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.100 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.100 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.103 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd10caa85-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.103 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd10caa85-df, col_values=(('external_ids', {'iface-id': 'd10caa85-dfcd-49ce-8ff7-2c2a68d1d731', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:8d:e9', 'vm-uuid': 'ed00d159-9d70-481e-93be-ea180fea04ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.104 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.104 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance e9f9bf08-7688-4213-91ff-74f2271ec71d: ceilometer.compute.pollsters.NoVolumeException
Dec 05 12:06:38 compute-0 NetworkManager[55691]: <info>  [1764936398.1062] manager: (tapd10caa85-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.115 187212 INFO os_vif [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df')
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.125 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/memory.usage volume: 42.59765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.144 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/memory.usage volume: 42.6796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e3addb1-ba85-442e-aad8-0d820d082763', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.96484375, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da094b12-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.606937381, 'message_signature': '3118c861f074d266b0ea45aa5a2844d88abfb36a6dc7f1a18ec48a94afdae88d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.46875, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 
'5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da0bbb86-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.623031778, 'message_signature': 'b8b5d31784110a44e7e2b2c7ec65aba44a02fbf6f48143c94387529405957935'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.640625, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da0e3032-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.639152616, 'message_signature': 'b5fe262bfffb552279a9f7a25f610865aea361436ca0509ac5648e51d736e6e2'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.43359375, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da11fd7a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.664076019, 'message_signature': '3856babe968395abf9103e4f2a3d942215a4f2448ca3d28f4f48b4026f562852'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4609375, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 
'297d72ef-6b79-45b3-813b-52b5144b522e', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da158738-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.68719179, 'message_signature': 'b64435b6a1c12925640aa1e012c56cffd28a6b9e36d2d1e5ef7980a6908e0931'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da183e6a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.704908164, 'message_signature': 'd41e02a3df
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: b48ce7f8c02419f1ed91413f0e4a7718d4727fc45d6346c63758d7'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.59765625, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da1e85d6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.746155211, 'message_signature': 'd75b0be015532be3bc7e9223ca70db24040dff2aa06ee10f92026d563074ce17'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.6796875, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 
'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da216ca6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.765158702, 'message_signature': 'aa5799b0ecddebf571031d10e148e490283edb51b6cd37198d6981df749dd51b'}]}, 'timestamp': '2025-12-05 12:06:38.144850', '_unique_id': '43c3a5d04ec94f78b9acb19b93db0c42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.147 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.147 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.148 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.148 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.149 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.149 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.149 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.149 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.150 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.150 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1dce17ff-3de3-415a-87bb-ce5d594b5ab1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da21d88a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '7f7f9070f8561380076f98097de48e2693f60f3aefb732ad7163e3c71043f9ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da21e41a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': 'ae296f00c5187bd93672b20cebc865b7931d8ce6f9a3df4ddc55032b52d82aaf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da221408-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '87d7552d491b1d60d9d288f5cef36e2c10ee0d668b6d2db84c8b42808b1d1cf3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da221fa2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '1ed61acdf5938fcc85adac9e5bfe5c903d9a4570553eaef7ba78b78fd197b140'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da2229e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '08665768e3330293b2ae756fc8f5cc632080c1a42cfa90a150392b1a2312bea5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da223492-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'f4fcd3284db8c9f76892817f525e427f32c47c38b51a1d5e55b4cb667471ff75'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da223fdc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '4d161bb5adf601fc5783f99281b858c717a86b6643e8aa553028fa003fddf765'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da224a86-d1d2-11f0-8572-fa163e006c52', 
'monotonic_time': 3778.428325918, 'message_signature': 'c8e65dcc1b97dc7505ed596fcc78e4e238a63083c0ded76a332767fad8c473b0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da225486-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'f0cc58717e842520999e1152c76320e76fe5aede3e7be2972a29e58b546ef910'}]}, 'timestamp': '2025-12-05 12:06:38.150708', '_unique_id': 'b4a0d70801ec47948370026045fc335f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.152 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.153 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.153 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.153 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.154 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.154 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.155 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.155 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.155 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.155 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.156 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.156 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.156 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.156 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.157 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.157 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.157 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.157 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.158 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '174acca3-093b-4f08-94ba-95d3024c9d02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da22b52a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': 'b8fb840b2e64e728cadffc5c6ac04f18389a87fb71f1105d6f5711065137117f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 
'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da22bfde-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': '274decb854b14e3292e7383ebe4cf2547ae6d928b7bb38cab66a8977668730e8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da22c9ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': '914b799a2f3e146fc7285e70d7fc1e064bbb788c1160ac8936cd8f6591aafe55'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da22d3c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': '420178c144c3ea0c7d262c8511b3702abb2222275e30ecc6f98a92b72c768633'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': 
'85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da22fd82-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': 'f8077042dfe204873d580ca0cd08ee0496cec079b7dfddc3c3d3952ef6b192d7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_g
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: b': 1, 'disk_name': 'sda'}, 'message_id': 'da230a5c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': 'f51ac98e8f3b9ce3ff10b37915346645151b35ddfa4d620e32523ec21be654f2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da23140c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': 'fdb66cbd7f97afe0c8a712a21e8b60acb72b2d683b9abee802b7c961b4b7d83a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.152850', 
'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da231dc6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': '790246eae059a2ff2f2a84fb6b03b122f857a6c14fbc9792ae07f12b409724e9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da2327da-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': 'acc777fe84828fb24217621c4262946af76c190fd118a7956f061afbc138dd95'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da233284-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': '1503a68dc5e32234a681170c7c3cd4944b52e3b80a4970feefbc1abbaab2886f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 
'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da233ee6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': 'fcd0df19fa03bcfbca1f8754b8f945e00f8bd9cbb03e9e5f2fba9d2ff2f05707'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da234990-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948,
'message_signature': '26f146b58e1ca66d291fc811eea7d620660d2f543e1a40a15f3fd8e2cec246b3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da23550c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': '754ba1a53031bf04583c9ffc0f6de6090a8842513b085e333daee00c61365be5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 
'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da235ebc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': '5a8ee1d49dd87e490c3b1bfa34d01589ebe5080db8cfe224c9f1f607c3cba41f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 
'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da236830-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': '24316d3399557a241374a16a4482ca4db39efd632b817273a66178fd7173c1a4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da2371f4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': 'afb0f3aeac1cb070279f847b69648f0a04f3494814f6b2dda72c7fbdb9744e65'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 
'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da237c94-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.565594991, 'message_signature': 'c8eee073d4ce2a9e0504cec75ebe07bd7ff4cd3a7383aa3ce02484df4e321ed3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da23861c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.565594991, 'message_signature': '4b83860f2150213e905c19704a51aa447c0dae670f76a681c4bef5738da82156'}]}, 'timestamp': '2025-12-05 12:06:38.158523', '_unique_id': '8899e3de0ad44e8db24bcf2349e26826'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.161 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.161 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.bytes volume: 1194 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.162 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.162 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.162 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.bytes volume: 1396 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.163 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.bytes volume: 1368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.163 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.163 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.163 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.164 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af260438-7f2b-47a4-a6b0-e66f241dcf92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da23f4e4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'd944c4bea5b837de1081ae6fd862cb1ae23983d97e9fc5afaec1d920d895db52'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1194, 'user_id': 
'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da24004c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '560afd91ce48ef32f61c55b7a9d58c3b101143dcbd9b20e413d5761a6eda513e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da242f2c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '29437fa858fe79c8706f280121f235b859472d1370390e18e1ad767f9f755a52'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1396, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da243a26-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': 'dcca929856bdd6efb79a56b9912d3028087664066ae6411a1298a3619f13a97c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1368, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da244552-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '95345139cbbc19efbc2f9b5f6efdf111baad4c98821f018850fa867e237cb035'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name':
'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da244f8e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': '37fe8e46403dd3ce67983c09c1bedcc0e400b2e85cd732d22d1b50125852d27a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da2459e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'd31742fb53bd6465d382a233051d948d0091be57c1fb49e3d1dd53d8c7a727bd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da2464e2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 
'message_signature': 'c0a3091078fa12a92330f122638007c6c530414d0a391c2b746efe3077abd97f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da246f32-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '7ed2adfeb9a625cd710631f09e67db50d8530ea332dda717a575f394ccf2f011'}]}, 'timestamp': '2025-12-05 12:06:38.164518', '_unique_id': '0fd1aa92f2c54d0ebff12368200d8055'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.166 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.166 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.167 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.167 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.168 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.168 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.168 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.169 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.169 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.169 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.169 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.170 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.170 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.170 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.170 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.171 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.171 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.171 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.171 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c9d39c2-eeb3-4243-aa2c-3dc402783af5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da24cfb8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': '3a602cc9abc61d839620c59e858b3e0265b58bf3f71a70426367779f07469e2e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 
'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da24db02-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': '1912db51bfa659853833e234430d46ffef96c3dadb8e5b10527dd587f43ce78f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da24e4da-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': '2232da64dbf2a91d46dc4c9d8432d34c307458984741b887c92af83568da6c5d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da24ee62-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': 'f208bf2ea970b413154d163ca6ace4a86446a2e8929e99922f6a42ceffb82010'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 
'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da251e3c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': 'eda559a9bdbd1910b84ffec3f852d2877313e74da82af07948dce25ea741ad04'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'e
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: phemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da25290e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': '6d5bb07c4a6d4ebedf34b77465396d9dd09348a6d066e996cbb12b773947cd97'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da253278-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': '7e6383bdfd9d2d0bbd90572db4750c4e5b971df7ae6ab63b8b09b8cf6b45c6e9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': 
'2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da253b42-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': 'd86841d772bc3968ebe454949d94c19a027aaf74d76a55c7425530fb1bbb03c7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da254542-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': '3b9f297834832541c21b6e3a3d7fd89f707c7b07fd069ec34d658ffb74f1d251'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da254efc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': '0c4ba25a10ff1912eeebf87856e0919d73f5db046afff01e2518be8bec3f1648'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 
'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da2558ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': 'a4d4c3e7bc7874e5939fb5f111bfb3a64e60ec2dcad35f4f6f685f6be6c0e606'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id':
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]:  'da2562ac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': 'e9ca3ab3aa0bc8f5d2d40479d992f092a962bf4524271a1e23745ec3cc0c96f3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da256bf8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': '502ee0e147eef004d4516edc0143ae5d9e9de8339cef8a451811230c05277f95'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 
'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da2575f8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': 'fc851ce5bf0ae3f2307be5e631b1389426548b3d11e014f75b86caf32c329b01'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da257f58-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': '7a047fd1359346e17a2916280d602d8f7c839d8b4efeeaa6fa0f95467e0962eb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da2588a4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': 'cd2629b41495b66d5fd55eb0fe6045e40071dd03c50aff9a169cae0bd7d9c6d2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': 
'2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da2591dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.565594991, 'message_signature': 'a67a902f0f5eff7e1a4322a5096b470a3b5fa636601c90966de762289ccaa190'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': 
'6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da259bb4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.565594991, 'message_si
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: gnature': 'cbc906ebecd0a4fe330ac16e9af25c290a9ce9c1228d58ecc928cba34ce4d277'}]}, 'timestamp': '2025-12-05 12:06:38.172172', '_unique_id': '2638dd16b78c40909453df7198aa101e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.174 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.174 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.199 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.requests volume: 380 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.199 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.232 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.requests volume: 300 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.232 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.233 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.258 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.259 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.284 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.requests volume: 288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.285 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.299 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb.delta /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.305 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot extracted, beginning image upload
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.315 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.requests volume: 299 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.316 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: b': 1, 'disk_name': 'sda'}, 'message_id': 'da230a5c-d1d2-11f0-8572-fa163e006c52' [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: nic_time': 3778.527239948, 'message_signature': '26f146b58e1ca66d291fc811eea7d62 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: phemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da25290e-d1d2 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is:  'da2562ac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'mess [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.338 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.340 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.341 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] No VIF found with MAC fa:16:3e:cc:8d:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.342 187212 INFO nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Using config drive
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.361 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.requests volume: 232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.361 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.376 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'flavor' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.401 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.401 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.412 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.412 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquired lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.413 187212 DEBUG nova.network.neutron [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.413 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'info_cache' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.430 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.431 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.458 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.requests volume: 278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.458 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f074ed24-c051-437f-a600-eff335bd0f54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 380, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da29d224-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '881577464ff67a8faa98087569a1a1658c1442b3ee6e39e50709321109653f1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da29e138-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '3b94fd97245146b8ffb619e1206eb6f8dc617fdfdf9b82ed8cbc9009f1f2a2d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 300, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da2ed6ac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '5f8445008d26050b3fc574106526b78fcaa37d917e8838c4a720bf24e9fb2a81'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da2ee44e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'fd0a442051750bb47ec07f4d3ce183313e5ffc6184c80349e024ce6bbc421ecb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': 
'8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da32db9e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '680330195e0cd353b6cecc593c01c0b7bb23b48902ebd1ab685459388c839213'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: ', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da32e6ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '5fcdc5a5b45e143d80ace757d996fc9a4469f27aa5e783dd5f5f25d1878eab45'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 288, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da36e0ae-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': 'cb5f485c055f67b4cb4db370dc9c7bf89c51fdeb061279697a2d4c4aded79b7b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da36f404-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '6312fb80274a6a678033232189ac3d107d80dc21f4546c950508254594fcb49c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 299, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da3b9176-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '3a1348315d5f5f62f121465b2987a4666ca5e15559f9711f8647d5e0d759c88d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da3ba5c6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '40c43f6c4bf2a0daffe01191a251fd4d643ac9970268975293a516d91e12883e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 232, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da428e86-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'ebfc046ee61b3749753e389042b28d876df5ba48f1c667dc96e35f949f29a8c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: ', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da429d36-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'fdaa812690cd86bc97aecf29a9a07e00100064398768e4c1525f22124384564a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da48a4ce-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '24a2dd73dd89496afff1cd42c708aa188b669f69b3db9c12e17fe7c5f67a832b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 
'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da48afc8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '207d0084c3a97a15c63a6998003ccde7e8a9d3810d5e8ee0d98d7c85bfb2aae4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 317, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da4d1b80-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '352608a539b824669d798f2f199dfda027a3cf121106395d07f55621a943dbde'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da4d26ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': 'ae1bc3e1932fb4e999e896f9b4ca475ba019f3c59a6383b80c717a704d2c3350'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 278, 'user_id': '4f8149b8192e411a9131b103b25862b6', 
'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da515574-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '0d3fb12e3ae71cc9da53d079b01e692e05555beaa071da32649465a91970f51a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: pus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da516320-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': 'e58770076d80c30cbf90a879dc016fa096844ab2c9e7ee3bca1c0478a976ca04'}]}, 'timestamp': '2025-12-05 12:06:38.459253', '_unique_id': '18c9c4df3784464487cec060657b5665'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.462 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.462 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.462 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.464 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.464 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.464 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.465 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.465 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.465 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.466 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.466 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f307985-ac56-4936-a793-00fff57dcae3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da51f1c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '2f9b98dc945be1f3ef5e27028aed65c297ea684c28db63d281ace0b385fc3f07'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da51ff9c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '03945b2680c718a15ccb8744ada8cec228fd14e56295e1686de01e964c849a9d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da523dc2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': 'f6ff2f6c8206b2994322c32c0ca4d9c5dcf4a35aff044229a302dca45ae33eba'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da524a7e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': 'dad0a419e2109698c7f52089655a12995495d0936bb4458efe3eee06ff1462f3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da525df2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '13eeb742ce74eeee19f02d42738ab824c802c6abc65ea42085b9d67b2ef4df71'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12-05T12
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: :06:38.462401', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da526892-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': '5023f7889213e72254f839bca250742e1ed92a6a23511d89a98e7f1cbd640e73'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 
'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da5271b6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '6649680740cfcff854b7eda3a95f4f278ee5df12e09031ad9c52253e057a78da'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 
'da527d14-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': '1edca1b40fdb9f3c8d1b53c13410200e4297d360234641cef439c5ddaa95107f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da528f8e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '6a309efbfadae74bcb8b3f87559d1244dc87deaaba5f5a2336d55e036d37f7c8'}]}, 'timestamp': '2025-12-05 12:06:38.466788', '_unique_id': '0e1df6ec9e7d40ef975021892b58970c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: ', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: ', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.579 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.579 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.579 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.582 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.582 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.582 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e0ca005-bec2-44d4-96c0-9a577f633adc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da63d320-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '1934e9a37cbac1a80b55cc244ce31d3c0ef6e062378df87e5463d4d8c930e823'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da63e018-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '4699dd4f40e0f6705f1ea52b80006c1b8bd2f1db40df12dd9b8c4af108ab1a87'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da640f02-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '26cd36e48f490023a9542b0d4e0f01c7dfa8f2271fd47c8b56a7ef1e9c325e48'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da641768-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '70ccc423012816323018100a6242ca4b037908f88c084029d6a9b9df6684c828'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da641ff6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '2558964e1ac4818308f550898060774491df49b6f8ae363c1ce04a5ae65bef0c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-Att
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: achInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da642906-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'a1afd76e076d7988e0a54917245a44499a75c120ddc6a53e6622906ab9d2825b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da643248-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'aaa4cd224a2d12964829f6961ffb2b3a6c36a4dcf187ce333f7459edfa1afb9a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da643a18-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': 
'2553e717ce387c292d185337ea9588c95f211ebe9b366313c6decce1efd10ceb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da6441e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'a889164b624332d50dc3a6224f2ca84ff96ea44a1b4105566319750bf4cd4909'}]}, 'timestamp': '2025-12-05 12:06:38.582736', '_unique_id': '2cb39730261349bdb88dec18a4492b8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.584 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.latency volume: 3600348889 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.584 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.585 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.latency volume: 16389433442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.585 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.586 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.586 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.latency volume: 3570009297 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.586 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.586 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.latency volume: 3788709110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.latency volume: 4092721146 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.latency volume: 4788852769 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.latency volume: 5295983790 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.latency volume: 3506766134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.589 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72744bd6-19e0-4bec-b9d1-df6a3d6e0deb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3600348889, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64951c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '68313a01fff7dd2a9af7b867f0b4141726c6c2c1ce25c4a7eb689bf0a9ec9f52'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': 
None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da649daa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': 'ed1c80da3d74499e0a5580d36f50d8b0e954bb0199572d4bd523aaf99ae3b15b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16389433442, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64a85e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'a862b80a56d2cad757e64aeb95452a54ed3f00f03f97e685fa8764815eaddc49'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da64b010-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '9a82d068d8780ee520922a58d8ce6e9c826e84d1c2dd2e3bdfe97060e16dd6ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3570009297, 'user_id': 
'8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64d9aa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'bb832b50d5ef1d92471d31f761d26040fbc5d652f011a9a7277d17fd49d11290'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da64e1f2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '9db56b271052870e2b197d891e651b3b9476b6f5a272d1c32a126ce5e353ed46'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3788709110, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64e972-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '7500b63040199368fdba8cedc87da211e3c8b7908d6714bc7d2647e1d9851d8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da64f19c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': 'c499ce6ca27d0127fe3f3e79f2b2d6facf7b4b355973458c16757cf0fd58e197'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4092721146, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64fb38-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': 'e4043307c0244cefe56007aab657fe74d5a6da43d8ded0fd71595a6fb76960f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da650290-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '2eb2cd3ff7369b2c0bd931a49503cfda015686cf1e87a9e1060bc34f0bc9ba22'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4788852769, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6509c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '4e188b3ce4c0c60152331bcf421ea6443462d07ccd7a19b3a8c560ce7d833788'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 
1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus':
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]:  1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6510be-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'ea68c80cbd89f5470f692cea6e0ab335d4e1a54341b53bc30474b47b2d11e22a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da651898-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '3bd413ac6c83274db8a5d936557a4186c419080a0da3647c5d91cb8eec994a20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 
'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6521e4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '053d963794e8b3bd649bd96677f87ac94fff5f81b10e9bc61e1034b3d1724d1f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5295983790, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da652914-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '20227070c85a58b44be3b6ca9675f44e78222e58f58e5b3a263c895855344444'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da653012-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '1ed83f727b7863883c6b4037cbd2291157a169816ca253fbfcfe18aaf222dcbc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3506766134, 'user_id': '4f8149b8192e411a9131b103b25862b6', 
'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6538e6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '9d68deea3d63457108382ad432e698d886fc2d490a4e544c8bff308c2990ea0e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: _gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da654700-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '7768fae8c2e0387a06d0c71e34241413c4b78499875949c0d1b0ed8356034982'}]}, 'timestamp': '2025-12-05 12:06:38.589424', '_unique_id': 'c60cc5ab571343dc86af1c26b9dfe704'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.591 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.591 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.591 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.594 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.594 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.594 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a44540d6-b759-4ebb-b7f8-108f459439f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da65a93e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'c0c91ba1f3785f730fc830f6c82ac662e2a9cb78e9e8f4230bb024a79278f2c9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da65b56e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '1ab85df985ecdfff60878ce297fa8af6196e08b02ad2861cf076de0f3f905e15'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 
'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da65e2c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '341685fcd2befbd3b829a839b1b7bfa5d3cfb0039952467f3332dd3dfaca90cf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 
1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da65ee44-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '4a7a98485f0bc4c27a49840f81ec18fd70ce11174ceb072191e10111febfcff4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da65f826-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': 'e9125a0e05a426cb699b1f80d91e60f5a4f16d0e3b71a6724a1e3182602a97dc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: -05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da660334-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': '738a39a79339e00266887a783bd951347fe6309b6a61b3b47e93266a5e386084'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da660d52-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'a7274a75733fb48bedf6bc53ab418262b8c0f3b986bbc2d2bbaa6651eb451cd5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 
'da6615cc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': '756965e19c2a2b872eab2e7f955cfa398bdfd8b11b2c89feb170981f985f980b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da661e5a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'dcf2efa7d66146309ea58164c4d3f1f10d92397cdae1536664f70c35a9028e37'}]}, 'timestamp': '2025-12-05 12:06:38.594968', '_unique_id': '5bc27ab1ba7e43b8b6cf0f059624bce0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.596 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.596 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.597 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.597 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/cpu volume: 11310000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.597 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/cpu volume: 12410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.598 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.598 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/cpu volume: 11710000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.598 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/cpu volume: 11320000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/cpu volume: 11800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/cpu volume: 10760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/cpu volume: 3620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/cpu volume: 11860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/cpu volume: 12530000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is:  1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name' [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aaaba2f-7866-4cb5-8a2b-d06ba0867d17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11310000000, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66861a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.606937381, 'message_signature': '8810c1c9c4a2f1b1b1f42091e3841be5f1788fcfee56054a75b65f667fbcca49'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12410000000, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 
'5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66915a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.623031778, 'message_signature': '60f42b4baa717c61ba3a497dbef9b20ce652d5ad8957a535e51f2611def5260f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11710000000, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66b6ee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.639152616, 'message_signature': '83b19c943ed97697a4508ba1daf1996cd3c95c2dc525315e5d2db2baa6440266'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11320000000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66be96-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.664076019, 'message_signature': 'ebb89db0d6e1191a4cc2cbbb3ebd68edc1b1216e7d0319c19e3b7c83d034612c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11800000000, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': 
'85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66c79c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.68719179, 'message_signature': '9a73e04d7e29efe04dfcab5a76e6f9f0c70477d9004f0853eb144550d270fec6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10760000000, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66cecc-d1d2-11f0-85
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 72-fa163e006c52', 'monotonic_time': 3778.704908164, 'message_signature': '1a2e97726bdc926befb9641ea0c189a4f8adb5cb3890b478f29df9210d1387cf'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3620000000, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66d7dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.725271815, 'message_signature': 'a1a854ff4601e22515b4de9cfb3e8d8fc4677e1999092d229fad6e50fe4dd987'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11860000000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 
'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66df48-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.746155211, 'message_signature': 'df38ac90ad8fc555585b74ff5ea1a5a74c4cb758b30bd2c9e6e362cd56bb93d6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12530000000, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 
'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66e65a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.765158702, 'message_signature': '6fe9cbe6cf74862e27f044a5a4d7d401c66fa9026a9d29de5dd8924bec8c9a53'}]}, 'timestamp': '2025-12-05 12:06:38.600054', '_unique_id': 'e28780f9a1a2477a9e3b5cb493219c67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.601 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.bytes volume: 28776960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.602 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.602 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.bytes volume: 29788672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.602 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.603 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.603 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.bytes volume: 29989376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.604 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.604 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.bytes volume: 30439936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.604 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.605 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.bytes volume: 30042624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.605 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.605 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.bytes volume: 27073536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.605 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.bytes volume: 176446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.606 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.606 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.606 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.bytes volume: 30104064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.606 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.607 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.607 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa33a740-c9fa-40be-ace1-a10250f287f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28776960, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6734ac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': 'b56c22a32da4dd9a6b1b88daaeee1b2937c0223570285fbdff3868649cf50e17'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': 
None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da673e2a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': 'a57bd2ef92619c4c69b09335d17a721940af802b83d6b209a2d8b7421c30153f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29788672, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da674870-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'c43b943e7b15c13708bafbe3f79ea429d924f7702f2baf33d181e609b30ce2d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6754be-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '9efb780fa0fb7789dce553bedc7e4704ac292d58b36039e007fa5e39b5ee82af'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29989376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 
'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67872c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'b859a4359fe57d0c416cd170c81ddfea281651f9b506a24e87271f686c6db8ea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6791f4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '2c85c94c1cfdc865c57de546a2f0407e82379b320b5daaa3369020e328c47174'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30439936, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da679c12-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '44a007676c6265bb58b24a415cfc89c5b4dd1286aa6309edd431f4d9da93a318'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67a9f0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '33a0414c4e4df388dcba2fd4d1ca0f1ae2c56a091d78fc0b0ffa9ec0794bfc49'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30042624, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67b332-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '3782f23eec008dcd947f3b40086e61fcd4e7e1f423cfcb0d428fc075959890a8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67ba6c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '351d0ea75eb9030a85c7eca07fad5dae8cd71fef21d515e5a54c290210316d47'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27073536, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 
'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67c250-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '391f8e2c9b435c7ff3493a2334fa97119fd9a23acd6208d2078ed34d0c9ba332'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 176446, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb'
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: : 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67ce76-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'cf7a95b4fbdf839d7f7ce4d6cec7d3bfbfc43d2e7522c349a8036b9464ecc7c0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67d8d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '349b3fc5f8e69983d82584a3712c6e4e022de89e8521351ad3a0d6a28d4bb95c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 
'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67e348-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'db82e7118c4db83409af5fc27b24fb6c93809b9417d3d5b99ee756ccab452f22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30104064, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67eee2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': 'a336da76c0a3d4b5a105e7c2f11eafb2c1c58060070fb1cd2f3870d8b1ff248b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67f810-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '7e90345c9f1a737355570118c95615228207e97a9bcda1cf7dd731c401162fa7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31005184, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6802ba-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '12c3efb730d4ff5f34bbe2360aec2466571a761d5b603b17158ce939de49718b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da680aa8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '77c91146c8fbf306780a8a5fa4b4d842355ead79ee7a3a5649405b9d656a425b'}]}, 'timestamp': '2025-12-05 12:06:38.607524', '_unique_id': '3c104dc1254c4217a948a16be3cf7c66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.609 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.bytes volume: 72880128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.609 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.610 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.bytes volume: 72773632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.610 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.611 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.611 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.bytes volume: 72900608 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.611 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.611 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.bytes volume: 72888320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.614 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '134553d5-3cb5-4e85-850a-2bc9536f63e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72880128, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da686a20-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': 'a4f9c7bc215aac52f11f65e2a6ec6bb3e756cd12981e33def80024c5fcf26ed3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 
'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68752e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '8cc24c82cc75911dd42ca334793f48f0aeaf472c04cb400aa7e5bd1a8c252a4a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72773632, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da687cae-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '4495c114c2261232c6359a00965b737e6f4718274bc35b0bb3bc14e5d8247f41'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da688550-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'a5d8d94f81b7532d15537928715e4d06789686ae892620db78288bece6e81e1d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72900608, 'user_id': '8cf2534e7c394130b675e44ed567401b', 
'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68a954-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '9de072bd67bb561db2ac38659cfeb9964458464dd6308acb2818fbd89a5149bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68b2fa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'e75689838f2388f9ea0eec05fffb59644ba5d64970050894fdaffcc8017d07b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68bb24-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': 'cb850020b89fa379e8fb1226733ede4fd1b0970e4ca68e18faf2e9c8cbda87f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68c81c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '012fedfdc7a120ab3709be2d708434b9dd64a8cbf067552ffa28438df335256b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68cfc4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '62c1d5315630a2b0661a9d71a1e3d3dbfa2b444294f099522968002771f55932'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68d6cc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': 'a034913498d8cc79e87b56351f7cdfe3c5ecd5c0ee799a8cf8fcbbf1a3fca10b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 
'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68de88-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'a10098387bff5f5c9e8a178f6285915f79386869790ef9fd23335e1440f1447e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 
0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: ': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68eac2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'e63a44d2c73ea666e893be7ecd280bb917509840706a3824150017879b3b35f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68f21a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'd9682717ae0f4a5afc54c3e27c8852114db18fee0a22751ea5ddb95e3d1a2b20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': 
'2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68f922-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '146baac9c74544486cf1f5d63b0f9c05a5bf6220ab9468de3915e79970d31823'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72998912, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da690034-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': 'e4c96918746aa4390b7359b4e5f2dea74580ddb1c75c86908c0b48e118904615'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da69071e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '741d69d9baef648cd0f215b3950d61a8b60f70d17ba84cfd3a04b22db1309217'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72888320, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da69110a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '7b87464ce26a143ab928edc4318bf2db596cc792536a841139a40b8cd240dcce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da69192a-d1d
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '5554f81e09338e5ba011cf392dde6fdb1cc04770e8973ffcd02b3d58cc46bd4a'}]}, 'timestamp': '2025-12-05 12:06:38.614451', '_unique_id': 'fbd85d8d75554bb8972b1ab6c2563e32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.616 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.bytes volume: 1766 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.616 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.bytes volume: 1478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.618 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.618 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.618 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.bytes volume: 1850 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c69ea4d3-9302-4114-9a10-6f7acb36abb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1766, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da6961c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'dca3df8f7acf0528a755c2809133749089a3bb056a4a45f337eadcd55065531c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': 
'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da696a1a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '6b2e75fc8bd7bba2f5e38c646dfa751fb0fd7935acbd0b2609cd902d3e055701'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1478, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da698df6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': 'f4b3d87ed6964fb2fadbb4501c0de9aead8fb8d93f4a87478459015d27179e7c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da6995ee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': 'c040a305fb600038f12981f6d87b3cae12b01484e2c4fe1f4c7d131a27d8c969'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da699d82-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '1901ca4c0c610ba987c381afb4dd8e8740e3631570fb22e6c74b3f89368f2792'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 't
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: empest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da69a50c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'e15920ed7dd9a03fe0da8658b67f4e8606e93696e6b252da7cc36d2d0d43c084'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da69b024-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '9d443e7e4e0ffe061b1d90e84bdea6d3a4213405f09f78201649395ea45649b6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da69b812-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 
'message_signature': 'e266145f296d8f7345737189a6ed32f1c4b8c98d9560a7840b74d60405067109'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1850, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da69bf9c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '333baf47bab16109880f582d6670e5c9dbc037d4e3801a61d3e0ebaec7117b88'}]}, 'timestamp': '2025-12-05 12:06:38.618709', '_unique_id': 'a0fe6515efdb43da9723453aab0bbef0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: : 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67ce [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.620 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.latency volume: 296278000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.620 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.latency volume: 22596550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.620 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.latency volume: 268442554 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.621 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.latency volume: 39800296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.621 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.621 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.latency volume: 194632019 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.622 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.latency volume: 22825632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.622 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.latency volume: 252632619 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.622 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.latency volume: 43738699 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.622 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.latency volume: 305354112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.623 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.latency volume: 32449780 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.623 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.latency volume: 298073877 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.623 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.latency volume: 256339525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.624 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.latency volume: 366072663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.624 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.latency volume: 1761181 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.624 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.latency volume: 227447368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.624 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.latency volume: 33644734 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.625 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.latency volume: 375710481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.626 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.latency volume: 23531071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa21226f-43fe-4ce3-8925-7f73f4fc3c3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 296278000, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a03bc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '63130161c56e925cf4a53ea4a8332429d34ace337a062308502cb21d8ce1a131'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22596550, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a0e16-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '9ac32984f8f04d74480477022a455d85dd9e257453ebc2b9df8a44baeefe47c3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 268442554, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a1762-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'a48e55fb57506c0b77e4c5a6ca663103507b344e4d0e52b0a795816a1399cc3c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39800296, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a2432-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '4854277bbe74fd530657e045703102c375c1f566f30bc2f7008e058aa162ba25'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 194632019, 'user_id': 
'8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a4958-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '956034644a86cce7a8caa05c28403e57b644775f3177eef3651242555d680a23'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22825632, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a548e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'f1320dc0b4ac67f806acab01047aa4a8564b012eeb5efbf3b0c8466316689db0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 252632619, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a5bdc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '82b92c30fcaf49c3c49bb2ce5d72ef21c8af0725ae1196f2188be4ecbef5df32'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43738699, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a6302-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '4d567a632c27128b8ad785a889b3afddf7aa8f8e5841696073b6f4d26812d82f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305354112, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a6b22-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '92788ba187893435bb3e56b91f79b88c8630e1bca84b84317af0668bea756c1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 32449780, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a78d8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '34c5a542c06ac49c60e74b89e03836dc7f8d50bc1ad96d087b380e4e787af5c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 298073877, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a835a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '5ad5c80388fe8967d4b618b24c0c0cd1ca5b95d6fae31d3a46b93ebb8fe8f031'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 256339525, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'o
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: s_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a8d32-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '91f5ee97039720bfb511409b99afdbf8c83b4b45222299099b9ed5459644bb0c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 366072663, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a97fa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'd35bef26877e8adec6e4b20477ff01bc5a4c6fdc38672ad541cef06e6dc938f6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1761181, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 
'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6aa3d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'ba3c7e2cbc1b0d9c87a9dfd962e2b6761de24a6a9a853a1e074b92b34f1219a0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 227447368, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6aadda-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '83368324e0fa15e7b08fcef9f734da3e1de48e45e3d87bc6327ac760921a8719'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33644734, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6ad6d4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '0427784b4b03afec5690ce0965daf9d807c28d78d7c735c21f9a120f006ceb5c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 375710481, 'user_id': '4f8149b8192e411a9131b103b25862b6', 
'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6ae534-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '8beff11ef08d59174b26b6eccb11ef354dbc714f87f1f1dd3a323de539179c91'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23531071, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: , 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6af10a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': 'bbbdbf079256f79790c8da023741086e43f4329741de4e4c187c870fd65c1223'}]}, 'timestamp': '2025-12-05 12:06:38.626580', '_unique_id': '6caec8cac496469f96ab22828d69d5a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: y_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'},  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: ': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68eac2-d1d2-11f0-8572- [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.requests volume: 1035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.630 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.630 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.requests volume: 1072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.630 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.631 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.631 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.requests volume: 1073 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.632 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.632 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.632 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.633 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.requests volume: 1076 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.633 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.633 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.requests volume: 926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.634 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.requests volume: 73 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.634 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.634 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.635 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.requests volume: 1087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.635 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.637 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.637 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90c8ea52-c9c3-4021-91ff-bd09aa563bd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1035, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6b7a08-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '74de7eb892269e4f278bdc76e100314c6f729b709cb222c9fb48c969c17f603d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6b876e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '47fcf20b081ec0c32c8088b00037d6f8f6af673bd90593a62deb9ad55b9e73e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1072, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6b9268-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'e7200723662aecf7a74c308c18e7de4a106c5581db9fb4388679eb3eabf0eaf5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6b9f92-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '50e2c98cde323cab826c3076e4e661609d9b7df94ea14fbe0819dfa842e555d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1073, 'user_id': 
'8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6bcd5a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '01763b5d7deda5aa14d6321eca8ea20a4208429b7f2a8e86606b943437a809b6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: _64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6bd944-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'd8db34c1e2df5ab7b0cce71f20cc968aaa804fc409321426956acea0681c65d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6be420-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '52f026c35ab82136e3b6dfd875bc2c1f926c53efe7d4e2b6062b4f37c026cee9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 
'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6bf1e0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '217519ac7265777e0c0f343fd803ba73872b611942071bc26388e566deded3d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1076, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6bfdd4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': 'c758ae2b7f7bee709c9fc5401f1c5e6df8da4fbda3f09f460738620cb6a66b29'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6c0888-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': 'eb2a3e6cb02c39ad34092b6fd28dd87debd18c24ead8a80717cfa7ed82510a57'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 926, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6c135a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '1ad4de852ae5273c4824edbe4daa346f1169118dd9cda6b19a048ee3b46133e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 73, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x8
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 6_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6c22c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '232c0e890fe5f59d14e8035ef2553674db4d3f4cf34404042b7250f9d37274fb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6c2e3a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '6515a70f62151837a4abff79a24d6de49c246e5bc8ceff0769601fa563f541b8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': 
None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6c3934-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'ebb2aab4622e44c73a04ce3317f82f3813f006331204a2fec5bfd30f63b235ed'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1087, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6c48fc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '00ed4b679438c767cf162c86ac682104adb863ca9a705f1f5535e2acc635aab5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6c80ec-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': 'b36d1244334471aec5bec8f0bad946dd67d9e5e994b56c7c980ca9ae0b495a93'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': 
'4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6ca3ec-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '66c8171b811437072a356eb39f71664c45a91d8b277c5efddbaa916547acc973'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: ', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6cad24-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': 'eb2bf9c380188a0039e26faa4320698bf8d1fb2ca669f5230e7ff7ae749781eb'}]}, 'timestamp': '2025-12-05 12:06:38.637937', '_unique_id': '8e278e94251542f89677423aa1959fa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.641 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:06:38 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.641 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: s_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, ' [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: _64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_g [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 6_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_ [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.867 187212 INFO nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Creating config drive at /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.config
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.875 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ddna3ky execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.966 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.967 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.988 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.989 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.989 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing instance network info cache due to event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.990 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.990 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:38 compute-0 nova_compute[187208]: 2025-12-05 12:06:38.990 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.010 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ddna3ky" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:39 compute-0 kernel: tapd10caa85-df: entered promiscuous mode
Dec 05 12:06:39 compute-0 NetworkManager[55691]: <info>  [1764936399.0714] manager: (tapd10caa85-df): new Tun device (/org/freedesktop/NetworkManager/Devices/217)
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:39 compute-0 ovn_controller[95610]: 2025-12-05T12:06:39Z|00502|binding|INFO|Claiming lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 for this chassis.
Dec 05 12:06:39 compute-0 ovn_controller[95610]: 2025-12-05T12:06:39Z|00503|binding|INFO|d10caa85-dfcd-49ce-8ff7-2c2a68d1d731: Claiming fa:16:3e:cc:8d:e9 10.100.0.10
Dec 05 12:06:39 compute-0 ovn_controller[95610]: 2025-12-05T12:06:39Z|00504|binding|INFO|Setting lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 ovn-installed in OVS
Dec 05 12:06:39 compute-0 ovn_controller[95610]: 2025-12-05T12:06:39Z|00505|binding|INFO|Setting lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 up in Southbound
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.145 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:8d:e9 10.100.0.10'], port_security=['fa:16:3e:cc:8d:e9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ed00d159-9d70-481e-93be-ea180fea04ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59233d66-44e6-47b3-b612-4f7d677af03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc1fd38e325f4a2caa75aeab79da75d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cb353a76-4787-4857-933e-e95743324e9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f37497c0-7b03-4b0b-94d8-7ed5a2c705cb, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.147 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 in datapath 59233d66-44e6-47b3-b612-4f7d677af03d bound to our chassis
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.152 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 59233d66-44e6-47b3-b612-4f7d677af03d
Dec 05 12:06:39 compute-0 ovn_controller[95610]: 2025-12-05T12:06:39Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bb:19:b7 10.100.0.7
Dec 05 12:06:39 compute-0 ovn_controller[95610]: 2025-12-05T12:06:39Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bb:19:b7 10.100.0.7
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.166 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb9f4c3-ce3f-4ef5-8929-02fd699ed5e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.167 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap59233d66-41 in ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:06:39 compute-0 systemd-machined[153543]: New machine qemu-68-instance-00000040.
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.170 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap59233d66-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6695cd-bcd7-4188-936d-e1c062efb5c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.173 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dca72c4a-15b2-45d0-97da-b10f05f24ec1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 systemd[1]: Started Virtual Machine qemu-68-instance-00000040.
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.186 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[115314f5-617d-4695-92b7-b735289b384d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 systemd-udevd[227041]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:39 compute-0 NetworkManager[55691]: <info>  [1764936399.2027] device (tapd10caa85-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:06:39 compute-0 NetworkManager[55691]: <info>  [1764936399.2039] device (tapd10caa85-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.205 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[790cc2fd-28ed-46f9-956a-9e8d36b6cbb3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.241 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[23ddb75d-881a-4ccb-b995-e17c55b1c9cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 systemd-udevd[227044]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:39 compute-0 NetworkManager[55691]: <info>  [1764936399.2486] manager: (tap59233d66-40): new Veth device (/org/freedesktop/NetworkManager/Devices/218)
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.247 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d9dfdb30-515b-489b-98b0-218c77c8acc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.286 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3043ad1f-a879-4a42-b5f5-54f1bf4ed3e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.290 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4020b9f1-e42c-498e-b7f0-f6d566030ce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 NetworkManager[55691]: <info>  [1764936399.3125] device (tap59233d66-40): carrier: link connected
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.319 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff067d2-a0af-42e1-b4d4-176e3e535734]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.339 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8325463-7746-4690-b125-70012715f990]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59233d66-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:70:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377987, 'reachable_time': 23411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227075, 'error': None, 'target': 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.356 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a5f1289b-563f-4c06-8c85-ec4437542bdc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:7074'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377987, 'tstamp': 377987}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227076, 'error': None, 'target': 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.378 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[931b7a72-94eb-4e6f-9e0f-55e621da2341]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59233d66-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:70:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377987, 'reachable_time': 23411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227077, 'error': None, 'target': 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.394 187212 DEBUG nova.network.neutron [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updated VIF entry in instance network info cache for port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.395 187212 DEBUG nova.network.neutron [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updating instance_info_cache with network_info: [{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.408 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54c5eb9b-a04c-4938-a139-fcb908714311]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.424 187212 DEBUG oslo_concurrency.lockutils [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.470 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[47968083-6110-4d0b-b040-3f46a55bce53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.471 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59233d66-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.471 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.472 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap59233d66-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:39 compute-0 NetworkManager[55691]: <info>  [1764936399.4748] manager: (tap59233d66-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Dec 05 12:06:39 compute-0 kernel: tap59233d66-40: entered promiscuous mode
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.479 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.480 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap59233d66-40, col_values=(('external_ids', {'iface-id': '229f26d0-355d-483b-86df-f1f319e2601e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.482 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:39 compute-0 ovn_controller[95610]: 2025-12-05T12:06:39Z|00506|binding|INFO|Releasing lport 229f26d0-355d-483b-86df-f1f319e2601e from this chassis (sb_readonly=0)
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.497 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.498 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/59233d66-44e6-47b3-b612-4f7d677af03d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/59233d66-44e6-47b3-b612-4f7d677af03d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.500 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[24c22ac3-7515-44cc-a902-5c617503f4a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.501 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-59233d66-44e6-47b3-b612-4f7d677af03d
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/59233d66-44e6-47b3-b612-4f7d677af03d.pid.haproxy
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 59233d66-44e6-47b3-b612-4f7d677af03d
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:06:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.504 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'env', 'PROCESS_TAG=haproxy-59233d66-44e6-47b3-b612-4f7d677af03d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/59233d66-44e6-47b3-b612-4f7d677af03d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.595 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.698 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.699 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.699 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.699 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.706 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936399.7061496, ed00d159-9d70-481e-93be-ea180fea04ba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.707 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] VM Started (Lifecycle Event)
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.766 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.772 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936399.7107768, ed00d159-9d70-481e-93be-ea180fea04ba => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.773 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] VM Paused (Lifecycle Event)
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.816 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.819 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:39 compute-0 nova_compute[187208]: 2025-12-05 12:06:39.861 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:39 compute-0 podman[227121]: 2025-12-05 12:06:39.934534457 +0000 UTC m=+0.060385203 container create 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:06:39 compute-0 systemd[1]: Started libpod-conmon-44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b.scope.
Dec 05 12:06:39 compute-0 podman[227121]: 2025-12-05 12:06:39.905414592 +0000 UTC m=+0.031265368 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:06:40 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:06:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0d77ec8b1cc024940e91355c3fecada2e5d7bf69ad2fc36a67c677303a54e3e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:06:40 compute-0 podman[227121]: 2025-12-05 12:06:40.02733033 +0000 UTC m=+0.153181106 container init 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 12:06:40 compute-0 podman[227121]: 2025-12-05 12:06:40.036460565 +0000 UTC m=+0.162311311 container start 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 12:06:40 compute-0 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [NOTICE]   (227141) : New worker (227143) forked
Dec 05 12:06:40 compute-0 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [NOTICE]   (227141) : Loading success.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.264 187212 DEBUG nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.266 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.267 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.267 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.267 187212 DEBUG nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.268 187212 WARNING nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state powering-on.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.268 187212 DEBUG nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.269 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.269 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.269 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.270 187212 DEBUG nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Processing event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.271 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.280 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.282 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936400.2779553, ed00d159-9d70-481e-93be-ea180fea04ba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.282 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] VM Resumed (Lifecycle Event)
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.289 187212 INFO nova.virt.libvirt.driver [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance spawned successfully.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.290 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.320 187212 DEBUG nova.network.neutron [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updating instance_info_cache with network_info: [{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.358 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.359 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Releasing lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.370 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.371 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.372 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.372 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.373 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.374 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.378 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.393 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance destroyed successfully.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.395 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'numa_topology' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.437 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.439 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'resources' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.467 187212 DEBUG nova.virt.libvirt.vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:34Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.468 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.469 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.469 187212 DEBUG os_vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.473 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.473 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap549318e9-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.475 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.477 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.478 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.480 187212 INFO os_vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6')
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.490 187212 DEBUG nova.virt.libvirt.driver [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start _get_guest_xml network_info=[{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.496 187212 WARNING nova.virt.libvirt.driver [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.502 187212 DEBUG nova.virt.libvirt.host [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.504 187212 DEBUG nova.virt.libvirt.host [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.511 187212 DEBUG nova.virt.libvirt.host [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.512 187212 DEBUG nova.virt.libvirt.host [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.513 187212 DEBUG nova.virt.libvirt.driver [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.513 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.514 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.515 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.515 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.516 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.516 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.517 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.518 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.518 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.519 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.520 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.520 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'vcpu_model' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.525 187212 INFO nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Took 6.09 seconds to spawn the instance on the hypervisor.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.526 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.543 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.607 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.608 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.609 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.610 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.611 187212 DEBUG nova.virt.libvirt.vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:34Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.611 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.613 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.614 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'pci_devices' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.683 187212 INFO nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Took 6.94 seconds to build instance.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.690 187212 DEBUG nova.virt.libvirt.driver [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <uuid>cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</uuid>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <name>instance-00000038</name>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-1365452817</nova:name>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:06:40</nova:creationTime>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:06:40 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:06:40 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:06:40 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:06:40 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:06:40 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:06:40 compute-0 nova_compute[187208]:         <nova:user uuid="4f8149b8192e411a9131b103b25862b6">tempest-ListServerFiltersTestJSON-711798252-project-member</nova:user>
Dec 05 12:06:40 compute-0 nova_compute[187208]:         <nova:project uuid="e8f613c8797e432d96e43223fb7c476d">tempest-ListServerFiltersTestJSON-711798252</nova:project>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:06:40 compute-0 nova_compute[187208]:         <nova:port uuid="549318e9-e629-4e2c-8cbb-3cd263c2bc34">
Dec 05 12:06:40 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <system>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <entry name="serial">cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</entry>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <entry name="uuid">cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</entry>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     </system>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <os>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   </os>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <features>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   </features>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:9b:d7:ed"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <target dev="tap549318e9-e6"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/console.log" append="off"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <video>
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     </video>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:06:40 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:06:40 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:06:40 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:06:40 compute-0 nova_compute[187208]: </domain>
Dec 05 12:06:40 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.696 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.725 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.768 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.769 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.834 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.837 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'trusted_certs' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.853 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.913 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updated VIF entry in instance network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.914 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.924 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.925 187212 DEBUG nova.virt.disk.api [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Checking if we can resize image /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.925 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.950 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.951 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.952 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.952 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.953 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.953 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.953 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.954 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.954 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.954 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.954 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.955 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.955 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.955 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.956 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.956 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.956 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.956 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.957 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.957 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.957 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.958 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.958 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.958 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.958 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.959 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.959 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.959 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.960 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.960 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.960 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.997 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.998 187212 DEBUG nova.virt.disk.api [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Cannot resize image /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:06:40 compute-0 nova_compute[187208]: 2025-12-05 12:06:40.999 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'migration_context' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.012 187212 DEBUG nova.virt.libvirt.vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:34Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.012 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.014 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.014 187212 DEBUG os_vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.016 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.016 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.020 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap549318e9-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.020 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap549318e9-e6, col_values=(('external_ids', {'iface-id': '549318e9-e629-4e2c-8cbb-3cd263c2bc34', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:d7:ed', 'vm-uuid': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:41 compute-0 NetworkManager[55691]: <info>  [1764936401.0350] manager: (tap549318e9-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.036 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.044 187212 DEBUG nova.compute.manager [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-changed-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.044 187212 DEBUG nova.compute.manager [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing instance network info cache due to event network-changed-48b30c48-7858-408b-aeab-df46f6277546. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.045 187212 DEBUG oslo_concurrency.lockutils [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.045 187212 DEBUG oslo_concurrency.lockutils [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.045 187212 DEBUG nova.network.neutron [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing network info cache for port 48b30c48-7858-408b-aeab-df46f6277546 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.046 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.048 187212 INFO os_vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6')
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:06:41 compute-0 NetworkManager[55691]: <info>  [1764936401.1301] manager: (tap549318e9-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Dec 05 12:06:41 compute-0 kernel: tap549318e9-e6: entered promiscuous mode
Dec 05 12:06:41 compute-0 ovn_controller[95610]: 2025-12-05T12:06:41Z|00507|binding|INFO|Claiming lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 for this chassis.
Dec 05 12:06:41 compute-0 ovn_controller[95610]: 2025-12-05T12:06:41Z|00508|binding|INFO|549318e9-e629-4e2c-8cbb-3cd263c2bc34: Claiming fa:16:3e:9b:d7:ed 10.100.0.9
Dec 05 12:06:41 compute-0 systemd-udevd[227067]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.135 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.145 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '7', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.146 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.149 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:06:41 compute-0 NetworkManager[55691]: <info>  [1764936401.1518] device (tap549318e9-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:06:41 compute-0 NetworkManager[55691]: <info>  [1764936401.1529] device (tap549318e9-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:06:41 compute-0 ovn_controller[95610]: 2025-12-05T12:06:41Z|00509|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 ovn-installed in OVS
Dec 05 12:06:41 compute-0 ovn_controller[95610]: 2025-12-05T12:06:41Z|00510|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 up in Southbound
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.177 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af2d3eb1-9c85-4e8a-918f-0c13fb0ce2fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:41 compute-0 systemd-machined[153543]: New machine qemu-69-instance-00000038.
Dec 05 12:06:41 compute-0 systemd[1]: Started Virtual Machine qemu-69-instance-00000038.
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.210 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ab650c-c533-400f-b01b-21a7c59ff973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.217 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[54b0af44-f8b3-4e3d-82a9-77a83b506309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.256 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[798bd06c-103e-4840-8ea2-9e7d307faa17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.309 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed5a05ab-b243-4c8d-b91b-d52f00f3e9c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227195, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.328 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[03ea06a1-8006-4cf0-915f-144e4fa00539]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227197, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227197, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.330 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.332 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.335 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.335 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.336 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.336 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.643 187212 DEBUG nova.compute.manager [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.644 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for cbcd4733-8c53-4696-9bc0-6e5c516c9dcf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.645 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936401.642585, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.646 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Resumed (Lifecycle Event)
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.650 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance rebooted successfully.
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.651 187212 DEBUG nova.compute.manager [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.712 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.718 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.756 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936401.6429155, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.757 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Started (Lifecycle Event)
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.816 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:41 compute-0 nova_compute[187208]: 2025-12-05 12:06:41.822 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.068 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.237 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot image upload complete
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.238 187212 INFO nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 6.10 seconds to snapshot the instance on the hypervisor.
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.502 187212 DEBUG nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found 3 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.503 187212 DEBUG nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Rotating out 1 backups _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4458
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.504 187212 DEBUG nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deleting image aa21033c-b586-4741-8de3-906338ad12ee _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4463
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.625 187212 DEBUG nova.network.neutron [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updated VIF entry in instance network info cache for port 48b30c48-7858-408b-aeab-df46f6277546. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.626 187212 DEBUG nova.network.neutron [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.652 187212 DEBUG oslo_concurrency.lockutils [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:42 compute-0 nova_compute[187208]: 2025-12-05 12:06:42.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.082 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.083 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.084 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.084 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.164 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.166 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.166 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.167 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.167 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] No waiting events found dispatching network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.168 187212 WARNING nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received unexpected event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 for instance with vm_state active and task_state None.
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.168 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-changed-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.168 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing instance network info cache due to event network-changed-48b30c48-7858-408b-aeab-df46f6277546. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.169 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.169 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.170 187212 DEBUG nova.network.neutron [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing network info cache for port 48b30c48-7858-408b-aeab-df46f6277546 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:43 compute-0 podman[227206]: 2025-12-05 12:06:43.24224075 +0000 UTC m=+0.093289918 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.286 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.358 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.359 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.423 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.431 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.516 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.518 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.597 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.605 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.682 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.683 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.708 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.709 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.710 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.710 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.710 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.712 187212 INFO nova.compute.manager [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Terminating instance
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.713 187212 DEBUG nova.compute.manager [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:06:43 compute-0 kernel: tap821e6243-8d (unregistering): left promiscuous mode
Dec 05 12:06:43 compute-0 NetworkManager[55691]: <info>  [1764936403.7407] device (tap821e6243-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:06:43 compute-0 ovn_controller[95610]: 2025-12-05T12:06:43Z|00511|binding|INFO|Releasing lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed from this chassis (sb_readonly=0)
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.755 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:43 compute-0 ovn_controller[95610]: 2025-12-05T12:06:43Z|00512|binding|INFO|Setting lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed down in Southbound
Dec 05 12:06:43 compute-0 ovn_controller[95610]: 2025-12-05T12:06:43Z|00513|binding|INFO|Removing iface tap821e6243-8d ovn-installed in OVS
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.763 187212 DEBUG nova.compute.manager [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.764 187212 DEBUG nova.compute.manager [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing instance network info cache due to event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.764 187212 DEBUG oslo_concurrency.lockutils [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.764 187212 DEBUG oslo_concurrency.lockutils [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.765 187212 DEBUG nova.network.neutron [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.782 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.786 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.791 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:43 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Dec 05 12:06:43 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000003b.scope: Consumed 13.696s CPU time.
Dec 05 12:06:43 compute-0 systemd-machined[153543]: Machine qemu-63-instance-0000003b terminated.
Dec 05 12:06:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.886 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:47:26 10.100.0.9'], port_security=['fa:16:3e:a6:47:26 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f4c4888-4b32-4259-8441-31af091e0c7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85037de7275442698e604ee3f6283cbc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed3fff5f-a24a-492e-ba85-8f010d446cfc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac2e7e6b-9342-46f8-a910-5de5a261f0a9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=821e6243-8d28-4c8c-874c-f1e69c7d3bed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.887 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 821e6243-8d28-4c8c-874c-f1e69c7d3bed in datapath 0f4c4888-4b32-4259-8441-31af091e0c7d unbound from our chassis
Dec 05 12:06:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.890 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f4c4888-4b32-4259-8441-31af091e0c7d
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.892 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.897 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.905 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[abcaf5c0-e078-43e1-9d40-177753fdb042]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.951 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e56e4c08-8222-4e88-8128-3264533fcd19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.956 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b087bb03-78d8-41d8-806d-86acaaf9e04c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.984 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:43 compute-0 nova_compute[187208]: 2025-12-05 12:06:43.994 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.997 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3a3ed0ee-22fd-4552-b756-cc3e1f956995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.018 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c80c2de7-dbfa-4db9-8181-4a7c6a3a75f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f4c4888-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:45:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372337, 'reachable_time': 25070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227283, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.019 187212 INFO nova.virt.libvirt.driver [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance destroyed successfully.
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.020 187212 DEBUG nova.objects.instance [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'resources' on Instance uuid 297d72ef-6b79-45b3-813b-52b5144b522e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.036 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7495b3-c7aa-4de8-a742-2e91dd005233]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0f4c4888-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372348, 'tstamp': 372348}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227285, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0f4c4888-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372350, 'tstamp': 372350}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227285, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.037 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f4c4888-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.038 187212 DEBUG nova.virt.libvirt.vif [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-2111676304',display_name='tempest-FloatingIPsAssociationTestJSON-server-2111676304',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-2111676304',id=59,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-3sf4jdpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_user_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:20Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=297d72ef-6b79-45b3-813b-52b5144b522e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.040 187212 DEBUG nova.network.os_vif_util [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.041 187212 DEBUG nova.network.os_vif_util [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.042 187212 DEBUG os_vif [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.044 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.044 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f4c4888-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.045 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.045 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f4c4888-40, col_values=(('external_ids', {'iface-id': 'b2e28c8a-557d-459b-807e-dd1f5be0a608'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.045 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap821e6243-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.045 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.049 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.051 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.054 187212 INFO os_vif [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d')
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.054 187212 INFO nova.virt.libvirt.driver [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Deleting instance files /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e_del
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.055 187212 INFO nova.virt.libvirt.driver [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Deletion of /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e_del complete
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.065 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.066 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.128 187212 INFO nova.compute.manager [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Took 0.41 seconds to destroy the instance on the hypervisor.
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.129 187212 DEBUG oslo.service.loopingcall [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.130 187212 DEBUG nova.compute.manager [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.130 187212 DEBUG nova.network.neutron [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.134 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.137 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Error from libvirt while getting description of instance-0000003b: [Error Code 42] Domain not found: no domain with matching uuid '297d72ef-6b79-45b3-813b-52b5144b522e' (instance-0000003b): libvirt.libvirtError: Domain not found: no domain with matching uuid '297d72ef-6b79-45b3-813b-52b5144b522e' (instance-0000003b)
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.142 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.169 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.170 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.217 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.218 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.289 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.298 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.369 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.370 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.450 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.458 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.534 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.537 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.600 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.609 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.672 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.674 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.737 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.744 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.805 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.808 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:06:44 compute-0 nova_compute[187208]: 2025-12-05 12:06:44.871 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.172 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.174 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=3978MB free_disk=72.98876953125GB free_vcpus=-3 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.174 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.175 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.272 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.273 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 472c7e2c-bdad-4230-904b-6937ceb872d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.273 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.274 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 8888dd78-1c78-4065-8536-9a1096bdf57b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.274 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b81bb939-d14f-4a72-b7fe-95fc5d8810a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.274 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 297d72ef-6b79-45b3-813b-52b5144b522e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.274 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.275 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance bcdca3f9-3e24-4209-808c-8093b55e5c2d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.275 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 5d70ac2d-111f-4e1b-ac26-3e02849b0458 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.275 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance e9f9bf08-7688-4213-91ff-74f2271ec71d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.276 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance ed00d159-9d70-481e-93be-ea180fea04ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.278 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 11 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.278 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1984MB phys_disk=79GB used_disk=11GB total_vcpus=8 used_vcpus=11 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.388 187212 DEBUG nova.network.neutron [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.417 187212 INFO nova.compute.manager [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Took 1.29 seconds to deallocate network for instance.
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.469 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.490 187212 DEBUG nova.network.neutron [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updated VIF entry in instance network info cache for port 48b30c48-7858-408b-aeab-df46f6277546. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.491 187212 DEBUG nova.network.neutron [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.510 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.511 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.511 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.512 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.512 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.513 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.513 187212 WARNING nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state active and task_state None.
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.513 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.514 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.514 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.514 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.515 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.515 187212 WARNING nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state active and task_state None.
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.582 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.991 187212 DEBUG nova.network.neutron [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updated VIF entry in instance network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:45 compute-0 nova_compute[187208]: 2025-12-05 12:06:45.992 187212 DEBUG nova.network.neutron [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.075 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.081 187212 DEBUG oslo_concurrency.lockutils [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.104 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.105 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.106 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:46.173 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.294 187212 DEBUG nova.compute.provider_tree [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.381 187212 DEBUG nova.scheduler.client.report [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.411 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.521 187212 INFO nova.scheduler.client.report [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Deleted allocations for instance 297d72ef-6b79-45b3-813b-52b5144b522e
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.614 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.691 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.692 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:46 compute-0 nova_compute[187208]: 2025-12-05 12:06:46.692 187212 DEBUG nova.objects.instance [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.101 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.102 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.135 187212 DEBUG nova.objects.instance [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.152 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.337 187212 DEBUG nova.policy [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:47 compute-0 ovn_controller[95610]: 2025-12-05T12:06:47Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:bb:58 10.100.0.8
Dec 05 12:06:47 compute-0 ovn_controller[95610]: 2025-12-05T12:06:47Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:bb:58 10.100.0.8
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.896 187212 DEBUG nova.compute.manager [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-unplugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.897 187212 DEBUG oslo_concurrency.lockutils [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.898 187212 DEBUG oslo_concurrency.lockutils [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.898 187212 DEBUG oslo_concurrency.lockutils [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.898 187212 DEBUG nova.compute.manager [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] No waiting events found dispatching network-vif-unplugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.899 187212 WARNING nova.compute.manager [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received unexpected event network-vif-unplugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed for instance with vm_state deleted and task_state None.
Dec 05 12:06:47 compute-0 nova_compute[187208]: 2025-12-05 12:06:47.999 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Successfully created port: 8749491f-af83-499c-b823-14496cf1872d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:06:48 compute-0 nova_compute[187208]: 2025-12-05 12:06:48.905 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Successfully updated port: 8749491f-af83-499c-b823-14496cf1872d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:06:48 compute-0 nova_compute[187208]: 2025-12-05 12:06:48.931 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:48 compute-0 nova_compute[187208]: 2025-12-05 12:06:48.932 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:48 compute-0 nova_compute[187208]: 2025-12-05 12:06:48.932 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:06:49 compute-0 nova_compute[187208]: 2025-12-05 12:06:49.050 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:49 compute-0 nova_compute[187208]: 2025-12-05 12:06:49.101 187212 WARNING nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it
Dec 05 12:06:50 compute-0 nova_compute[187208]: 2025-12-05 12:06:50.197 187212 DEBUG nova.compute.manager [req-7b28ae3d-2feb-4d2b-8b25-52d37732b78e req-3e2b296e-1e46-47c5-9810-292404045315 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-deleted-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:50 compute-0 nova_compute[187208]: 2025-12-05 12:06:50.197 187212 INFO nova.compute.manager [req-7b28ae3d-2feb-4d2b-8b25-52d37732b78e req-3e2b296e-1e46-47c5-9810-292404045315 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Neutron deleted interface 821e6243-8d28-4c8c-874c-f1e69c7d3bed; detaching it from the instance and deleting it from the info cache
Dec 05 12:06:50 compute-0 nova_compute[187208]: 2025-12-05 12:06:50.197 187212 DEBUG nova.network.neutron [req-7b28ae3d-2feb-4d2b-8b25-52d37732b78e req-3e2b296e-1e46-47c5-9810-292404045315 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Dec 05 12:06:50 compute-0 nova_compute[187208]: 2025-12-05 12:06:50.200 187212 DEBUG nova.compute.manager [req-7b28ae3d-2feb-4d2b-8b25-52d37732b78e req-3e2b296e-1e46-47c5-9810-292404045315 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Detach interface failed, port_id=821e6243-8d28-4c8c-874c-f1e69c7d3bed, reason: Instance 297d72ef-6b79-45b3-813b-52b5144b522e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Dec 05 12:06:50 compute-0 podman[227340]: 2025-12-05 12:06:50.227262053 +0000 UTC m=+0.075407309 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.243 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.472 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.475 187212 DEBUG nova.virt.libvirt.vif [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.476 187212 DEBUG nova.network.os_vif_util [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.477 187212 DEBUG nova.network.os_vif_util [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.477 187212 DEBUG os_vif [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.478 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.478 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.479 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.481 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.481 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8749491f-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.482 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8749491f-af, col_values=(('external_ids', {'iface-id': '8749491f-af83-499c-b823-14496cf1872d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:50:d2', 'vm-uuid': '25918fc4-05ec-4a16-b77f-ca1d352a2763'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.484 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 NetworkManager[55691]: <info>  [1764936412.4852] manager: (tap8749491f-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/222)
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.487 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.489 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.491 187212 INFO os_vif [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af')
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.491 187212 DEBUG nova.virt.libvirt.vif [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.492 187212 DEBUG nova.network.os_vif_util [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.492 187212 DEBUG nova.network.os_vif_util [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.495 187212 DEBUG nova.virt.libvirt.guest [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:83:50:d2"/>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <target dev="tap8749491f-af"/>
Dec 05 12:06:52 compute-0 nova_compute[187208]: </interface>
Dec 05 12:06:52 compute-0 nova_compute[187208]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 05 12:06:52 compute-0 kernel: tap8749491f-af: entered promiscuous mode
Dec 05 12:06:52 compute-0 NetworkManager[55691]: <info>  [1764936412.5137] manager: (tap8749491f-af): new Tun device (/org/freedesktop/NetworkManager/Devices/223)
Dec 05 12:06:52 compute-0 ovn_controller[95610]: 2025-12-05T12:06:52Z|00514|binding|INFO|Claiming lport 8749491f-af83-499c-b823-14496cf1872d for this chassis.
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 ovn_controller[95610]: 2025-12-05T12:06:52Z|00515|binding|INFO|8749491f-af83-499c-b823-14496cf1872d: Claiming fa:16:3e:83:50:d2 10.100.0.14
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.516 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 ovn_controller[95610]: 2025-12-05T12:06:52Z|00516|binding|INFO|Setting lport 8749491f-af83-499c-b823-14496cf1872d ovn-installed in OVS
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.534 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 systemd-udevd[227379]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.546 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:50:d2 10.100.0.14'], port_security=['fa:16:3e:83:50:d2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8749491f-af83-499c-b823-14496cf1872d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:52 compute-0 ovn_controller[95610]: 2025-12-05T12:06:52Z|00517|binding|INFO|Setting lport 8749491f-af83-499c-b823-14496cf1872d up in Southbound
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.548 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8749491f-af83-499c-b823-14496cf1872d in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.551 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:06:52 compute-0 NetworkManager[55691]: <info>  [1764936412.5572] device (tap8749491f-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:06:52 compute-0 NetworkManager[55691]: <info>  [1764936412.5592] device (tap8749491f-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.568 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cf2230c1-f93b-4ab4-a5ab-740e3a4741ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.604 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6f6b7613-375b-485b-97c7-116b255543e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.608 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[689c56a0-620d-43b4-9d93-a5bceabe7e4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.614 187212 DEBUG nova.virt.libvirt.driver [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.614 187212 DEBUG nova.virt.libvirt.driver [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.614 187212 DEBUG nova.virt.libvirt.driver [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:7b:68:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.615 187212 DEBUG nova.virt.libvirt.driver [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:83:50:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.637 187212 DEBUG nova.virt.libvirt.guest [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:06:52</nova:creationTime>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:06:52 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec 05 12:06:52 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     <nova:port uuid="8749491f-af83-499c-b823-14496cf1872d">
Dec 05 12:06:52 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:06:52 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:06:52 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:06:52 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:06:52 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.646 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f88d806e-6dd9-462a-85cc-d3d185f1de23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.666 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[55493d3d-31af-400a-97e0-f7af2146d062]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375484, 'reachable_time': 41038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227388, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.674 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.687 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3d00ca-837c-4a95-be62-86b59a27ec41]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375496, 'tstamp': 375496}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227392, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375499, 'tstamp': 375499}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227392, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.689 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.690 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 nova_compute[187208]: 2025-12-05 12:06:52.692 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.695 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.695 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.695 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.696 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.111 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.281 187212 DEBUG nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.281 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.282 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.282 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.282 187212 DEBUG nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] No waiting events found dispatching network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.282 187212 WARNING nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received unexpected event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed for instance with vm_state deleted and task_state None.
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-changed-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Refreshing instance network info cache due to event network-changed-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:53 compute-0 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG nova.network.neutron [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Refreshing network info cache for port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:54 compute-0 ovn_controller[95610]: 2025-12-05T12:06:54Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:50:d2 10.100.0.14
Dec 05 12:06:54 compute-0 ovn_controller[95610]: 2025-12-05T12:06:54Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:50:d2 10.100.0.14
Dec 05 12:06:54 compute-0 ovn_controller[95610]: 2025-12-05T12:06:54Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cc:8d:e9 10.100.0.10
Dec 05 12:06:54 compute-0 ovn_controller[95610]: 2025-12-05T12:06:54Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cc:8d:e9 10.100.0.10
Dec 05 12:06:54 compute-0 nova_compute[187208]: 2025-12-05 12:06:54.529 187212 DEBUG nova.compute.manager [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:54 compute-0 nova_compute[187208]: 2025-12-05 12:06:54.530 187212 DEBUG nova.compute.manager [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:54 compute-0 nova_compute[187208]: 2025-12-05 12:06:54.530 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:54 compute-0 nova_compute[187208]: 2025-12-05 12:06:54.531 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:54 compute-0 nova_compute[187208]: 2025-12-05 12:06:54.531 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:55 compute-0 ovn_controller[95610]: 2025-12-05T12:06:55Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:d7:ed 10.100.0.9
Dec 05 12:06:56 compute-0 nova_compute[187208]: 2025-12-05 12:06:56.719 187212 DEBUG nova.network.neutron [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updated VIF entry in instance network info cache for port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:56 compute-0 nova_compute[187208]: 2025-12-05 12:06:56.719 187212 DEBUG nova.network.neutron [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updating instance_info_cache with network_info: [{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:56 compute-0 nova_compute[187208]: 2025-12-05 12:06:56.748 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:56 compute-0 nova_compute[187208]: 2025-12-05 12:06:56.861 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:56 compute-0 nova_compute[187208]: 2025-12-05 12:06:56.862 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:56 compute-0 nova_compute[187208]: 2025-12-05 12:06:56.862 187212 INFO nova.compute.manager [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Shelving
Dec 05 12:06:56 compute-0 nova_compute[187208]: 2025-12-05 12:06:56.890 187212 DEBUG nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:06:57 compute-0 podman[227406]: 2025-12-05 12:06:57.215978914 +0000 UTC m=+0.061154196 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 12:06:57 compute-0 podman[227405]: 2025-12-05 12:06:57.225457719 +0000 UTC m=+0.070820336 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.460 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.461 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.485 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.485 187212 DEBUG nova.compute.manager [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-changed-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.485 187212 DEBUG nova.compute.manager [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing instance network info cache due to event network-changed-8749491f-af83-499c-b823-14496cf1872d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.486 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.486 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.486 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing network info cache for port 8749491f-af83-499c-b823-14496cf1872d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.487 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:57 compute-0 nova_compute[187208]: 2025-12-05 12:06:57.677 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.592 187212 DEBUG nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.593 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.593 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.593 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.593 187212 DEBUG nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.594 187212 WARNING nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d for instance with vm_state active and task_state None.
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.594 187212 DEBUG nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.594 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.594 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.595 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.595 187212 DEBUG nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.595 187212 WARNING nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d for instance with vm_state active and task_state None.
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.696 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updated VIF entry in instance network info cache for port 8749491f-af83-499c-b823-14496cf1872d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.697 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.721 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.990 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936403.9883554, 297d72ef-6b79-45b3-813b-52b5144b522e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:06:58 compute-0 nova_compute[187208]: 2025-12-05 12:06:58.990 187212 INFO nova.compute.manager [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] VM Stopped (Lifecycle Event)
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.017 187212 DEBUG nova.compute.manager [None req-fb1722fa-3dbb-49f5-b3b2-9433e42b8c95 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:06:59 compute-0 kernel: tapac02dd63-5a (unregistering): left promiscuous mode
Dec 05 12:06:59 compute-0 NetworkManager[55691]: <info>  [1764936419.0848] device (tapac02dd63-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.088 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:59 compute-0 ovn_controller[95610]: 2025-12-05T12:06:59Z|00518|binding|INFO|Releasing lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b from this chassis (sb_readonly=0)
Dec 05 12:06:59 compute-0 ovn_controller[95610]: 2025-12-05T12:06:59Z|00519|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b down in Southbound
Dec 05 12:06:59 compute-0 ovn_controller[95610]: 2025-12-05T12:06:59Z|00520|binding|INFO|Removing iface tapac02dd63-5a ovn-installed in OVS
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.098 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.102 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c5:99 10.100.0.8'], port_security=['fa:16:3e:6a:c5:99 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d62df5807554f499d26b5fc77ec8603', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5a04f4af-e81b-4661-95ed-5737ffc98cae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a7d298f-265e-44c5-a73a-18dd9ed0b171, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.104 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b in datapath fc6ce614-d0f7-413f-bc3e-26f7271993d9 unbound from our chassis
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.106 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc6ce614-d0f7-413f-bc3e-26f7271993d9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.109 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e91fa82a-1db2-4f34-99c8-685ff480d1e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.109 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 namespace which is not needed anymore
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:59 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Dec 05 12:06:59 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000003e.scope: Consumed 15.067s CPU time.
Dec 05 12:06:59 compute-0 systemd-machined[153543]: Machine qemu-65-instance-0000003e terminated.
Dec 05 12:06:59 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [NOTICE]   (226438) : haproxy version is 2.8.14-c23fe91
Dec 05 12:06:59 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [NOTICE]   (226438) : path to executable is /usr/sbin/haproxy
Dec 05 12:06:59 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [WARNING]  (226438) : Exiting Master process...
Dec 05 12:06:59 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [ALERT]    (226438) : Current worker (226440) exited with code 143 (Terminated)
Dec 05 12:06:59 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [WARNING]  (226438) : All workers exited. Exiting... (0)
Dec 05 12:06:59 compute-0 systemd[1]: libpod-161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c.scope: Deactivated successfully.
Dec 05 12:06:59 compute-0 podman[227469]: 2025-12-05 12:06:59.248922095 +0000 UTC m=+0.042402241 container died 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:06:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c-userdata-shm.mount: Deactivated successfully.
Dec 05 12:06:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-a1f29ee6b8791a73cdc634bfdec56c186b809e5d9ad3d5a603996e4c487e56a0-merged.mount: Deactivated successfully.
Dec 05 12:06:59 compute-0 podman[227469]: 2025-12-05 12:06:59.290909534 +0000 UTC m=+0.084389680 container cleanup 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 12:06:59 compute-0 systemd[1]: libpod-conmon-161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c.scope: Deactivated successfully.
Dec 05 12:06:59 compute-0 podman[227499]: 2025-12-05 12:06:59.360234975 +0000 UTC m=+0.045743408 container remove 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.367 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[10d69b94-5ebb-4427-b1b2-19bc26d9e5f0]: (4, ('Fri Dec  5 12:06:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 (161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c)\n161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c\nFri Dec  5 12:06:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 (161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c)\n161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.369 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac9e4e73-ac02-40fd-9fbf-da42a8e986a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.370 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6ce614-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.372 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:59 compute-0 kernel: tapfc6ce614-d0: left promiscuous mode
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.387 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.391 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce581071-182b-4a00-bf88-0783172f027c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.407 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0345c101-f181-446e-8ad4-19c826dfecc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.409 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91e883af-6979-4d4b-ad06-247f05f2240c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.426 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2789a0da-6c83-4c17-b254-0aa8c4441732]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375872, 'reachable_time': 21824, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227531, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.429 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:06:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.430 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[76c32303-a1b2-478b-a633-bcdf9c7ac9f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:06:59 compute-0 systemd[1]: run-netns-ovnmeta\x2dfc6ce614\x2dd0f7\x2d413f\x2dbc3e\x2d26f7271993d9.mount: Deactivated successfully.
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.908 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance shutdown successfully after 3 seconds.
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.914 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance destroyed successfully.
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.915 187212 DEBUG nova.objects.instance [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.951 187212 DEBUG nova.objects.instance [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'flavor' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.984 187212 DEBUG oslo_concurrency.lockutils [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:06:59 compute-0 nova_compute[187208]: 2025-12-05 12:06:59.984 187212 DEBUG oslo_concurrency.lockutils [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.201 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Beginning cold snapshot process
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.279 187212 DEBUG nova.compute.manager [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-unplugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.279 187212 DEBUG oslo_concurrency.lockutils [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.279 187212 DEBUG oslo_concurrency.lockutils [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.280 187212 DEBUG oslo_concurrency.lockutils [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.280 187212 DEBUG nova.compute.manager [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-unplugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.280 187212 WARNING nova.compute.manager [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-unplugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state active and task_state shelving_image_pending_upload.
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.367 187212 DEBUG nova.privsep.utils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.368 187212 DEBUG oslo_concurrency.processutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk /var/lib/nova/instances/snapshots/tmp5j8wm6k8/adb9d9dddeed4a54b09e9bedf8cd0b63 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.994 187212 DEBUG oslo_concurrency.processutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk /var/lib/nova/instances/snapshots/tmp5j8wm6k8/adb9d9dddeed4a54b09e9bedf8cd0b63" returned: 0 in 0.626s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:00 compute-0 nova_compute[187208]: 2025-12-05 12:07:00.995 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Snapshot extracted, beginning image upload
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.405 187212 DEBUG nova.network.neutron [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.620 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-8749491f-af83-499c-b823-14496cf1872d" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.620 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-8749491f-af83-499c-b823-14496cf1872d" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.643 187212 DEBUG nova.objects.instance [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.672 187212 DEBUG nova.virt.libvirt.vif [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.673 187212 DEBUG nova.network.os_vif_util [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.675 187212 DEBUG nova.network.os_vif_util [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.680 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.681 187212 DEBUG nova.compute.manager [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.682 187212 DEBUG nova.compute.manager [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.682 187212 DEBUG oslo_concurrency.lockutils [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.683 187212 DEBUG oslo_concurrency.lockutils [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.683 187212 DEBUG nova.network.neutron [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.686 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.688 187212 DEBUG nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Attempting to detach device tap8749491f-af from instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.688 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:83:50:d2"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <target dev="tap8749491f-af"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]: </interface>
Dec 05 12:07:01 compute-0 nova_compute[187208]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.758 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.763 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface>not found in domain: <domain type='kvm' id='64'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <name>instance-0000003c</name>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <uuid>25918fc4-05ec-4a16-b77f-ca1d352a2763</uuid>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:06:52</nova:creationTime>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:port uuid="8749491f-af83-499c-b823-14496cf1872d">
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:07:01 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <memory unit='KiB'>131072</memory>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <vcpu placement='static'>1</vcpu>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <resource>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <partition>/machine</partition>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </resource>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <sysinfo type='smbios'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <system>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='manufacturer'>RDO</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='serial'>25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='uuid'>25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='family'>Virtual Machine</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </system>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <os>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <boot dev='hd'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <smbios mode='sysinfo'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </os>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <features>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <vmcoreinfo state='on'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </features>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <vendor>AMD</vendor>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='x2apic'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc-deadline'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='hypervisor'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc_adjust'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='spec-ctrl'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='stibp'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='ssbd'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='cmp_legacy'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='overflow-recov'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='succor'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='ibrs'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='amd-ssbd'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='virt-ssbd'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='lbrv'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='tsc-scale'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='vmcb-clean'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='flushbyasid'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='pause-filter'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='pfthreshold'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='xsaves'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='svm'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='topoext'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='npt'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='nrip-save'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <clock offset='utc'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <timer name='hpet' present='no'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <on_poweroff>destroy</on_poweroff>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <on_reboot>restart</on_reboot>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <on_crash>destroy</on_crash>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <disk type='file' device='disk'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk' index='2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <backingStore type='file' index='3'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:         <format type='raw'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:         <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:         <backingStore/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       </backingStore>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target dev='vda' bus='virtio'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='virtio-disk0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <disk type='file' device='cdrom'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config' index='1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <backingStore/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target dev='sda' bus='sata'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <readonly/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='sata0-0-0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pcie.0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='1' port='0x10'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='2' port='0x11'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='3' port='0x12'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.3'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='4' port='0x13'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.4'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='5' port='0x14'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.5'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='6' port='0x15'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='7' port='0x16'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='8' port='0x17'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.8'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='9' port='0x18'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.9'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='10' port='0x19'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.10'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='11' port='0x1a'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.11'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='12' port='0x1b'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.12'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='13' port='0x1c'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.13'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='14' port='0x1d'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.14'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='15' port='0x1e'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.15'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='16' port='0x1f'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.16'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='17' port='0x20'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.17'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='18' port='0x21'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.18'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='19' port='0x22'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.19'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='20' port='0x23'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.20'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='21' port='0x24'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.21'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='22' port='0x25'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.22'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='23' port='0x26'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.23'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='24' port='0x27'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.24'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='25' port='0x28'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.25'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-pci-bridge'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.26'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='usb'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='sata' index='0'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='ide'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:7b:68:b7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target dev='tap2064bfa7-12'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='net0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:83:50:d2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target dev='tap8749491f-af'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='net1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <serial type='pty'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <source path='/dev/pts/6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log' append='off'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target type='isa-serial' port='0'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:         <model name='isa-serial'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       </target>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <console type='pty' tty='/dev/pts/6'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <source path='/dev/pts/6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log' append='off'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target type='serial' port='0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </console>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <input type='tablet' bus='usb'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='input0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='usb' bus='0' port='1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </input>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <input type='mouse' bus='ps2'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='input1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </input>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <input type='keyboard' bus='ps2'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='input2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </input>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <graphics type='vnc' port='5906' autoport='yes' listen='::0'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <listen type='address' address='::0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </graphics>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <audio id='1' type='none'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <video>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='video0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </video>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <watchdog model='itco' action='reset'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='watchdog0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </watchdog>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <memballoon model='virtio'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <stats period='10'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='balloon0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <rng model='virtio'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <backend model='random'>/dev/urandom</backend>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='rng0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <label>system_u:system_r:svirt_t:s0:c16,c952</label>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c16,c952</imagelabel>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <label>+107:+107</label>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <imagelabel>+107:+107</imagelabel>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:07:01 compute-0 nova_compute[187208]: </domain>
Dec 05 12:07:01 compute-0 nova_compute[187208]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.763 187212 INFO nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tap8749491f-af from instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 from the persistent domain config.
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.764 187212 DEBUG nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] (1/8): Attempting to detach device tap8749491f-af with device alias net1 from instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.764 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:83:50:d2"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <target dev="tap8749491f-af"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]: </interface>
Dec 05 12:07:01 compute-0 nova_compute[187208]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 12:07:01 compute-0 kernel: tap8749491f-af (unregistering): left promiscuous mode
Dec 05 12:07:01 compute-0 NetworkManager[55691]: <info>  [1764936421.8760] device (tap8749491f-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.884 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Received event <DeviceRemovedEvent: 1764936421.8838403, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.885 187212 DEBUG nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Start waiting for the detach event from libvirt for device tap8749491f-af with device alias net1 for instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.886 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:07:01 compute-0 ovn_controller[95610]: 2025-12-05T12:07:01Z|00521|binding|INFO|Releasing lport 8749491f-af83-499c-b823-14496cf1872d from this chassis (sb_readonly=0)
Dec 05 12:07:01 compute-0 ovn_controller[95610]: 2025-12-05T12:07:01Z|00522|binding|INFO|Setting lport 8749491f-af83-499c-b823-14496cf1872d down in Southbound
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.888 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:01 compute-0 ovn_controller[95610]: 2025-12-05T12:07:01Z|00523|binding|INFO|Removing iface tap8749491f-af ovn-installed in OVS
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.892 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface>not found in domain: <domain type='kvm' id='64'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <name>instance-0000003c</name>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <uuid>25918fc4-05ec-4a16-b77f-ca1d352a2763</uuid>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:06:52</nova:creationTime>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:port uuid="8749491f-af83-499c-b823-14496cf1872d">
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:07:01 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <memory unit='KiB'>131072</memory>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <vcpu placement='static'>1</vcpu>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <resource>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <partition>/machine</partition>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </resource>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <sysinfo type='smbios'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <system>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='manufacturer'>RDO</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='serial'>25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='uuid'>25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <entry name='family'>Virtual Machine</entry>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </system>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <os>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <boot dev='hd'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <smbios mode='sysinfo'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </os>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <features>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <vmcoreinfo state='on'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </features>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <vendor>AMD</vendor>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='x2apic'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc-deadline'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='hypervisor'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc_adjust'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='spec-ctrl'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='stibp'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='ssbd'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='cmp_legacy'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='overflow-recov'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='succor'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='ibrs'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='amd-ssbd'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='virt-ssbd'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='lbrv'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='tsc-scale'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='vmcb-clean'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='flushbyasid'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='pause-filter'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='pfthreshold'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='xsaves'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='svm'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='require' name='topoext'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='npt'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <feature policy='disable' name='nrip-save'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <clock offset='utc'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <timer name='hpet' present='no'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <on_poweroff>destroy</on_poweroff>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <on_reboot>restart</on_reboot>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <on_crash>destroy</on_crash>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <disk type='file' device='disk'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk' index='2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <backingStore type='file' index='3'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:         <format type='raw'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:         <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:         <backingStore/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       </backingStore>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target dev='vda' bus='virtio'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='virtio-disk0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <disk type='file' device='cdrom'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config' index='1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <backingStore/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target dev='sda' bus='sata'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <readonly/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='sata0-0-0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pcie.0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='1' port='0x10'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='2' port='0x11'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='3' port='0x12'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.3'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='4' port='0x13'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.4'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='5' port='0x14'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.5'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='6' port='0x15'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='7' port='0x16'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='8' port='0x17'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.8'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='9' port='0x18'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.9'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='10' port='0x19'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.10'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='11' port='0x1a'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.11'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='12' port='0x1b'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.12'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='13' port='0x1c'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.13'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='14' port='0x1d'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.14'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='15' port='0x1e'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.15'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='16' port='0x1f'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.16'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='17' port='0x20'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.17'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='18' port='0x21'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.18'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='19' port='0x22'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.19'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='20' port='0x23'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.20'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='21' port='0x24'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.21'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='22' port='0x25'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.22'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='23' port='0x26'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.23'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='24' port='0x27'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.24'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target chassis='25' port='0x28'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.25'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model name='pcie-pci-bridge'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='pci.26'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='usb'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <controller type='sata' index='0'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='ide'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:7b:68:b7'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target dev='tap2064bfa7-12'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='net0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <serial type='pty'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <source path='/dev/pts/6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log' append='off'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target type='isa-serial' port='0'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:         <model name='isa-serial'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       </target>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <console type='pty' tty='/dev/pts/6'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <source path='/dev/pts/6'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log' append='off'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <target type='serial' port='0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </console>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <input type='tablet' bus='usb'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='input0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='usb' bus='0' port='1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </input>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <input type='mouse' bus='ps2'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='input1'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </input>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <input type='keyboard' bus='ps2'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='input2'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </input>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <graphics type='vnc' port='5906' autoport='yes' listen='::0'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <listen type='address' address='::0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </graphics>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <audio id='1' type='none'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <video>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='video0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </video>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <watchdog model='itco' action='reset'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='watchdog0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </watchdog>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <memballoon model='virtio'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <stats period='10'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='balloon0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <rng model='virtio'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <backend model='random'>/dev/urandom</backend>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <alias name='rng0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <label>system_u:system_r:svirt_t:s0:c16,c952</label>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c16,c952</imagelabel>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <label>+107:+107</label>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <imagelabel>+107:+107</imagelabel>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:07:01 compute-0 nova_compute[187208]: </domain>
Dec 05 12:07:01 compute-0 nova_compute[187208]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.892 187212 INFO nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tap8749491f-af from instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 from the live domain config.
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.893 187212 DEBUG nova.virt.libvirt.vif [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.893 187212 DEBUG nova.network.os_vif_util [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.894 187212 DEBUG nova.network.os_vif_util [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.894 187212 DEBUG os_vif [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.895 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.896 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8749491f-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.899 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.903 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:50:d2 10.100.0.14'], port_security=['fa:16:3e:83:50:d2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8749491f-af83-499c-b823-14496cf1872d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.904 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.905 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8749491f-af83-499c-b823-14496cf1872d in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:07:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.907 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.908 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.911 187212 INFO os_vif [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af')
Dec 05 12:07:01 compute-0 nova_compute[187208]: 2025-12-05 12:07:01.912 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:07:01</nova:creationTime>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec 05 12:07:01 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:07:01 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:07:01 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:07:01 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:07:01 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:07:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.921 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9df379-2cd7-4ad2-80a1-e8dc17d4cdc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.945 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e5607d35-b01e-4a17-87ad-789c3d2a2024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.950 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[33d6b5de-7a87-4626-b4eb-ed1ea1b7fc85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.980 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[655aa3ff-4a45-456f-b076-d86fb7142b10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.000 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[57f4937b-9c9d-4d66-a51c-ecb287da83e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375484, 'reachable_time': 41038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227568, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.016 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7c64100c-21f8-4494-a07b-554c4873f062]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375496, 'tstamp': 375496}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227569, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375499, 'tstamp': 375499}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227569, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.018 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:02 compute-0 nova_compute[187208]: 2025-12-05 12:07:02.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:02 compute-0 nova_compute[187208]: 2025-12-05 12:07:02.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.026 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.026 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.026 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.027 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:02 compute-0 nova_compute[187208]: 2025-12-05 12:07:02.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:02 compute-0 nova_compute[187208]: 2025-12-05 12:07:02.844 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:02 compute-0 nova_compute[187208]: 2025-12-05 12:07:02.844 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:02 compute-0 nova_compute[187208]: 2025-12-05 12:07:02.844 187212 DEBUG nova.network.neutron [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:03.013 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:03.014 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.018 187212 DEBUG nova.network.neutron [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.081 187212 DEBUG oslo_concurrency.lockutils [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.081 187212 DEBUG nova.compute.manager [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.082 187212 DEBUG nova.compute.manager [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] network_info to inject: |[{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Dec 05 12:07:03 compute-0 podman[227571]: 2025-12-05 12:07:03.214538381 +0000 UTC m=+0.062841575 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:07:03 compute-0 podman[227572]: 2025-12-05 12:07:03.244011296 +0000 UTC m=+0.089750656 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.433 187212 DEBUG nova.network.neutron [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.435 187212 DEBUG nova.network.neutron [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.456 187212 DEBUG oslo_concurrency.lockutils [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.644 187212 DEBUG nova.compute.manager [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.645 187212 DEBUG oslo_concurrency.lockutils [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.646 187212 DEBUG oslo_concurrency.lockutils [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.646 187212 DEBUG oslo_concurrency.lockutils [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.646 187212 DEBUG nova.compute.manager [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:03 compute-0 nova_compute[187208]: 2025-12-05 12:07:03.646 187212 WARNING nova.compute.manager [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state active and task_state shelving_image_uploading.
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.002 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.002 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.003 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.003 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.003 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.004 187212 INFO nova.compute.manager [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Terminating instance
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.005 187212 DEBUG nova.compute.manager [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:07:04 compute-0 kernel: tap9357c6a6-eb (unregistering): left promiscuous mode
Dec 05 12:07:04 compute-0 NetworkManager[55691]: <info>  [1764936424.0716] device (tap9357c6a6-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:04 compute-0 ovn_controller[95610]: 2025-12-05T12:07:04Z|00524|binding|INFO|Releasing lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 from this chassis (sb_readonly=0)
Dec 05 12:07:04 compute-0 ovn_controller[95610]: 2025-12-05T12:07:04Z|00525|binding|INFO|Setting lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 down in Southbound
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 ovn_controller[95610]: 2025-12-05T12:07:04Z|00526|binding|INFO|Removing iface tap9357c6a6-eb ovn-installed in OVS
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.080 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.098 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.112 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Snapshot image upload complete
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.112 187212 DEBUG nova.compute.manager [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:04 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000037.scope: Deactivated successfully.
Dec 05 12:07:04 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000037.scope: Consumed 15.414s CPU time.
Dec 05 12:07:04 compute-0 systemd-machined[153543]: Machine qemu-59-instance-00000037 terminated.
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.272 187212 INFO nova.virt.libvirt.driver [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance destroyed successfully.
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.273 187212 DEBUG nova.objects.instance [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'resources' on Instance uuid 472c7e2c-bdad-4230-904b-6937ceb872d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.312 187212 INFO nova.network.neutron [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Port 8749491f-af83-499c-b823-14496cf1872d from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.312 187212 DEBUG nova.network.neutron [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.329 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:e8:08 10.100.0.14'], port_security=['fa:16:3e:08:e8:08 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f4c4888-4b32-4259-8441-31af091e0c7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85037de7275442698e604ee3f6283cbc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed3fff5f-a24a-492e-ba85-8f010d446cfc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac2e7e6b-9342-46f8-a910-5de5a261f0a9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9357c6a6-eb6f-4ab9-bfd6-486765004ac5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.331 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 in datapath 0f4c4888-4b32-4259-8441-31af091e0c7d unbound from our chassis
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.333 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f4c4888-4b32-4259-8441-31af091e0c7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[66f061fa-687e-457d-9196-8f9ceb221e5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.335 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d namespace which is not needed anymore
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.404 187212 DEBUG nova.virt.libvirt.vif [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-292918791',display_name='tempest-FloatingIPsAssociationTestJSON-server-292918791',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-292918791',id=55,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-c3vnhg04',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_user_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:05:46Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=472c7e2c-bdad-4230-904b-6937ceb872d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.405 187212 DEBUG nova.network.os_vif_util [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.405 187212 DEBUG nova.network.os_vif_util [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.405 187212 DEBUG os_vif [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.407 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9357c6a6-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.408 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.411 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.413 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.416 187212 INFO os_vif [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb')
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.417 187212 INFO nova.virt.libvirt.driver [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Deleting instance files /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2_del
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.417 187212 INFO nova.virt.libvirt.driver [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Deletion of /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2_del complete
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.423 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.448 187212 INFO nova.compute.manager [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Shelve offloading
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.455 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-8749491f-af83-499c-b823-14496cf1872d" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.459 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance destroyed successfully.
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.459 187212 DEBUG nova.compute.manager [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [NOTICE]   (225592) : haproxy version is 2.8.14-c23fe91
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [NOTICE]   (225592) : path to executable is /usr/sbin/haproxy
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [WARNING]  (225592) : Exiting Master process...
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.462 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.462 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [ALERT]    (225592) : Current worker (225594) exited with code 143 (Terminated)
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [WARNING]  (225592) : All workers exited. Exiting... (0)
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.463 187212 DEBUG nova.network.neutron [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:04 compute-0 systemd[1]: libpod-96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302.scope: Deactivated successfully.
Dec 05 12:07:04 compute-0 podman[227668]: 2025-12-05 12:07:04.472912417 +0000 UTC m=+0.050792755 container died 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:07:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302-userdata-shm.mount: Deactivated successfully.
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.505 187212 INFO nova.compute.manager [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Took 0.50 seconds to destroy the instance on the hypervisor.
Dec 05 12:07:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-346d0c910feecbfb16f9369239d1a7161a45d173e7171bca2bd39f251f209cca-merged.mount: Deactivated successfully.
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.506 187212 DEBUG oslo.service.loopingcall [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.506 187212 DEBUG nova.compute.manager [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.506 187212 DEBUG nova.network.neutron [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:07:04 compute-0 podman[227668]: 2025-12-05 12:07:04.512418533 +0000 UTC m=+0.090298861 container cleanup 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:07:04 compute-0 systemd[1]: libpod-conmon-96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302.scope: Deactivated successfully.
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.569 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.570 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing instance network info cache due to event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.570 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.570 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.570 187212 DEBUG nova.network.neutron [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:04 compute-0 podman[227697]: 2025-12-05 12:07:04.578699386 +0000 UTC m=+0.044302996 container remove 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.584 104579 DEBUG eventlet.wsgi.server [-] (104579) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.584 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[608cd84d-a1cf-4b1c-86e9-7dbe60e28a1e]: (4, ('Fri Dec  5 12:07:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d (96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302)\n96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302\nFri Dec  5 12:07:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d (96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302)\n96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.586 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3cffab9e-e0d3-4f7a-8520-453010f459c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.586 104579 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: Accept: */*
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: Connection: close
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: Content-Type: text/plain
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: Host: 169.254.169.254
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: User-Agent: curl/7.84.0
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: X-Forwarded-For: 10.100.0.10
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: X-Ovn-Network-Id: 59233d66-44e6-47b3-b612-4f7d677af03d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.588 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f4c4888-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.590 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 kernel: tap0f4c4888-40: left promiscuous mode
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.608 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.611 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0829f761-b8c4-465d-9cfe-34bfc240221d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.625 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[844062a8-623d-4dd2-b2ad-135db8e0f0f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.626 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d25625d-8330-4df5-a8d0-85b7628869d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.644 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[178972a3-0ccb-475c-bc0f-ace6aafbf5a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372331, 'reachable_time': 38588, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227715, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.646 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.646 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[3647f202-51d0-46f3-bbc0-843a46dd176c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:04 compute-0 systemd[1]: run-netns-ovnmeta\x2d0f4c4888\x2d4b32\x2d4259\x2d8441\x2d31af091e0c7d.mount: Deactivated successfully.
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.767 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.768 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.768 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.768 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.769 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.770 187212 INFO nova.compute.manager [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Terminating instance
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.771 187212 DEBUG nova.compute.manager [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:07:04 compute-0 kernel: tap2064bfa7-12 (unregistering): left promiscuous mode
Dec 05 12:07:04 compute-0 NetworkManager[55691]: <info>  [1764936424.7963] device (tap2064bfa7-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:04 compute-0 ovn_controller[95610]: 2025-12-05T12:07:04Z|00527|binding|INFO|Releasing lport 2064bfa7-125e-466c-9365-6c0ec6655113 from this chassis (sb_readonly=0)
Dec 05 12:07:04 compute-0 ovn_controller[95610]: 2025-12-05T12:07:04Z|00528|binding|INFO|Setting lport 2064bfa7-125e-466c-9365-6c0ec6655113 down in Southbound
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 ovn_controller[95610]: 2025-12-05T12:07:04Z|00529|binding|INFO|Removing iface tap2064bfa7-12 ovn-installed in OVS
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.809 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.821 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:68:b7 10.100.0.12'], port_security=['fa:16:3e:7b:68:b7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '38cb0acb-7ac3-4fef-baeb-661c59e2e07c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.234'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2064bfa7-125e-466c-9365-6c0ec6655113) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.822 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2064bfa7-125e-466c-9365-6c0ec6655113 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.824 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.825 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7b28e76a-93ab-4501-8ad0-6f9ad4fb4fd9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.826 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace which is not needed anymore
Dec 05 12:07:04 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Dec 05 12:07:04 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000003c.scope: Consumed 13.920s CPU time.
Dec 05 12:07:04 compute-0 systemd-machined[153543]: Machine qemu-64-instance-0000003c terminated.
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [NOTICE]   (226293) : haproxy version is 2.8.14-c23fe91
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [NOTICE]   (226293) : path to executable is /usr/sbin/haproxy
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [WARNING]  (226293) : Exiting Master process...
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [ALERT]    (226293) : Current worker (226296) exited with code 143 (Terminated)
Dec 05 12:07:04 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [WARNING]  (226293) : All workers exited. Exiting... (0)
Dec 05 12:07:04 compute-0 systemd[1]: libpod-b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934.scope: Deactivated successfully.
Dec 05 12:07:04 compute-0 podman[227738]: 2025-12-05 12:07:04.950838115 +0000 UTC m=+0.044408879 container died b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:07:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934-userdata-shm.mount: Deactivated successfully.
Dec 05 12:07:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-39a6a101eec5b4484b9949c620581716165bc1e8dfc205f2cf46df83cc7fa1cb-merged.mount: Deactivated successfully.
Dec 05 12:07:04 compute-0 podman[227738]: 2025-12-05 12:07:04.989315172 +0000 UTC m=+0.082885956 container cleanup b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:07:04 compute-0 nova_compute[187208]: 2025-12-05 12:07:04.995 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:04 compute-0 systemd[1]: libpod-conmon-b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934.scope: Deactivated successfully.
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.040 187212 INFO nova.virt.libvirt.driver [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance destroyed successfully.
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.042 187212 DEBUG nova.objects.instance [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'resources' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.057 187212 DEBUG nova.virt.libvirt.vif [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.058 187212 DEBUG nova.network.os_vif_util [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.058 187212 DEBUG nova.network.os_vif_util [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.059 187212 DEBUG os_vif [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.061 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.061 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2064bfa7-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:05 compute-0 podman[227775]: 2025-12-05 12:07:05.063184635 +0000 UTC m=+0.049319172 container remove b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.119 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.119 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[71fef8b3-d4b8-49f8-8cc2-aef8478b55be]: (4, ('Fri Dec  5 12:07:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934)\nb18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934\nFri Dec  5 12:07:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934)\nb18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.121 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d10c6f82-e83b-453e-aa4b-9479c8ab2ea1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.122 187212 INFO os_vif [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12')
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.122 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.123 187212 INFO nova.virt.libvirt.driver [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Deleting instance files /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763_del
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.124 187212 INFO nova.virt.libvirt.driver [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Deletion of /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763_del complete
Dec 05 12:07:05 compute-0 kernel: tapfbfed6fc-30: left promiscuous mode
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.142 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.145 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0547ba3e-7270-4f6c-960e-0431b69cce28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.160 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05a117c3-d9d0-47f1-963c-80843eaefa7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.161 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[feb9c0e4-b3ec-4f44-9107-22a159ab8700]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.174 187212 INFO nova.compute.manager [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.175 187212 DEBUG oslo.service.loopingcall [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.175 187212 DEBUG nova.compute.manager [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:07:05 compute-0 nova_compute[187208]: 2025-12-05 12:07:05.175 187212 DEBUG nova.network.neutron [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.177 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4909a4-b1a6-42b7-b927-6894cad3b2e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375474, 'reachable_time': 25020, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227801, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.179 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.180 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd70753-53ac-4ac9-87e9-c88ec88d8409]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:05 compute-0 systemd[1]: run-netns-ovnmeta\x2dfbfed6fc\x2d3701\x2d4311\x2da4c2\x2d8c49c5b7584c.mount: Deactivated successfully.
Dec 05 12:07:05 compute-0 haproxy-metadata-proxy-59233d66-44e6-47b3-b612-4f7d677af03d[227143]: 10.100.0.10:41388 [05/Dec/2025:12:07:04.583] listener listener/metadata 0/0/0/1345/1345 200 1655 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.928 104579 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 05 12:07:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.928 104579 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1671 time: 1.3425035
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.091 187212 DEBUG nova.network.neutron [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.111 187212 INFO nova.compute.manager [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Took 1.60 seconds to deallocate network for instance.
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.170 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.171 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.320 187212 DEBUG nova.network.neutron [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.401 187212 DEBUG nova.compute.provider_tree [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.537 187212 DEBUG nova.network.neutron [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updated VIF entry in instance network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.538 187212 DEBUG nova.network.neutron [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.686 187212 DEBUG nova.scheduler.client.report [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.690 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.690 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-unplugged-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.690 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-unplugged-8749491f-af83-499c-b823-14496cf1872d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 WARNING nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-unplugged-8749491f-af83-499c-b823-14496cf1872d for instance with vm_state active and task_state None.
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 WARNING nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d for instance with vm_state active and task_state None.
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-deleted-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.709 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.738 187212 INFO nova.scheduler.client.report [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Deleted allocations for instance 472c7e2c-bdad-4230-904b-6937ceb872d2
Dec 05 12:07:06 compute-0 nova_compute[187208]: 2025-12-05 12:07:06.813 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.046 187212 DEBUG nova.objects.instance [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'flavor' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.068 187212 DEBUG oslo_concurrency.lockutils [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.068 187212 DEBUG oslo_concurrency.lockutils [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.301 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.301 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.302 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.302 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.303 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.304 187212 INFO nova.compute.manager [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Terminating instance
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.305 187212 DEBUG nova.compute.manager [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:07:07 compute-0 kernel: tapd10caa85-df (unregistering): left promiscuous mode
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.335 187212 DEBUG nova.network.neutron [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:07 compute-0 NetworkManager[55691]: <info>  [1764936427.3371] device (tapd10caa85-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.342 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:07 compute-0 ovn_controller[95610]: 2025-12-05T12:07:07Z|00530|binding|INFO|Releasing lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 from this chassis (sb_readonly=0)
Dec 05 12:07:07 compute-0 ovn_controller[95610]: 2025-12-05T12:07:07Z|00531|binding|INFO|Setting lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 down in Southbound
Dec 05 12:07:07 compute-0 ovn_controller[95610]: 2025-12-05T12:07:07Z|00532|binding|INFO|Removing iface tapd10caa85-df ovn-installed in OVS
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.346 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.350 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:8d:e9 10.100.0.10'], port_security=['fa:16:3e:cc:8d:e9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ed00d159-9d70-481e-93be-ea180fea04ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59233d66-44e6-47b3-b612-4f7d677af03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc1fd38e325f4a2caa75aeab79da75d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cb353a76-4787-4857-933e-e95743324e9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f37497c0-7b03-4b0b-94d8-7ed5a2c705cb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.352 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 in datapath 59233d66-44e6-47b3-b612-4f7d677af03d unbound from our chassis
Dec 05 12:07:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.354 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59233d66-44e6-47b3-b612-4f7d677af03d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.354 187212 DEBUG nova.compute.manager [req-46859646-168e-4d33-b072-3ed9aba301ff req-3bc0f9e2-9dc1-4e2c-a6f2-9815a75e122e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-deleted-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.356 187212 INFO nova.compute.manager [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Took 2.18 seconds to deallocate network for instance.
Dec 05 12:07:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.359 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f5846837-b84e-452d-985b-215c52384045]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.360 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d namespace which is not needed anymore
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.364 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:07 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000040.scope: Deactivated successfully.
Dec 05 12:07:07 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000040.scope: Consumed 14.810s CPU time.
Dec 05 12:07:07 compute-0 systemd-machined[153543]: Machine qemu-68-instance-00000040 terminated.
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.409 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.410 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:07 compute-0 podman[227805]: 2025-12-05 12:07:07.427302136 +0000 UTC m=+0.059257011 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.529 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.541 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.573 187212 INFO nova.virt.libvirt.driver [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance destroyed successfully.
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.573 187212 DEBUG nova.objects.instance [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lazy-loading 'resources' on Instance uuid ed00d159-9d70-481e-93be-ea180fea04ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.588 187212 DEBUG nova.virt.libvirt.vif [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=64,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATHw79fzCFS1LAWHUiavQB3gUFaXpS81QU/Ce6wZ4HmvTj5LBGoan0DqDckMccItIq/MaTr8w95EnUae9L4Bz4KldjVTS0oi0uLUNfFAJiLjBukcvGPiZbx9R9d1EWHww==',key_name='tempest-keypair-599091465',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dc1fd38e325f4a2caa75aeab79da75d3',ramdisk_id='',reservation_id='r-cei648o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-303309807',owner_user_name='tempest-ServersV294TestFqdnHostnames-303309807-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='077bcce844cb42a197dcd6100549b7d3',uuid=ed00d159-9d70-481e-93be-ea180fea04ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.588 187212 DEBUG nova.network.os_vif_util [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converting VIF {"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.589 187212 DEBUG nova.network.os_vif_util [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.589 187212 DEBUG os_vif [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.591 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.591 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd10caa85-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.595 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.596 187212 DEBUG nova.compute.provider_tree [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.600 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.603 187212 INFO os_vif [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df')
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.603 187212 INFO nova.virt.libvirt.driver [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Deleting instance files /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba_del
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.604 187212 INFO nova.virt.libvirt.driver [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Deletion of /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba_del complete
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.614 187212 DEBUG nova.scheduler.client.report [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.643 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.656 187212 INFO nova.compute.manager [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.656 187212 DEBUG oslo.service.loopingcall [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.657 187212 DEBUG nova.compute.manager [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.657 187212 DEBUG nova.network.neutron [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.666 187212 INFO nova.scheduler.client.report [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Deleted allocations for instance 25918fc4-05ec-4a16-b77f-ca1d352a2763
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.749 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:07 compute-0 nova_compute[187208]: 2025-12-05 12:07:07.750 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:07 compute-0 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [NOTICE]   (227141) : haproxy version is 2.8.14-c23fe91
Dec 05 12:07:07 compute-0 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [NOTICE]   (227141) : path to executable is /usr/sbin/haproxy
Dec 05 12:07:07 compute-0 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [WARNING]  (227141) : Exiting Master process...
Dec 05 12:07:07 compute-0 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [WARNING]  (227141) : Exiting Master process...
Dec 05 12:07:07 compute-0 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [ALERT]    (227141) : Current worker (227143) exited with code 143 (Terminated)
Dec 05 12:07:07 compute-0 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [WARNING]  (227141) : All workers exited. Exiting... (0)
Dec 05 12:07:07 compute-0 systemd[1]: libpod-44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b.scope: Deactivated successfully.
Dec 05 12:07:07 compute-0 podman[227846]: 2025-12-05 12:07:07.82274168 +0000 UTC m=+0.374429566 container died 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 05 12:07:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b-userdata-shm.mount: Deactivated successfully.
Dec 05 12:07:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-a0d77ec8b1cc024940e91355c3fecada2e5d7bf69ad2fc36a67c677303a54e3e-merged.mount: Deactivated successfully.
Dec 05 12:07:08 compute-0 podman[227846]: 2025-12-05 12:07:08.047829272 +0000 UTC m=+0.599517158 container cleanup 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:07:08 compute-0 systemd[1]: libpod-conmon-44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b.scope: Deactivated successfully.
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.149 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-unplugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.149 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.149 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.149 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.150 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] No waiting events found dispatching network-vif-unplugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.150 187212 WARNING nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received unexpected event network-vif-unplugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 for instance with vm_state deleted and task_state None.
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.150 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.150 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] No waiting events found dispatching network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 WARNING nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received unexpected event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 for instance with vm_state deleted and task_state None.
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-unplugged-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-unplugged-2064bfa7-125e-466c-9365-6c0ec6655113 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 WARNING nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-unplugged-2064bfa7-125e-466c-9365-6c0ec6655113 for instance with vm_state deleted and task_state None.
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.154 187212 WARNING nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 for instance with vm_state deleted and task_state None.
Dec 05 12:07:08 compute-0 podman[227892]: 2025-12-05 12:07:08.16319949 +0000 UTC m=+0.093907486 container remove 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 12:07:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.168 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1b9fd42b-07cd-4ff1-9a28-886cff9d7ce4]: (4, ('Fri Dec  5 12:07:07 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d (44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b)\n44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b\nFri Dec  5 12:07:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d (44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b)\n44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a8fd98-4178-458e-92e1-0ab93697e94b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.171 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59233d66-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:08 compute-0 kernel: tap59233d66-40: left promiscuous mode
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.184 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance destroyed successfully.
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.185 187212 DEBUG nova.objects.instance [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'resources' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.189 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.192 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[33118861-e5c2-4478-8812-5aed35775211]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.201 187212 DEBUG nova.virt.libvirt.vif [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHKhL003clvQeWhyQnRnlaccZLUvEBLEhvImBOCB5geqDizgWJsGjayma/8q9qGL/NiGPTPxEZoxanWZnFRBuZklxJy5hDaSwVjbF4FtdnX9ysLeFgNsQAX0H4LK24ei2Q==',key_name='tempest-keypair-105541899',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member',shelved_at='2025-12-05T12:07:04.112687',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='13b862b8-8b0a-448a-bbba-7d8ef455d2c6'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": 
"fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.202 187212 DEBUG nova.network.os_vif_util [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.202 187212 DEBUG nova.network.os_vif_util [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.203 187212 DEBUG os_vif [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.204 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.204 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac02dd63-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.206 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.207 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.209 187212 INFO os_vif [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a')
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.210 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deleting instance files /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458_del
Dec 05 12:07:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.210 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4ea93c-87e7-485f-8181-0159d87781c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.211 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[726cc51f-1868-4c2c-8084-7ce21434a083]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.216 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deletion of /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458_del complete
Dec 05 12:07:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.225 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d679fd2-9c29-4901-a2f6-0e1ca75b871e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377979, 'reachable_time': 23454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227912, 'error': None, 'target': 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.229 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:07:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.229 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[ddb070e6-c58f-4e09-802e-b73b3362caa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d59233d66\x2d44e6\x2d47b3\x2db612\x2d4f7d677af03d.mount: Deactivated successfully.
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.303 187212 DEBUG nova.network.neutron [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.827 187212 INFO nova.scheduler.client.report [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Deleted allocations for instance 5d70ac2d-111f-4e1b-ac26-3e02849b0458
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.963 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:08 compute-0 nova_compute[187208]: 2025-12-05 12:07:08.963 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.127 187212 DEBUG nova.compute.provider_tree [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.297 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.298 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.304 187212 DEBUG nova.scheduler.client.report [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.336 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.357 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.532 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 12.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.615 187212 DEBUG nova.network.neutron [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.638 187212 INFO nova.compute.manager [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Took 1.98 seconds to deallocate network for instance.
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.644 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.644 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.653 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.653 187212 INFO nova.compute.claims [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.717 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.907 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-deleted-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.908 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-unplugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.908 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.908 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.908 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] No waiting events found dispatching network-vif-unplugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 WARNING nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received unexpected event network-vif-unplugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 for instance with vm_state deleted and task_state None.
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.910 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.910 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] No waiting events found dispatching network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.910 187212 WARNING nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received unexpected event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 for instance with vm_state deleted and task_state None.
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.963 187212 DEBUG nova.compute.provider_tree [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:09 compute-0 nova_compute[187208]: 2025-12-05 12:07:09.986 187212 DEBUG nova.scheduler.client.report [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.007 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.007 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.010 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.047 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.048 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.065 187212 INFO nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.082 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.177 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.178 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.178 187212 INFO nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Creating image(s)
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.179 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.179 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.180 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.194 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.228 187212 DEBUG nova.compute.provider_tree [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.231 187212 DEBUG nova.network.neutron [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.253 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.253 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.254 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.254 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.254 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.256 187212 INFO nova.compute.manager [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Terminating instance
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.257 187212 DEBUG nova.compute.manager [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.259 187212 DEBUG nova.scheduler.client.report [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.264 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.265 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.265 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.279 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.303 187212 DEBUG oslo_concurrency.lockutils [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.304 187212 DEBUG nova.compute.manager [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.304 187212 DEBUG nova.compute.manager [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] network_info to inject: |[{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.308 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.318 187212 DEBUG nova.compute.manager [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.318 187212 DEBUG nova.compute.manager [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing instance network info cache due to event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.319 187212 DEBUG oslo_concurrency.lockutils [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.319 187212 DEBUG oslo_concurrency.lockutils [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.319 187212 DEBUG nova.network.neutron [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.338 187212 INFO nova.scheduler.client.report [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Deleted allocations for instance ed00d159-9d70-481e-93be-ea180fea04ba
Dec 05 12:07:10 compute-0 kernel: tap5683f8a8-69 (unregistering): left promiscuous mode
Dec 05 12:07:10 compute-0 NetworkManager[55691]: <info>  [1764936430.3460] device (tap5683f8a8-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.348 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.349 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:10 compute-0 ovn_controller[95610]: 2025-12-05T12:07:10Z|00533|binding|INFO|Releasing lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 from this chassis (sb_readonly=0)
Dec 05 12:07:10 compute-0 ovn_controller[95610]: 2025-12-05T12:07:10Z|00534|binding|INFO|Setting lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 down in Southbound
Dec 05 12:07:10 compute-0 ovn_controller[95610]: 2025-12-05T12:07:10Z|00535|binding|INFO|Removing iface tap5683f8a8-69 ovn-installed in OVS
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.365 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:3c:38 10.100.0.11'], port_security=['fa:16:3e:d3:3c:38 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.366 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.371 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.375 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.385 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[715a1027-ca91-4efc-8ff9-b6458a79998e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:10 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Dec 05 12:07:10 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000003a.scope: Consumed 14.513s CPU time.
Dec 05 12:07:10 compute-0 systemd-machined[153543]: Machine qemu-62-instance-0000003a terminated.
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.415 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[24ec07fd-35a4-4823-9aa8-1f87b3e5de69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.419 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7b579f1d-2328-4499-9983-ad557e5117c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.440 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk 1073741824" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.441 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.442 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.450 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e7480a-7454-429b-84c3-bb69f957b182]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.466 187212 DEBUG nova.policy [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.470 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8248c1a8-e3f3-4f23-8207-d1d4bdfcc6cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 17, 'rx_bytes': 868, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 17, 'rx_bytes': 868, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227935, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.471 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.487 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[09835604-c3b0-4f3e-8dc0-e61b21f4ca62]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227936, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227936, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.489 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.490 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.501 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.502 187212 DEBUG nova.virt.disk.api [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Checking if we can resize image /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.502 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.541 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.572 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.573 187212 DEBUG nova.virt.disk.api [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Cannot resize image /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.573 187212 DEBUG nova.objects.instance [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 854e3893-3908-4b4a-b29c-7fb4384e4f0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.591 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.592 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Ensure instance console log exists: /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.593 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.594 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.594 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.597 187212 INFO nova.virt.libvirt.driver [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance destroyed successfully.
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.598 187212 DEBUG nova.objects.instance [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'resources' on Instance uuid b81bb939-d14f-4a72-b7fe-95fc5d8810a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.611 187212 DEBUG nova.virt.libvirt.vif [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1462907521',display_name='tempest-ListServerFiltersTestJSON-instance-1462907521',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1462907521',id=58,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-bzpoia2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:01Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=b81bb939-d14f-4a72-b7fe-95fc5d8810a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.612 187212 DEBUG nova.network.os_vif_util [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.613 187212 DEBUG nova.network.os_vif_util [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.613 187212 DEBUG os_vif [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.619 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5683f8a8-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.621 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.629 187212 INFO os_vif [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69')
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.629 187212 INFO nova.virt.libvirt.driver [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Deleting instance files /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1_del
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.630 187212 INFO nova.virt.libvirt.driver [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Deletion of /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1_del complete
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.688 187212 INFO nova.compute.manager [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Took 0.43 seconds to destroy the instance on the hypervisor.
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.688 187212 DEBUG oslo.service.loopingcall [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.689 187212 DEBUG nova.compute.manager [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:07:10 compute-0 nova_compute[187208]: 2025-12-05 12:07:10.689 187212 DEBUG nova.network.neutron [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:07:11 compute-0 nova_compute[187208]: 2025-12-05 12:07:11.814 187212 DEBUG nova.network.neutron [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:11 compute-0 nova_compute[187208]: 2025-12-05 12:07:11.842 187212 INFO nova.compute.manager [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Took 1.15 seconds to deallocate network for instance.
Dec 05 12:07:11 compute-0 nova_compute[187208]: 2025-12-05 12:07:11.900 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:11 compute-0 nova_compute[187208]: 2025-12-05 12:07:11.901 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:11 compute-0 nova_compute[187208]: 2025-12-05 12:07:11.945 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Successfully created port: 1b4ab157-ddea-449c-ab91-983a53dd2045 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:07:12 compute-0 nova_compute[187208]: 2025-12-05 12:07:12.044 187212 DEBUG nova.network.neutron [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updated VIF entry in instance network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:12 compute-0 nova_compute[187208]: 2025-12-05 12:07:12.045 187212 DEBUG nova.network.neutron [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:12 compute-0 nova_compute[187208]: 2025-12-05 12:07:12.061 187212 DEBUG oslo_concurrency.lockutils [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:12 compute-0 nova_compute[187208]: 2025-12-05 12:07:12.129 187212 DEBUG nova.compute.provider_tree [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:12 compute-0 nova_compute[187208]: 2025-12-05 12:07:12.148 187212 DEBUG nova.scheduler.client.report [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:12 compute-0 nova_compute[187208]: 2025-12-05 12:07:12.171 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:12 compute-0 nova_compute[187208]: 2025-12-05 12:07:12.193 187212 INFO nova.scheduler.client.report [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Deleted allocations for instance b81bb939-d14f-4a72-b7fe-95fc5d8810a1
Dec 05 12:07:12 compute-0 nova_compute[187208]: 2025-12-05 12:07:12.265 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:12 compute-0 nova_compute[187208]: 2025-12-05 12:07:12.753 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.056 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.056 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.056 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.057 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.057 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.058 187212 INFO nova.compute.manager [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Terminating instance
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.059 187212 DEBUG nova.compute.manager [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:07:13 compute-0 kernel: tap88c7b630-e8 (unregistering): left promiscuous mode
Dec 05 12:07:13 compute-0 NetworkManager[55691]: <info>  [1764936433.0969] device (tap88c7b630-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:13 compute-0 ovn_controller[95610]: 2025-12-05T12:07:13Z|00536|binding|INFO|Releasing lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 from this chassis (sb_readonly=0)
Dec 05 12:07:13 compute-0 ovn_controller[95610]: 2025-12-05T12:07:13Z|00537|binding|INFO|Setting lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 down in Southbound
Dec 05 12:07:13 compute-0 ovn_controller[95610]: 2025-12-05T12:07:13Z|00538|binding|INFO|Removing iface tap88c7b630-e8 ovn-installed in OVS
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.107 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:13 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Dec 05 12:07:13 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000003d.scope: Consumed 14.458s CPU time.
Dec 05 12:07:13 compute-0 systemd-machined[153543]: Machine qemu-66-instance-0000003d terminated.
Dec 05 12:07:13 compute-0 NetworkManager[55691]: <info>  [1764936433.2772] manager: (tap88c7b630-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/224)
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.294 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:19:b7 10.100.0.7'], port_security=['fa:16:3e:bb:19:b7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0566af06-3837-49db-a95c-47b9857e4e90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5285f99befb24ac285be8e4fc1d18e69', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a5c5fedc-8874-4d17-85d6-f832393ee546', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b689627-4043-49f3-b45a-0160a35a0a18, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=88c7b630-e84b-4a35-8c8f-f934e7cabaf6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.296 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 in datapath 0566af06-3837-49db-a95c-47b9857e4e90 unbound from our chassis
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.297 187212 DEBUG nova.compute.manager [req-192e953a-e699-4f96-8dcf-41dfc6b9c93e req-527f23e1-a804-4feb-97c0-9e590bc0c0f3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-deleted-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.299 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0566af06-3837-49db-a95c-47b9857e4e90, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.300 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c820a59b-ea80-4f11-9684-60337430cf21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.300 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 namespace which is not needed anymore
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.325 187212 INFO nova.virt.libvirt.driver [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance destroyed successfully.
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.326 187212 DEBUG nova.objects.instance [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'resources' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:13 compute-0 podman[227974]: 2025-12-05 12:07:13.378615302 +0000 UTC m=+0.051149285 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.465 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-unplugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.466 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.466 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.466 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.466 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] No waiting events found dispatching network-vif-unplugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 WARNING nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received unexpected event network-vif-unplugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 for instance with vm_state deleted and task_state None.
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.468 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] No waiting events found dispatching network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.468 187212 WARNING nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received unexpected event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 for instance with vm_state deleted and task_state None.
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.468 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-deleted-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.473 187212 DEBUG nova.virt.libvirt.vif [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-2105634627',display_name='tempest-AttachInterfacesUnderV243Test-server-2105634627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-2105634627',id=61,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNE7kQOo1iw7msO5U3UKQiYNUNOuR3489N27cA8/7AyK9hUMINDB4EKPtuAqKWiOpLa6/9d1/JcrFvBfelk3gje2Ue6XSif/X6uD8HtKgekiyZF9ENjW4HKYytyiU96vgQ==',key_name='tempest-keypair-865071651',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5285f99befb24ac285be8e4fc1d18e69',ramdisk_id='',reservation_id='r-93zclce8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-1358924829',owner_user_name='tempest-AttachInterfacesUnderV243Test-1358924829-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6b73160d333a43ed94d4258262e3c2b5',uuid=bcdca3f9-3e24-4209-808c-8093b55e5c2d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.473 187212 DEBUG nova.network.os_vif_util [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converting VIF {"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.474 187212 DEBUG nova.network.os_vif_util [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.474 187212 DEBUG os_vif [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.475 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.475 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88c7b630-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.517 187212 INFO os_vif [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8')
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.517 187212 INFO nova.virt.libvirt.driver [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Deleting instance files /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d_del
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.518 187212 INFO nova.virt.libvirt.driver [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Deletion of /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d_del complete
Dec 05 12:07:13 compute-0 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [NOTICE]   (226578) : haproxy version is 2.8.14-c23fe91
Dec 05 12:07:13 compute-0 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [NOTICE]   (226578) : path to executable is /usr/sbin/haproxy
Dec 05 12:07:13 compute-0 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [WARNING]  (226578) : Exiting Master process...
Dec 05 12:07:13 compute-0 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [ALERT]    (226578) : Current worker (226580) exited with code 143 (Terminated)
Dec 05 12:07:13 compute-0 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [WARNING]  (226578) : All workers exited. Exiting... (0)
Dec 05 12:07:13 compute-0 systemd[1]: libpod-abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4.scope: Deactivated successfully.
Dec 05 12:07:13 compute-0 podman[228010]: 2025-12-05 12:07:13.550706956 +0000 UTC m=+0.168809190 container died abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:07:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4-userdata-shm.mount: Deactivated successfully.
Dec 05 12:07:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-0b07d63045405ef887718f7278761beb53f95ff32a7c0a77fd47c0591ae50b1a-merged.mount: Deactivated successfully.
Dec 05 12:07:13 compute-0 podman[228010]: 2025-12-05 12:07:13.590332056 +0000 UTC m=+0.208434260 container cleanup abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 12:07:13 compute-0 systemd[1]: libpod-conmon-abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4.scope: Deactivated successfully.
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.651 187212 INFO nova.compute.manager [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Took 0.59 seconds to destroy the instance on the hypervisor.
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.651 187212 DEBUG oslo.service.loopingcall [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.651 187212 DEBUG nova.compute.manager [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.652 187212 DEBUG nova.network.neutron [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:07:13 compute-0 podman[228045]: 2025-12-05 12:07:13.689970747 +0000 UTC m=+0.073609787 container remove abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.694 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0fad9c7e-62c2-40ed-b364-cdec4c04b767]: (4, ('Fri Dec  5 12:07:13 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 (abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4)\nabb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4\nFri Dec  5 12:07:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 (abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4)\nabb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.696 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f39ada3-eba3-4f8d-8523-03c8675d16ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.697 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0566af06-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:13 compute-0 kernel: tap0566af06-30: left promiscuous mode
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.698 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:13 compute-0 nova_compute[187208]: 2025-12-05 12:07:13.716 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.720 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[11342a60-1c2c-4df6-ae9d-f90cc1e186c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.740 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[11a1e2e9-ed47-4618-92d6-1083b9bdb7f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.742 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c5955714-e525-450b-8a53-a919179351fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.757 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7b34286c-4f3f-444d-b63b-f793b2e0d080]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375995, 'reachable_time': 24710, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228059, 'error': None, 'target': 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.759 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:07:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.760 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[d4bf975d-8e8b-43dd-b47f-3b8d8e59d527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:13 compute-0 systemd[1]: run-netns-ovnmeta\x2d0566af06\x2d3837\x2d49db\x2da95c\x2d47b9857e4e90.mount: Deactivated successfully.
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.355 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936419.3544672, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.355 187212 INFO nova.compute.manager [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Stopped (Lifecycle Event)
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.371 187212 DEBUG nova.compute.manager [None req-fe8ac904-2f43-41ff-bac4-9cb943b64825 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.578 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Successfully updated port: 1b4ab157-ddea-449c-ab91-983a53dd2045 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.593 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.593 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.594 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.732 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.741 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.741 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.741 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.742 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.742 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.743 187212 INFO nova.compute.manager [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Terminating instance
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.744 187212 DEBUG nova.compute.manager [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:07:14 compute-0 kernel: tapc5cb68aa-e5 (unregistering): left promiscuous mode
Dec 05 12:07:14 compute-0 NetworkManager[55691]: <info>  [1764936434.7777] device (tapc5cb68aa-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:14 compute-0 ovn_controller[95610]: 2025-12-05T12:07:14Z|00539|binding|INFO|Releasing lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 from this chassis (sb_readonly=0)
Dec 05 12:07:14 compute-0 ovn_controller[95610]: 2025-12-05T12:07:14Z|00540|binding|INFO|Setting lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 down in Southbound
Dec 05 12:07:14 compute-0 ovn_controller[95610]: 2025-12-05T12:07:14Z|00541|binding|INFO|Removing iface tapc5cb68aa-e5 ovn-installed in OVS
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.786 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.793 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:a8:16 10.100.0.13'], port_security=['fa:16:3e:8a:a8:16 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c5cb68aa-e5c2-48b0-b9c4-e0542120e065) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.794 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c5cb68aa-e5c2-48b0-b9c4-e0542120e065 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.795 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.811 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b72444c2-a1bf-4edf-91e7-4f53d1227ea2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:14 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000039.scope: Deactivated successfully.
Dec 05 12:07:14 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000039.scope: Consumed 16.217s CPU time.
Dec 05 12:07:14 compute-0 systemd-machined[153543]: Machine qemu-61-instance-00000039 terminated.
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.846 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[24d8407d-e380-404c-a5be-fa4aa5f20899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.850 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[96e1416e-e243-439a-ad8d-fdc214586e63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.877 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[da3f4408-35e5-478b-bcc8-326390c30112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.894 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35ff0772-998b-4f36-8494-c050c123063d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 19, 'rx_bytes': 868, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 19, 'rx_bytes': 868, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228073, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.910 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7824da0b-7539-4102-a607-ffa0aca9cba8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228074, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228074, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.912 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:14 compute-0 nova_compute[187208]: 2025-12-05 12:07:14.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.918 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.918 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.919 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.919 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.011 187212 INFO nova.virt.libvirt.driver [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance destroyed successfully.
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.011 187212 DEBUG nova.objects.instance [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'resources' on Instance uuid 8888dd78-1c78-4065-8536-9a1096bdf57b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.029 187212 DEBUG nova.virt.libvirt.vif [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-2001854085',display_name='tempest-ListServerFiltersTestJSON-instance-2001854085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-2001854085',id=57,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-ubyu8olf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:05:51Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=8888dd78-1c78-4065-8536-9a1096bdf57b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.029 187212 DEBUG nova.network.os_vif_util [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.030 187212 DEBUG nova.network.os_vif_util [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.030 187212 DEBUG os_vif [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.032 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5cb68aa-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.033 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.035 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.036 187212 INFO os_vif [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5')
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.037 187212 INFO nova.virt.libvirt.driver [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Deleting instance files /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b_del
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.038 187212 INFO nova.virt.libvirt.driver [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Deletion of /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b_del complete
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.087 187212 INFO nova.compute.manager [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.088 187212 DEBUG oslo.service.loopingcall [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.088 187212 DEBUG nova.compute.manager [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.088 187212 DEBUG nova.network.neutron [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.936 187212 DEBUG nova.compute.manager [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-changed-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.936 187212 DEBUG nova.compute.manager [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Refreshing instance network info cache due to event network-changed-1b4ab157-ddea-449c-ab91-983a53dd2045. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:15 compute-0 nova_compute[187208]: 2025-12-05 12:07:15.937 187212 DEBUG oslo_concurrency.lockutils [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.031 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-unplugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.032 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.032 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.033 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.033 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] No waiting events found dispatching network-vif-unplugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.033 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-unplugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.033 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.034 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.034 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.034 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.035 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] No waiting events found dispatching network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.035 187212 WARNING nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received unexpected event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 for instance with vm_state active and task_state deleting.
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.077 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Updating instance_info_cache with network_info: [{"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.101 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.101 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance network_info: |[{"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.102 187212 DEBUG oslo_concurrency.lockutils [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.102 187212 DEBUG nova.network.neutron [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Refreshing network info cache for port 1b4ab157-ddea-449c-ab91-983a53dd2045 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.105 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start _get_guest_xml network_info=[{"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.109 187212 WARNING nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.115 187212 DEBUG nova.virt.libvirt.host [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.115 187212 DEBUG nova.virt.libvirt.host [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.122 187212 DEBUG nova.virt.libvirt.host [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.123 187212 DEBUG nova.virt.libvirt.host [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.123 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.124 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.124 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.125 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.125 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.125 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.126 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.126 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.126 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.127 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.127 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.127 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.131 187212 DEBUG nova.virt.libvirt.vif [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-63085993',display_name='tempest-ServerActionsTestOtherB-server-63085993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-63085993',id=65,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-ruwsmmgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:10Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=854e3893-3908-4b4a-b29c-7fb4384e4f0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.131 187212 DEBUG nova.network.os_vif_util [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.132 187212 DEBUG nova.network.os_vif_util [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.133 187212 DEBUG nova.objects.instance [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 854e3893-3908-4b4a-b29c-7fb4384e4f0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.146 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <uuid>854e3893-3908-4b4a-b29c-7fb4384e4f0c</uuid>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <name>instance-00000041</name>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestOtherB-server-63085993</nova:name>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:07:16</nova:creationTime>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:07:16 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:07:16 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:07:16 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:07:16 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:07:16 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:07:16 compute-0 nova_compute[187208]:         <nova:user uuid="4ad1281afc874c0ca55d908d3a6e05a8">tempest-ServerActionsTestOtherB-1759520420-project-member</nova:user>
Dec 05 12:07:16 compute-0 nova_compute[187208]:         <nova:project uuid="58cbd93e463049988ccd6d013893e7d6">tempest-ServerActionsTestOtherB-1759520420</nova:project>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:07:16 compute-0 nova_compute[187208]:         <nova:port uuid="1b4ab157-ddea-449c-ab91-983a53dd2045">
Dec 05 12:07:16 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <system>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <entry name="serial">854e3893-3908-4b4a-b29c-7fb4384e4f0c</entry>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <entry name="uuid">854e3893-3908-4b4a-b29c-7fb4384e4f0c</entry>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     </system>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <os>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   </os>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <features>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   </features>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.config"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:03:e5:0a"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <target dev="tap1b4ab157-dd"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/console.log" append="off"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <video>
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     </video>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:07:16 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:07:16 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:07:16 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:07:16 compute-0 nova_compute[187208]: </domain>
Dec 05 12:07:16 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.147 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Preparing to wait for external event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.147 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.147 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.148 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.148 187212 DEBUG nova.virt.libvirt.vif [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-63085993',display_name='tempest-ServerActionsTestOtherB-server-63085993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-63085993',id=65,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-ruwsmmgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:10Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=854e3893-3908-4b4a-b29c-7fb4384e4f0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.148 187212 DEBUG nova.network.os_vif_util [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.149 187212 DEBUG nova.network.os_vif_util [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.149 187212 DEBUG os_vif [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.150 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.150 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.152 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.152 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b4ab157-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.153 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1b4ab157-dd, col_values=(('external_ids', {'iface-id': '1b4ab157-ddea-449c-ab91-983a53dd2045', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:e5:0a', 'vm-uuid': '854e3893-3908-4b4a-b29c-7fb4384e4f0c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.154 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:16 compute-0 NetworkManager[55691]: <info>  [1764936436.1551] manager: (tap1b4ab157-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.157 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.167 187212 INFO os_vif [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd')
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.219 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.219 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.219 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No VIF found with MAC fa:16:3e:03:e5:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.220 187212 INFO nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Using config drive
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.413 187212 DEBUG nova.network.neutron [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.437 187212 INFO nova.compute.manager [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Took 1.35 seconds to deallocate network for instance.
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.511 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.511 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.643 187212 DEBUG nova.compute.provider_tree [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.663 187212 DEBUG nova.scheduler.client.report [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.671 187212 DEBUG nova.network.neutron [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.695 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.697 187212 INFO nova.compute.manager [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Took 3.05 seconds to deallocate network for instance.
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.732 187212 INFO nova.scheduler.client.report [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Deleted allocations for instance 8888dd78-1c78-4065-8536-9a1096bdf57b
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.758 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.759 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.797 187212 INFO nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Creating config drive at /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.config
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.802 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpokrksgkz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.846 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.923 187212 DEBUG nova.compute.provider_tree [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.931 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpokrksgkz" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.938 187212 DEBUG nova.scheduler.client.report [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.960 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:16 compute-0 nova_compute[187208]: 2025-12-05 12:07:16.984 187212 INFO nova.scheduler.client.report [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Deleted allocations for instance bcdca3f9-3e24-4209-808c-8093b55e5c2d
Dec 05 12:07:17 compute-0 kernel: tap1b4ab157-dd: entered promiscuous mode
Dec 05 12:07:17 compute-0 NetworkManager[55691]: <info>  [1764936437.0053] manager: (tap1b4ab157-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.005 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 ovn_controller[95610]: 2025-12-05T12:07:17Z|00542|binding|INFO|Claiming lport 1b4ab157-ddea-449c-ab91-983a53dd2045 for this chassis.
Dec 05 12:07:17 compute-0 ovn_controller[95610]: 2025-12-05T12:07:17Z|00543|binding|INFO|1b4ab157-ddea-449c-ab91-983a53dd2045: Claiming fa:16:3e:03:e5:0a 10.100.0.13
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.009 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.013 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:e5:0a 10.100.0.13'], port_security=['fa:16:3e:03:e5:0a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df1c03c3-b3c9-47b6-a712-a13948dd510e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=1b4ab157-ddea-449c-ab91-983a53dd2045) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.015 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 1b4ab157-ddea-449c-ab91-983a53dd2045 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.016 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:07:17 compute-0 ovn_controller[95610]: 2025-12-05T12:07:17Z|00544|binding|INFO|Setting lport 1b4ab157-ddea-449c-ab91-983a53dd2045 ovn-installed in OVS
Dec 05 12:07:17 compute-0 ovn_controller[95610]: 2025-12-05T12:07:17Z|00545|binding|INFO|Setting lport 1b4ab157-ddea-449c-ab91-983a53dd2045 up in Southbound
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 systemd-udevd[228113]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.034 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[09c881e8-c8d7-440c-8778-880c5b951109]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 NetworkManager[55691]: <info>  [1764936437.0442] device (tap1b4ab157-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:07:17 compute-0 NetworkManager[55691]: <info>  [1764936437.0452] device (tap1b4ab157-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:07:17 compute-0 systemd-machined[153543]: New machine qemu-70-instance-00000041.
Dec 05 12:07:17 compute-0 systemd[1]: Started Virtual Machine qemu-70-instance-00000041.
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.060 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9840555b-033c-405e-b9e7-536bd77589f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.069 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c75d8726-531e-4739-a33a-ce2c8a62547c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.097 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b43d4c15-ea72-47c2-9aaa-ea9bdfee420d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.114 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e57ec1bf-6b58-40b6-9bd0-87eeee58a93b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228126, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.129 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a611388-c324-46cf-ad3b-fc8bac68e3a2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228127, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228127, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.132 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.135 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.135 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.135 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.136 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.136 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.637 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.638 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.638 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.638 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.638 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.639 187212 INFO nova.compute.manager [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Terminating instance
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.640 187212 DEBUG nova.compute.manager [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:07:17 compute-0 kernel: tap549318e9-e6 (unregistering): left promiscuous mode
Dec 05 12:07:17 compute-0 NetworkManager[55691]: <info>  [1764936437.6703] device (tap549318e9-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.674 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 ovn_controller[95610]: 2025-12-05T12:07:17Z|00546|binding|INFO|Releasing lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 from this chassis (sb_readonly=0)
Dec 05 12:07:17 compute-0 ovn_controller[95610]: 2025-12-05T12:07:17Z|00547|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 down in Southbound
Dec 05 12:07:17 compute-0 ovn_controller[95610]: 2025-12-05T12:07:17Z|00548|binding|INFO|Removing iface tap549318e9-e6 ovn-installed in OVS
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.678 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.684 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '8', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.685 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.688 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.689 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dd928541-1a47-4d49-a25c-ff925e9a986d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.689 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 namespace which is not needed anymore
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.710 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000038.scope: Deactivated successfully.
Dec 05 12:07:17 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000038.scope: Consumed 13.561s CPU time.
Dec 05 12:07:17 compute-0 systemd-machined[153543]: Machine qemu-69-instance-00000038 terminated.
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.755 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [NOTICE]   (225723) : haproxy version is 2.8.14-c23fe91
Dec 05 12:07:17 compute-0 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [NOTICE]   (225723) : path to executable is /usr/sbin/haproxy
Dec 05 12:07:17 compute-0 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [WARNING]  (225723) : Exiting Master process...
Dec 05 12:07:17 compute-0 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [ALERT]    (225723) : Current worker (225725) exited with code 143 (Terminated)
Dec 05 12:07:17 compute-0 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [WARNING]  (225723) : All workers exited. Exiting... (0)
Dec 05 12:07:17 compute-0 systemd[1]: libpod-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b.scope: Deactivated successfully.
Dec 05 12:07:17 compute-0 conmon[225719]: conmon 912e0eba89b1a71753b4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b.scope/container/memory.events
Dec 05 12:07:17 compute-0 podman[228150]: 2025-12-05 12:07:17.837154312 +0000 UTC m=+0.051440884 container died 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.862 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b-userdata-shm.mount: Deactivated successfully.
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.868 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-a0675ec35d20a48a0477b2dc90980940bf1de49a39a39196259a264090cabf69-merged.mount: Deactivated successfully.
Dec 05 12:07:17 compute-0 podman[228150]: 2025-12-05 12:07:17.879693016 +0000 UTC m=+0.093979568 container cleanup 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 12:07:17 compute-0 systemd[1]: libpod-conmon-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b.scope: Deactivated successfully.
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.913 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance destroyed successfully.
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.914 187212 DEBUG nova.objects.instance [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'resources' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.929 187212 DEBUG nova.virt.libvirt.vif [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:41Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.930 187212 DEBUG nova.network.os_vif_util [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.930 187212 DEBUG nova.network.os_vif_util [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.931 187212 DEBUG os_vif [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.935 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap549318e9-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.938 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.941 187212 INFO os_vif [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6')
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.941 187212 INFO nova.virt.libvirt.driver [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Deleting instance files /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf_del
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.942 187212 INFO nova.virt.libvirt.driver [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Deletion of /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf_del complete
Dec 05 12:07:17 compute-0 podman[228189]: 2025-12-05 12:07:17.952963272 +0000 UTC m=+0.048832948 container remove 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.959 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2e03c293-d85d-4840-aab1-563f7af25c3b]: (4, ('Fri Dec  5 12:07:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 (912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b)\n912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b\nFri Dec  5 12:07:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 (912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b)\n912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.961 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d25777-92c4-4734-8a98-996c50eac59f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.962 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.963 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 kernel: tap4a2d11fe-a0: left promiscuous mode
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.965 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:17 compute-0 ovn_controller[95610]: 2025-12-05T12:07:17Z|00549|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec 05 12:07:17 compute-0 ovn_controller[95610]: 2025-12-05T12:07:17Z|00550|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.968 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ca65f1fc-d6aa-4302-9fcf-2f029e3e26db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.986 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54ec0e18-d977-4306-b295-9e08eb1b0d3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.987 187212 INFO nova.compute.manager [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.988 187212 DEBUG oslo.service.loopingcall [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:07:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.988 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f0828aca-894c-4ecd-a81c-70838f38470e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.988 187212 DEBUG nova.compute.manager [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.989 187212 DEBUG nova.network.neutron [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:07:17 compute-0 nova_compute[187208]: 2025-12-05 12:07:17.996 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:18.005 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8613182e-29da-44c3-9f8a-f0fba6302573]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372582, 'reachable_time': 34004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228206, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:18 compute-0 systemd[1]: run-netns-ovnmeta\x2d4a2d11fe\x2da91d\x2d4cf5\x2dbde7\x2d283f0aa52f63.mount: Deactivated successfully.
Dec 05 12:07:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:18.016 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:07:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:18.016 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[894892ca-2ccb-403a-bd63-77736b91bc8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.105 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.106 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.106 187212 INFO nova.compute.manager [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Unshelving
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.188 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.189 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.194 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.209 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.222 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.223 187212 INFO nova.compute.claims [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.232 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936438.2324052, 854e3893-3908-4b4a-b29c-7fb4384e4f0c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.233 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] VM Started (Lifecycle Event)
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.257 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.262 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936438.233243, 854e3893-3908-4b4a-b29c-7fb4384e4f0c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.262 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] VM Paused (Lifecycle Event)
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.281 187212 DEBUG nova.network.neutron [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Updated VIF entry in instance network info cache for port 1b4ab157-ddea-449c-ab91-983a53dd2045. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.282 187212 DEBUG nova.network.neutron [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Updating instance_info_cache with network_info: [{"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.287 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.290 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.304 187212 DEBUG oslo_concurrency.lockutils [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.311 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.426 187212 DEBUG nova.compute.provider_tree [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.438 187212 DEBUG nova.scheduler.client.report [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:18 compute-0 nova_compute[187208]: 2025-12-05 12:07:18.456 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.036 187212 INFO nova.network.neutron [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.270 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936424.2694447, 472c7e2c-bdad-4230-904b-6937ceb872d2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.270 187212 INFO nova.compute.manager [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] VM Stopped (Lifecycle Event)
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.290 187212 DEBUG nova.compute.manager [None req-5290adf5-332a-4bd2-8732-23221d7e73ff - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.524 187212 DEBUG nova.compute.manager [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-deleted-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.525 187212 DEBUG nova.compute.manager [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.525 187212 DEBUG oslo_concurrency.lockutils [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.526 187212 DEBUG oslo_concurrency.lockutils [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.527 187212 DEBUG oslo_concurrency.lockutils [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.527 187212 DEBUG nova.compute.manager [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.528 187212 DEBUG nova.compute.manager [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.621 187212 DEBUG nova.network.neutron [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.645 187212 INFO nova.compute.manager [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Took 1.66 seconds to deallocate network for instance.
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.675 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-unplugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.675 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.676 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.676 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.677 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] No waiting events found dispatching network-vif-unplugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.677 187212 WARNING nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received unexpected event network-vif-unplugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 for instance with vm_state deleted and task_state None.
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.677 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.677 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.678 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.678 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.678 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] No waiting events found dispatching network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.678 187212 WARNING nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received unexpected event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 for instance with vm_state deleted and task_state None.
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.679 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-deleted-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.679 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.679 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.679 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.680 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.680 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Processing event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.681 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:07:19 compute-0 sshd-session[228214]: Received disconnect from 193.46.255.20 port 17440:11:  [preauth]
Dec 05 12:07:19 compute-0 sshd-session[228214]: Disconnected from authenticating user root 193.46.255.20 port 17440 [preauth]
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.685 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936439.684806, 854e3893-3908-4b4a-b29c-7fb4384e4f0c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.685 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] VM Resumed (Lifecycle Event)
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.687 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.883 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.883 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.884 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.887 187212 INFO nova.virt.libvirt.driver [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance spawned successfully.
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.888 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.900 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.910 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.910 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.911 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.911 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.912 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.912 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.924 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.990 187212 INFO nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Took 9.81 seconds to spawn the instance on the hypervisor.
Dec 05 12:07:19 compute-0 nova_compute[187208]: 2025-12-05 12:07:19.991 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.029 187212 DEBUG nova.compute.provider_tree [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.039 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936425.0384374, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.040 187212 INFO nova.compute.manager [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] VM Stopped (Lifecycle Event)
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.046 187212 DEBUG nova.scheduler.client.report [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.055 187212 INFO nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Took 10.44 seconds to build instance.
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.062 187212 DEBUG nova.compute.manager [None req-efafde38-a9ce-4ffe-a95f-1dd94f410d42 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.069 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.073 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.775s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.089 187212 INFO nova.scheduler.client.report [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Deleted allocations for instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.152 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.152 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.155 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.176 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.244 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.245 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.250 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.250 187212 INFO nova.compute.claims [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.402 187212 DEBUG nova.compute.provider_tree [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.416 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.417 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.417 187212 DEBUG nova.network.neutron [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.419 187212 DEBUG nova.scheduler.client.report [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.484 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.484 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.528 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.529 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.557 187212 INFO nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.572 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.657 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.659 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.659 187212 INFO nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Creating image(s)
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.660 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.660 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.661 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.679 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.752 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.753 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.754 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.764 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.825 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.826 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.866 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.867 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.868 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.924 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.925 187212 DEBUG nova.virt.disk.api [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Checking if we can resize image /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.926 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.986 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.987 187212 DEBUG nova.virt.disk.api [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Cannot resize image /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:07:20 compute-0 nova_compute[187208]: 2025-12-05 12:07:20.988 187212 DEBUG nova.objects.instance [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'migration_context' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:21 compute-0 nova_compute[187208]: 2025-12-05 12:07:21.001 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:07:21 compute-0 nova_compute[187208]: 2025-12-05 12:07:21.002 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Ensure instance console log exists: /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:07:21 compute-0 nova_compute[187208]: 2025-12-05 12:07:21.002 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:21 compute-0 nova_compute[187208]: 2025-12-05 12:07:21.003 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:21 compute-0 nova_compute[187208]: 2025-12-05 12:07:21.003 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:21 compute-0 nova_compute[187208]: 2025-12-05 12:07:21.064 187212 DEBUG nova.policy [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:07:21 compute-0 podman[228231]: 2025-12-05 12:07:21.215408813 +0000 UTC m=+0.068431587 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 12:07:21 compute-0 ovn_controller[95610]: 2025-12-05T12:07:21Z|00551|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec 05 12:07:21 compute-0 ovn_controller[95610]: 2025-12-05T12:07:21Z|00552|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:07:21 compute-0 nova_compute[187208]: 2025-12-05 12:07:21.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:21 compute-0 nova_compute[187208]: 2025-12-05 12:07:21.901 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Successfully created port: d596fdf6-011f-43a4-bdb8-e76cc7302187 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.423 187212 INFO nova.compute.manager [None req-6d43865c-2a73-43b9-ac56-288219ab4719 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Get console output
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.515 213424 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.571 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936427.5699482, ed00d159-9d70-481e-93be-ea180fea04ba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.572 187212 INFO nova.compute.manager [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] VM Stopped (Lifecycle Event)
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.595 187212 DEBUG nova.compute.manager [None req-84cb1d41-2932-4279-8526-d136a835395e - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.623 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.624 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.624 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.625 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.625 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.625 187212 WARNING nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state deleted and task_state None.
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.625 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-deleted-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.626 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.626 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing instance network info cache due to event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.626 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.647 187212 DEBUG nova.network.neutron [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.669 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.670 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.671 187212 INFO nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating image(s)
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.672 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.672 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.672 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.673 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.674 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.674 187212 DEBUG nova.network.neutron [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.696 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.697 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.705 187212 DEBUG nova.compute.manager [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.706 187212 DEBUG oslo_concurrency.lockutils [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.706 187212 DEBUG oslo_concurrency.lockutils [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.706 187212 DEBUG oslo_concurrency.lockutils [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.706 187212 DEBUG nova.compute.manager [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] No waiting events found dispatching network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.707 187212 WARNING nova.compute.manager [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received unexpected event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 for instance with vm_state active and task_state None.
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:22 compute-0 nova_compute[187208]: 2025-12-05 12:07:22.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.003 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Successfully updated port: d596fdf6-011f-43a4-bdb8-e76cc7302187 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.012 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.013 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.033 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.033 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.034 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.043 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.116 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.117 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.126 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.126 187212 INFO nova.compute.claims [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.370 187212 DEBUG nova.compute.provider_tree [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.384 187212 DEBUG nova.scheduler.client.report [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.404 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.405 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.475 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.476 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.494 187212 INFO nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.512 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.529 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.617 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.619 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.620 187212 INFO nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Creating image(s)
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.620 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.621 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.622 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.637 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.705 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.706 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.708 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.719 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.795 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.797 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.829 187212 DEBUG nova.policy [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.839 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.840 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.841 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.942 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.943 187212 DEBUG nova.virt.disk.api [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Checking if we can resize image /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:07:23 compute-0 nova_compute[187208]: 2025-12-05 12:07:23.944 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:24 compute-0 nova_compute[187208]: 2025-12-05 12:07:24.004 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:24 compute-0 nova_compute[187208]: 2025-12-05 12:07:24.005 187212 DEBUG nova.virt.disk.api [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Cannot resize image /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:07:24 compute-0 nova_compute[187208]: 2025-12-05 12:07:24.006 187212 DEBUG nova.objects.instance [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'migration_context' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:24 compute-0 nova_compute[187208]: 2025-12-05 12:07:24.034 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:07:24 compute-0 nova_compute[187208]: 2025-12-05 12:07:24.035 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Ensure instance console log exists: /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:07:24 compute-0 nova_compute[187208]: 2025-12-05 12:07:24.035 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:24 compute-0 nova_compute[187208]: 2025-12-05 12:07:24.036 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:24 compute-0 nova_compute[187208]: 2025-12-05 12:07:24.036 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.036 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.098 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.099 187212 DEBUG nova.virt.images [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] 13b862b8-8b0a-448a-bbba-7d8ef455d2c6 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.100 187212 DEBUG nova.privsep.utils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.101 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.part /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.518 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.part /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.converted" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.536 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.565 187212 DEBUG nova.compute.manager [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.566 187212 DEBUG nova.compute.manager [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing instance network info cache due to event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.567 187212 DEBUG oslo_concurrency.lockutils [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.590 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936430.5894876, b81bb939-d14f-4a72-b7fe-95fc5d8810a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.590 187212 INFO nova.compute.manager [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] VM Stopped (Lifecycle Event)
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.606 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.converted --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.607 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.623 187212 DEBUG nova.compute.manager [None req-8820c450-8dd3-4246-9da0-dea368daa4ba - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.624 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.688 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.690 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.691 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.704 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.769 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully created port: f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.773 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.774 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89,backing_fmt=raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.809 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89,backing_fmt=raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.810 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.810 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:25 compute-0 sshd-session[228252]: Connection reset by authenticating user root 45.135.232.92 port 49950 [preauth]
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.868 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.870 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'migration_context' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.887 187212 INFO nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Rebasing disk image.
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.887 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.909 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.930 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.931 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance network_info: |[{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.931 187212 DEBUG oslo_concurrency.lockutils [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.932 187212 DEBUG nova.network.neutron [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.936 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start _get_guest_xml network_info=[{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.941 187212 WARNING nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.945 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.946 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.972 187212 DEBUG nova.virt.libvirt.host [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.973 187212 DEBUG nova.virt.libvirt.host [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.978 187212 DEBUG nova.virt.libvirt.host [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.979 187212 DEBUG nova.virt.libvirt.host [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.980 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.980 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.981 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.981 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.981 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.982 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.982 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.982 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.983 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.983 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.983 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.984 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.989 187212 DEBUG nova.virt.libvirt.vif [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJ
SON-549628149-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:20Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.989 187212 DEBUG nova.network.os_vif_util [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.990 187212 DEBUG nova.network.os_vif_util [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:25 compute-0 nova_compute[187208]: 2025-12-05 12:07:25.991 187212 DEBUG nova.objects.instance [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.009 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <uuid>39a36503-acd4-4199-89f3-2e714ef9e5c5</uuid>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <name>instance-00000042</name>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <nova:name>tempest-SecurityGroupsTestJSON-server-1919324581</nova:name>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:07:25</nova:creationTime>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:07:26 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:07:26 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:07:26 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:07:26 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:07:26 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:07:26 compute-0 nova_compute[187208]:         <nova:user uuid="8db061f8c48141d1ac1c3216db1cc7f8">tempest-SecurityGroupsTestJSON-549628149-project-member</nova:user>
Dec 05 12:07:26 compute-0 nova_compute[187208]:         <nova:project uuid="442a804e3368417d9de1636d533a25e0">tempest-SecurityGroupsTestJSON-549628149</nova:project>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:07:26 compute-0 nova_compute[187208]:         <nova:port uuid="d596fdf6-011f-43a4-bdb8-e76cc7302187">
Dec 05 12:07:26 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <system>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <entry name="serial">39a36503-acd4-4199-89f3-2e714ef9e5c5</entry>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <entry name="uuid">39a36503-acd4-4199-89f3-2e714ef9e5c5</entry>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     </system>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <os>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   </os>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <features>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   </features>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:20:58:3d"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <target dev="tapd596fdf6-01"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/console.log" append="off"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <video>
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     </video>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:07:26 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:07:26 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:07:26 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:07:26 compute-0 nova_compute[187208]: </domain>
Dec 05 12:07:26 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.010 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Preparing to wait for external event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.012 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.012 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.012 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.013 187212 DEBUG nova.virt.libvirt.vif [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityG
roupsTestJSON-549628149-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:20Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.013 187212 DEBUG nova.network.os_vif_util [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.013 187212 DEBUG nova.network.os_vif_util [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.014 187212 DEBUG os_vif [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.015 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.015 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.017 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.018 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd596fdf6-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.018 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd596fdf6-01, col_values=(('external_ids', {'iface-id': 'd596fdf6-011f-43a4-bdb8-e76cc7302187', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:58:3d', 'vm-uuid': '39a36503-acd4-4199-89f3-2e714ef9e5c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:26 compute-0 NetworkManager[55691]: <info>  [1764936446.0209] manager: (tapd596fdf6-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.027 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.028 187212 INFO os_vif [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01')
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.097 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.097 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.098 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No VIF found with MAC fa:16:3e:20:58:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.098 187212 INFO nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Using config drive
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.138 187212 DEBUG nova.network.neutron [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updated VIF entry in instance network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.138 187212 DEBUG nova.network.neutron [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.154 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.808 187212 INFO nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Creating config drive at /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.813 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvcvvu7gy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:26 compute-0 nova_compute[187208]: 2025-12-05 12:07:26.942 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvcvvu7gy" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:27 compute-0 kernel: tapd596fdf6-01: entered promiscuous mode
Dec 05 12:07:27 compute-0 NetworkManager[55691]: <info>  [1764936447.0054] manager: (tapd596fdf6-01): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Dec 05 12:07:27 compute-0 ovn_controller[95610]: 2025-12-05T12:07:27Z|00553|binding|INFO|Claiming lport d596fdf6-011f-43a4-bdb8-e76cc7302187 for this chassis.
Dec 05 12:07:27 compute-0 ovn_controller[95610]: 2025-12-05T12:07:27Z|00554|binding|INFO|d596fdf6-011f-43a4-bdb8-e76cc7302187: Claiming fa:16:3e:20:58:3d 10.100.0.11
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.057 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.063 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:58:3d 10.100.0.11'], port_security=['fa:16:3e:20:58:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '39a36503-acd4-4199-89f3-2e714ef9e5c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d596fdf6-011f-43a4-bdb8-e76cc7302187) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.064 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d596fdf6-011f-43a4-bdb8-e76cc7302187 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 bound to our chassis
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.067 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4
Dec 05 12:07:27 compute-0 ovn_controller[95610]: 2025-12-05T12:07:27Z|00555|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 ovn-installed in OVS
Dec 05 12:07:27 compute-0 ovn_controller[95610]: 2025-12-05T12:07:27Z|00556|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 up in Southbound
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.075 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.079 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.085 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d7179543-d28b-4352-aecc-562524b9def1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:27 compute-0 systemd-udevd[228323]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:07:27 compute-0 NetworkManager[55691]: <info>  [1764936447.1037] device (tapd596fdf6-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:07:27 compute-0 NetworkManager[55691]: <info>  [1764936447.1050] device (tapd596fdf6-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:07:27 compute-0 systemd-machined[153543]: New machine qemu-71-instance-00000042.
Dec 05 12:07:27 compute-0 systemd[1]: Started Virtual Machine qemu-71-instance-00000042.
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.130 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2546f613-dc25-4bb4-8d7e-4e00cbb73866]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.135 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f08c6432-5fda-4f6b-b532-86cdf52b0c72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.175 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ad6a4c-fdd6-4ede-99d4-a81fb64ea1b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.191 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9b190fec-aef9-4663-9736-994538c5a815]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228336, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.208 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[18302545-0aa0-4f1a-99fb-e9f9eee07412]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376710, 'tstamp': 376710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228338, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376713, 'tstamp': 376713}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228338, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.211 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.213 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.214 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.215 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.215 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.548 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk" returned: 0 in 1.602s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.549 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.549 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Ensure instance console log exists: /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.550 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.550 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.550 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.553 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start _get_guest_xml network_info=[{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='28a29f9d03a8dff023fa9db5bc7f166e',container_format='bare',created_at=2025-12-05T12:06:56Z,direct_url=<?>,disk_format='qcow2',id=13b862b8-8b0a-448a-bbba-7d8ef455d2c6,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-795100487-shelved',owner='6d62df5807554f499d26b5fc77ec8603',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-12-05T12:07:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.560 187212 WARNING nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.565 187212 DEBUG nova.virt.libvirt.host [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.566 187212 DEBUG nova.virt.libvirt.host [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.569 187212 DEBUG nova.virt.libvirt.host [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.569 187212 DEBUG nova.virt.libvirt.host [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.569 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.569 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='28a29f9d03a8dff023fa9db5bc7f166e',container_format='bare',created_at=2025-12-05T12:06:56Z,direct_url=<?>,disk_format='qcow2',id=13b862b8-8b0a-448a-bbba-7d8ef455d2c6,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-795100487-shelved',owner='6d62df5807554f499d26b5fc77ec8603',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-12-05T12:07:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.570 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.570 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.570 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.570 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.571 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.571 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.571 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.571 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.572 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.572 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.572 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.636 187212 DEBUG nova.virt.libvirt.vif [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='13b862b8-8b0a-448a-bbba-7d8ef455d2c6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-105541899',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member',shelved_at='2025-12-05T12:07:04.112687',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='13b862b8-8b0a-448a-bbba-7d8ef455d2c6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.637 187212 DEBUG nova.network.os_vif_util [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.637 187212 DEBUG nova.network.os_vif_util [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.638 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.910 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <uuid>5d70ac2d-111f-4e1b-ac26-3e02849b0458</uuid>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <name>instance-0000003e</name>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <nova:name>tempest-AttachVolumeShelveTestJSON-server-795100487</nova:name>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:07:27</nova:creationTime>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:07:27 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:07:27 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:07:27 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:07:27 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:07:27 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:07:27 compute-0 nova_compute[187208]:         <nova:user uuid="bc4332be3b424a5e996b61b244505cfc">tempest-AttachVolumeShelveTestJSON-1858452545-project-member</nova:user>
Dec 05 12:07:27 compute-0 nova_compute[187208]:         <nova:project uuid="6d62df5807554f499d26b5fc77ec8603">tempest-AttachVolumeShelveTestJSON-1858452545</nova:project>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="13b862b8-8b0a-448a-bbba-7d8ef455d2c6"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:07:27 compute-0 nova_compute[187208]:         <nova:port uuid="ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b">
Dec 05 12:07:27 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <system>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <entry name="serial">5d70ac2d-111f-4e1b-ac26-3e02849b0458</entry>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <entry name="uuid">5d70ac2d-111f-4e1b-ac26-3e02849b0458</entry>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     </system>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <os>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   </os>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <features>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   </features>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:6a:c5:99"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <target dev="tapac02dd63-5a"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/console.log" append="off"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <video>
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     </video>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:07:27 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:07:27 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:07:27 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:07:27 compute-0 nova_compute[187208]: </domain>
Dec 05 12:07:27 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.910 187212 DEBUG nova.compute.manager [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Preparing to wait for external event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.910 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.911 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.911 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.912 187212 DEBUG nova.virt.libvirt.vif [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='13b862b8-8b0a-448a-bbba-7d8ef455d2c6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-105541899',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_
hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member',shelved_at='2025-12-05T12:07:04.112687',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='13b862b8-8b0a-448a-bbba-7d8ef455d2c6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.912 187212 DEBUG nova.network.os_vif_util [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.912 187212 DEBUG nova.network.os_vif_util [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.913 187212 DEBUG os_vif [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.913 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.914 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.917 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac02dd63-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.917 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapac02dd63-5a, col_values=(('external_ids', {'iface-id': 'ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:c5:99', 'vm-uuid': '5d70ac2d-111f-4e1b-ac26-3e02849b0458'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.919 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 NetworkManager[55691]: <info>  [1764936447.9198] manager: (tapac02dd63-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.926 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:27 compute-0 nova_compute[187208]: 2025-12-05 12:07:27.926 187212 INFO os_vif [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a')
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.155 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully updated port: f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.164 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.165 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.167 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No VIF found with MAC fa:16:3e:6a:c5:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.168 187212 INFO nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Using config drive
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.211 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936448.2097366, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.211 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Started (Lifecycle Event)
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.214 187212 DEBUG nova.network.neutron [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updated VIF entry in instance network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.214 187212 DEBUG nova.network.neutron [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:28 compute-0 podman[228349]: 2025-12-05 12:07:28.219543271 +0000 UTC m=+0.066254564 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 12:07:28 compute-0 podman[228348]: 2025-12-05 12:07:28.229796208 +0000 UTC m=+0.078686124 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.324 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936433.3228905, bcdca3f9-3e24-4209-808c-8093b55e5c2d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.324 187212 INFO nova.compute.manager [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] VM Stopped (Lifecycle Event)
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.371 187212 DEBUG nova.compute.manager [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.371 187212 DEBUG oslo_concurrency.lockutils [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.371 187212 DEBUG oslo_concurrency.lockutils [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.372 187212 DEBUG oslo_concurrency.lockutils [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.372 187212 DEBUG nova.compute.manager [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Processing event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.373 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.376 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.379 187212 INFO nova.virt.libvirt.driver [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance spawned successfully.
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.379 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.442 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.446 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.446 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.446 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.448 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.449 187212 DEBUG oslo_concurrency.lockutils [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.451 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.503 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.504 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936448.2099152, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.504 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Paused (Lifecycle Event)
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.510 187212 DEBUG nova.compute.manager [None req-815fa428-c21e-488f-96f0-4b03e6268301 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.518 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.518 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.519 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.519 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.519 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.520 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.524 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'keypairs' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.526 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.529 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936448.381738, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.529 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Resumed (Lifecycle Event)
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.549 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.557 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.574 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.581 187212 INFO nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Took 7.92 seconds to spawn the instance on the hypervisor.
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.581 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.634 187212 INFO nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Took 8.41 seconds to build instance.
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.648 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.496s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.845 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.884 187212 INFO nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating config drive at /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config
Dec 05 12:07:28 compute-0 nova_compute[187208]: 2025-12-05 12:07:28.890 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4hv2bkec execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.018 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4hv2bkec" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:29 compute-0 kernel: tapac02dd63-5a: entered promiscuous mode
Dec 05 12:07:29 compute-0 NetworkManager[55691]: <info>  [1764936449.0866] manager: (tapac02dd63-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Dec 05 12:07:29 compute-0 systemd-udevd[228327]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:29 compute-0 ovn_controller[95610]: 2025-12-05T12:07:29Z|00557|binding|INFO|Claiming lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for this chassis.
Dec 05 12:07:29 compute-0 ovn_controller[95610]: 2025-12-05T12:07:29Z|00558|binding|INFO|ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b: Claiming fa:16:3e:6a:c5:99 10.100.0.8
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.095 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:29 compute-0 NetworkManager[55691]: <info>  [1764936449.1034] device (tapac02dd63-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:07:29 compute-0 NetworkManager[55691]: <info>  [1764936449.1044] device (tapac02dd63-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:07:29 compute-0 ovn_controller[95610]: 2025-12-05T12:07:29Z|00559|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b ovn-installed in OVS
Dec 05 12:07:29 compute-0 ovn_controller[95610]: 2025-12-05T12:07:29Z|00560|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b up in Southbound
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.108 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c5:99 10.100.0.8'], port_security=['fa:16:3e:6a:c5:99 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d62df5807554f499d26b5fc77ec8603', 'neutron:revision_number': '7', 'neutron:security_group_ids': '5a04f4af-e81b-4661-95ed-5737ffc98cae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a7d298f-265e-44c5-a73a-18dd9ed0b171, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.109 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b in datapath fc6ce614-d0f7-413f-bc3e-26f7271993d9 bound to our chassis
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.111 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.113 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.123 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b24dc6d3-79dd-4ed8-993b-4e469bb568c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.124 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfc6ce614-d1 in ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.126 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfc6ce614-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.126 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d1900567-7792-46e3-a080-2336e1eaba9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.127 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[46a14a7b-f6e6-40e1-962d-02a62b3117d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 systemd-machined[153543]: New machine qemu-72-instance-0000003e.
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.142 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b0fa45b0-c070-4c49-8273-586640ae8f05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 systemd[1]: Started Virtual Machine qemu-72-instance-0000003e.
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.164 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43daa7d4-ecd8-4139-9f89-19148a6240f4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.200 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1b1bd4aa-fbdc-42f7-b620-03c9082fdd58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.213 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6098934c-c22e-4f1d-b88c-e671619d479b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 NetworkManager[55691]: <info>  [1764936449.2178] manager: (tapfc6ce614-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/231)
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.249 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa64d84-8463-400f-8ca0-c2f357a0ebd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.252 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[259513ff-8b04-4a34-b61a-984efa0f7cf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 NetworkManager[55691]: <info>  [1764936449.2838] device (tapfc6ce614-d0): carrier: link connected
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.291 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[852bfbe5-67b3-412a-afe8-db8cfbab04be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.310 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[00fad784-ba8f-4520-af9e-74d3b90a6242]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc6ce614-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:6b:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382984, 'reachable_time': 22274, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228437, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.324 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6217ad-57ce-40d5-8cc8-6f0961d1b6b8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:6b90'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382984, 'tstamp': 382984}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228438, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.351 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2cabaf7c-e0d5-4151-97f0-d0eec0ee0274]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc6ce614-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:6b:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382984, 'reachable_time': 22274, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228439, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.383 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[18f3c71b-1ed8-4ad2-a57c-51568d7d8c96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.464 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[97ae0480-6004-4017-85f8-f3e3ed288acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.465 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6ce614-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.465 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.465 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc6ce614-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.467 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:29 compute-0 NetworkManager[55691]: <info>  [1764936449.4677] manager: (tapfc6ce614-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Dec 05 12:07:29 compute-0 kernel: tapfc6ce614-d0: entered promiscuous mode
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.472 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc6ce614-d0, col_values=(('external_ids', {'iface-id': '1b193bb7-c39e-445c-9a2c-dd8ee58553b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:29 compute-0 ovn_controller[95610]: 2025-12-05T12:07:29Z|00561|binding|INFO|Releasing lport 1b193bb7-c39e-445c-9a2c-dd8ee58553b9 from this chassis (sb_readonly=0)
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.475 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.475 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.476 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[432b1155-12fe-4d0d-b8c7-a2d1bc43e8ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.477 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:07:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.478 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'env', 'PROCESS_TAG=haproxy-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fc6ce614-d0f7-413f-bc3e-26f7271993d9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:07:29 compute-0 nova_compute[187208]: 2025-12-05 12:07:29.487 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:29 compute-0 sshd-session[228300]: Connection reset by authenticating user root 45.135.232.92 port 22842 [preauth]
Dec 05 12:07:29 compute-0 podman[228472]: 2025-12-05 12:07:29.885780582 +0000 UTC m=+0.059190478 container create 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:07:29 compute-0 systemd[1]: Started libpod-conmon-770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f.scope.
Dec 05 12:07:29 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:07:29 compute-0 podman[228472]: 2025-12-05 12:07:29.853502336 +0000 UTC m=+0.026912262 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:07:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/609c1c748c26ab3742ccbcbaed3a0fb9e3b7ac74e56bc02b438dfce85dc57371/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:07:29 compute-0 podman[228472]: 2025-12-05 12:07:29.995367832 +0000 UTC m=+0.168777728 container init 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 05 12:07:30 compute-0 podman[228472]: 2025-12-05 12:07:30.004891589 +0000 UTC m=+0.178301485 container start 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:07:30 compute-0 nova_compute[187208]: 2025-12-05 12:07:30.010 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936435.0093517, 8888dd78-1c78-4065-8536-9a1096bdf57b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:30 compute-0 nova_compute[187208]: 2025-12-05 12:07:30.011 187212 INFO nova.compute.manager [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] VM Stopped (Lifecycle Event)
Dec 05 12:07:30 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [NOTICE]   (228491) : New worker (228493) forked
Dec 05 12:07:30 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [NOTICE]   (228491) : Loading success.
Dec 05 12:07:30 compute-0 nova_compute[187208]: 2025-12-05 12:07:30.526 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936450.5262105, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:30 compute-0 nova_compute[187208]: 2025-12-05 12:07:30.527 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Started (Lifecycle Event)
Dec 05 12:07:30 compute-0 nova_compute[187208]: 2025-12-05 12:07:30.730 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:30 compute-0 nova_compute[187208]: 2025-12-05 12:07:30.731 187212 DEBUG nova.compute.manager [None req-853b82b1-d298-4fbc-81ac-8e0255a73d3b - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:30 compute-0 nova_compute[187208]: 2025-12-05 12:07:30.734 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936450.5295753, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:30 compute-0 nova_compute[187208]: 2025-12-05 12:07:30.735 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Paused (Lifecycle Event)
Dec 05 12:07:31 compute-0 nova_compute[187208]: 2025-12-05 12:07:31.246 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:31 compute-0 nova_compute[187208]: 2025-12-05 12:07:31.251 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:31 compute-0 ovn_controller[95610]: 2025-12-05T12:07:31Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:e5:0a 10.100.0.13
Dec 05 12:07:31 compute-0 ovn_controller[95610]: 2025-12-05T12:07:31Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:e5:0a 10.100.0.13
Dec 05 12:07:31 compute-0 nova_compute[187208]: 2025-12-05 12:07:31.813 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:07:32 compute-0 sshd-session[228450]: Connection reset by authenticating user root 45.135.232.92 port 22860 [preauth]
Dec 05 12:07:32 compute-0 ovn_controller[95610]: 2025-12-05T12:07:32Z|00562|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec 05 12:07:32 compute-0 ovn_controller[95610]: 2025-12-05T12:07:32Z|00563|binding|INFO|Releasing lport 1b193bb7-c39e-445c-9a2c-dd8ee58553b9 from this chassis (sb_readonly=0)
Dec 05 12:07:32 compute-0 ovn_controller[95610]: 2025-12-05T12:07:32Z|00564|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.761 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.873 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.912 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936437.9086945, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.912 187212 INFO nova.compute.manager [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Stopped (Lifecycle Event)
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.915 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.916 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance network_info: |[{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.918 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start _get_guest_xml network_info=[{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.920 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.925 187212 WARNING nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.931 187212 DEBUG nova.virt.libvirt.host [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.933 187212 DEBUG nova.virt.libvirt.host [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.938 187212 DEBUG nova.virt.libvirt.host [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.939 187212 DEBUG nova.virt.libvirt.host [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.939 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.940 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.940 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.941 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.941 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.942 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.942 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.943 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.944 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.944 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.945 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.946 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.953 187212 DEBUG nova.virt.libvirt.vif [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.954 187212 DEBUG nova.network.os_vif_util [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.955 187212 DEBUG nova.network.os_vif_util [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.957 187212 DEBUG nova.objects.instance [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_devices' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.960 187212 DEBUG nova.compute.manager [None req-ebb36257-f0c5-4f43-a28e-1f4d3fc15521 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.972 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <uuid>f1e72d05-87e7-495d-9dbb-1a10b112c69f</uuid>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <name>instance-00000043</name>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:07:32</nova:creationTime>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:07:32 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:07:32 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:07:32 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:07:32 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:07:32 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:07:32 compute-0 nova_compute[187208]:         <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:07:32 compute-0 nova_compute[187208]:         <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:07:32 compute-0 nova_compute[187208]:         <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec 05 12:07:32 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <system>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <entry name="serial">f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <entry name="uuid">f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     </system>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <os>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   </os>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <features>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   </features>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:01:99:b0"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <target dev="tapf7a6775e-6d"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log" append="off"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <video>
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     </video>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:07:32 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:07:32 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:07:32 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:07:32 compute-0 nova_compute[187208]: </domain>
Dec 05 12:07:32 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.978 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Preparing to wait for external event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.978 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.978 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.979 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.979 187212 DEBUG nova.virt.libvirt.vif [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.980 187212 DEBUG nova.network.os_vif_util [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.981 187212 DEBUG nova.network.os_vif_util [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.981 187212 DEBUG os_vif [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.982 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.983 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.986 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.986 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a6775e-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.987 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7a6775e-6d, col_values=(('external_ids', {'iface-id': 'f7a6775e-6d9c-48e1-91d7-829a6f5f3742', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:99:b0', 'vm-uuid': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.988 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:32 compute-0 NetworkManager[55691]: <info>  [1764936452.9895] manager: (tapf7a6775e-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.990 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.996 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:32 compute-0 nova_compute[187208]: 2025-12-05 12:07:32.997 187212 INFO os_vif [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d')
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.050 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.051 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.051 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:01:99:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.051 187212 INFO nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Using config drive
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.303 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.304 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-f7a6775e-6d9c-48e1-91d7-829a6f5f3742. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.304 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.305 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.305 187212 DEBUG nova.network.neutron [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.801 187212 INFO nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Creating config drive at /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.808 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph3wf633v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:33 compute-0 nova_compute[187208]: 2025-12-05 12:07:33.939 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph3wf633v" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:34 compute-0 sshd-session[228524]: Invalid user monitor from 45.135.232.92 port 22874
Dec 05 12:07:34 compute-0 NetworkManager[55691]: <info>  [1764936454.0304] manager: (tapf7a6775e-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Dec 05 12:07:34 compute-0 kernel: tapf7a6775e-6d: entered promiscuous mode
Dec 05 12:07:34 compute-0 ovn_controller[95610]: 2025-12-05T12:07:34Z|00565|binding|INFO|Claiming lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 for this chassis.
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.079 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:34 compute-0 ovn_controller[95610]: 2025-12-05T12:07:34Z|00566|binding|INFO|f7a6775e-6d9c-48e1-91d7-829a6f5f3742: Claiming fa:16:3e:01:99:b0 10.100.0.7
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.087 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:99:b0 10.100.0.7'], port_security=['fa:16:3e:01:99:b0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '83c79c65-073e-4860-a990-92e9abafc0bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f7a6775e-6d9c-48e1-91d7-829a6f5f3742) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.088 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f7a6775e-6d9c-48e1-91d7-829a6f5f3742 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.090 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:07:34 compute-0 ovn_controller[95610]: 2025-12-05T12:07:34Z|00567|binding|INFO|Setting lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 ovn-installed in OVS
Dec 05 12:07:34 compute-0 ovn_controller[95610]: 2025-12-05T12:07:34Z|00568|binding|INFO|Setting lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 up in Southbound
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.100 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.102 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[168d68e2-18b1-4e6c-b97f-38bfcbcefa89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.103 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbfed6fc-31 in ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.104 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.106 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbfed6fc-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.106 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7e77ab2c-f126-487d-befb-65ab38c31b3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb789ad-fb57-4046-8767-a8540f71e515]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 systemd-machined[153543]: New machine qemu-73-instance-00000043.
Dec 05 12:07:34 compute-0 systemd-udevd[228593]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:07:34 compute-0 systemd[1]: Started Virtual Machine qemu-73-instance-00000043.
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.120 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[da03d46c-8f1c-4082-9bed-fa0a5669206c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 NetworkManager[55691]: <info>  [1764936454.1369] device (tapf7a6775e-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:07:34 compute-0 NetworkManager[55691]: <info>  [1764936454.1382] device (tapf7a6775e-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:07:34 compute-0 podman[228546]: 2025-12-05 12:07:34.143658297 +0000 UTC m=+0.127948913 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.152 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c1b84b-9ea3-43ce-80ee-94188ff59cda]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 podman[228548]: 2025-12-05 12:07:34.155598944 +0000 UTC m=+0.140872749 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.184 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8768fd-cbaf-407e-a55b-3e8a868fbc6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 NetworkManager[55691]: <info>  [1764936454.1971] manager: (tapfbfed6fc-30): new Veth device (/org/freedesktop/NetworkManager/Devices/235)
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.196 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cc089378-684b-4bbe-a7df-25c94f67d3ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.229 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4f756c58-347a-423b-bf33-fe26a0676ff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.233 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b153919e-83f0-4297-a643-6a4f8a5e7e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 NetworkManager[55691]: <info>  [1764936454.2586] device (tapfbfed6fc-30): carrier: link connected
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.265 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[59e43858-6186-4b26-945a-f96fce4155c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.283 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9bf408-f233-475f-95bd-c1a8ad227ef5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 35716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228639, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.297 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[945c38f2-2c3e-4beb-898c-4e963aaf672f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe26:8872'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383482, 'tstamp': 383482}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228640, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[df3024f8-0270-4e53-93d5-5eb1e401f076]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 35716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228641, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.347 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6f01e0-c185-4132-b757-0f7e67f95bc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 sshd-session[228524]: Connection reset by invalid user monitor 45.135.232.92 port 22874 [preauth]
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.418 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5559a484-0036-460d-b826-1bef9768ac7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.420 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.420 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.421 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:34 compute-0 NetworkManager[55691]: <info>  [1764936454.4256] manager: (tapfbfed6fc-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Dec 05 12:07:34 compute-0 kernel: tapfbfed6fc-30: entered promiscuous mode
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.425 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.427 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:34 compute-0 ovn_controller[95610]: 2025-12-05T12:07:34Z|00569|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.430 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.434 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ba0eb845-ead2-4d37-904c-d2aa82e112db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.434 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:07:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.435 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'env', 'PROCESS_TAG=haproxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.443 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.445 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936454.4443576, f1e72d05-87e7-495d-9dbb-1a10b112c69f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.445 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] VM Started (Lifecycle Event)
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.470 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.475 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936454.4447885, f1e72d05-87e7-495d-9dbb-1a10b112c69f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.476 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] VM Paused (Lifecycle Event)
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.503 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.506 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:34 compute-0 nova_compute[187208]: 2025-12-05 12:07:34.536 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:07:34 compute-0 podman[228681]: 2025-12-05 12:07:34.865663879 +0000 UTC m=+0.053371170 container create 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:07:34 compute-0 systemd[1]: Started libpod-conmon-9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b.scope.
Dec 05 12:07:34 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:07:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/186f8522ea6206e72ff71431d61cc132ff2e048571a25807429a54cf15a146be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:07:34 compute-0 podman[228681]: 2025-12-05 12:07:34.834604908 +0000 UTC m=+0.022312229 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:07:34 compute-0 podman[228681]: 2025-12-05 12:07:34.941378226 +0000 UTC m=+0.129085557 container init 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 05 12:07:34 compute-0 podman[228681]: 2025-12-05 12:07:34.946771683 +0000 UTC m=+0.134478984 container start 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:07:34 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [NOTICE]   (228701) : New worker (228703) forked
Dec 05 12:07:34 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [NOTICE]   (228701) : Loading success.
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.125 187212 DEBUG nova.network.neutron [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port f7a6775e-6d9c-48e1-91d7-829a6f5f3742. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.125 187212 DEBUG nova.network.neutron [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.173 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.174 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.174 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.175 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.175 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.175 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] No waiting events found dispatching network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.175 187212 WARNING nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received unexpected event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 for instance with vm_state active and task_state None.
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.176 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.176 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.176 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.176 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.177 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Processing event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.177 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.177 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.177 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.178 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.178 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.178 187212 WARNING nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state shelved_offloaded and task_state spawning.
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.179 187212 DEBUG nova.compute.manager [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.184 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936455.1841257, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.184 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Resumed (Lifecycle Event)
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.186 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.195 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance spawned successfully.
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.206 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.211 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:35 compute-0 nova_compute[187208]: 2025-12-05 12:07:35.256 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.078 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.078 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.100 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.234 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.235 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.244 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.244 187212 INFO nova.compute.claims [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:07:36 compute-0 sshd-session[228658]: Invalid user admin from 45.135.232.92 port 36684
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.461 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.589 187212 DEBUG nova.compute.provider_tree [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.610 187212 DEBUG nova.scheduler.client.report [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.651 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.652 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.681 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.683 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.683 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.683 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.684 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Processing event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.684 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.684 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing instance network info cache due to event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.684 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.685 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.685 187212 DEBUG nova.network.neutron [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.686 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.690 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.692 187212 INFO nova.virt.libvirt.driver [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance spawned successfully.
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.693 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.696 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936456.6958237, f1e72d05-87e7-495d-9dbb-1a10b112c69f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.696 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] VM Resumed (Lifecycle Event)
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.707 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.707 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.720 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.720 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.721 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.721 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.722 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.722 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.728 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.731 187212 INFO nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.735 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.765 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.766 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.820 187212 INFO nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Took 13.20 seconds to spawn the instance on the hypervisor.
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.821 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:36 compute-0 sshd-session[228658]: Connection reset by invalid user admin 45.135.232.92 port 36684 [preauth]
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.897 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.899 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.899 187212 INFO nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Creating image(s)
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.900 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.900 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.901 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.923 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.961 187212 INFO nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Took 13.87 seconds to build instance.
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.990 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.991 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.991 187212 INFO nova.compute.manager [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Rebooting instance
Dec 05 12:07:36 compute-0 nova_compute[187208]: 2025-12-05 12:07:36.993 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.980s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.009 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.012 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.013 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.014 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.015 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.032 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.099 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.100 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.144 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.145 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.146 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.213 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.216 187212 DEBUG nova.virt.disk.api [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Checking if we can resize image /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.216 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.251 187212 DEBUG nova.policy [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.287 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.293 187212 DEBUG nova.virt.disk.api [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Cannot resize image /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.294 187212 DEBUG nova.objects.instance [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.310 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.311 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Ensure instance console log exists: /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.311 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.312 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.312 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.402 187212 DEBUG nova.compute.manager [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.616 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 19.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.837 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:37 compute-0 nova_compute[187208]: 2025-12-05 12:07:37.989 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:38 compute-0 podman[228728]: 2025-12-05 12:07:38.212691344 +0000 UTC m=+0.059257941 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 12:07:38 compute-0 nova_compute[187208]: 2025-12-05 12:07:38.768 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Successfully created port: 29e412e9-d3cc-4af2-b85a-ab48fcad0372 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.028 187212 DEBUG nova.network.neutron [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updated VIF entry in instance network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.029 187212 DEBUG nova.network.neutron [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.062 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.063 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.063 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.063 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.064 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.064 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.064 187212 WARNING nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 for instance with vm_state building and task_state spawning.
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.065 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:39 compute-0 nova_compute[187208]: 2025-12-05 12:07:39.065 187212 DEBUG nova.network.neutron [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:40 compute-0 nova_compute[187208]: 2025-12-05 12:07:40.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:40 compute-0 nova_compute[187208]: 2025-12-05 12:07:40.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:07:40 compute-0 nova_compute[187208]: 2025-12-05 12:07:40.440 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:40 compute-0 nova_compute[187208]: 2025-12-05 12:07:40.551 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:40 compute-0 nova_compute[187208]: 2025-12-05 12:07:40.551 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:40 compute-0 nova_compute[187208]: 2025-12-05 12:07:40.551 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:07:41 compute-0 ovn_controller[95610]: 2025-12-05T12:07:41Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:20:58:3d 10.100.0.11
Dec 05 12:07:41 compute-0 ovn_controller[95610]: 2025-12-05T12:07:41Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:20:58:3d 10.100.0.11
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.104 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Successfully updated port: 29e412e9-d3cc-4af2-b85a-ab48fcad0372 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.126 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.126 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.126 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.225 187212 DEBUG nova.compute.manager [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received event network-changed-29e412e9-d3cc-4af2-b85a-ab48fcad0372 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.226 187212 DEBUG nova.compute.manager [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Refreshing instance network info cache due to event network-changed-29e412e9-d3cc-4af2-b85a-ab48fcad0372. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.226 187212 DEBUG oslo_concurrency.lockutils [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.724 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.821 187212 DEBUG nova.network.neutron [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.840 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:41 compute-0 nova_compute[187208]: 2025-12-05 12:07:41.841 187212 DEBUG nova.compute.manager [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:41 compute-0 kernel: tapd596fdf6-01 (unregistering): left promiscuous mode
Dec 05 12:07:41 compute-0 NetworkManager[55691]: <info>  [1764936461.9954] device (tapd596fdf6-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00570|binding|INFO|Releasing lport d596fdf6-011f-43a4-bdb8-e76cc7302187 from this chassis (sb_readonly=0)
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00571|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 down in Southbound
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00572|binding|INFO|Removing iface tapd596fdf6-01 ovn-installed in OVS
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.019 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:58:3d 10.100.0.11'], port_security=['fa:16:3e:20:58:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '39a36503-acd4-4199-89f3-2e714ef9e5c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ca5c04f9-329e-4501-a23a-78c17c54e4de fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d596fdf6-011f-43a4-bdb8-e76cc7302187) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.020 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d596fdf6-011f-43a4-bdb8-e76cc7302187 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 unbound from our chassis
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.011 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.022 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.041 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d5bd40-0bad-47ca-84a5-53fc9b883c4e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000042.scope: Deactivated successfully.
Dec 05 12:07:42 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000042.scope: Consumed 13.008s CPU time.
Dec 05 12:07:42 compute-0 systemd-machined[153543]: Machine qemu-71-instance-00000042 terminated.
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.078 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[28bc87fb-b749-4259-8d5a-b44ffaea15da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.081 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[65a89004-633d-4cb5-9d1b-abc0ac6a831e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.110 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4f997d39-6283-45e5-8043-556e70b52205]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.124 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[592d3d6f-3e64-4abc-8912-e3abccb5f1ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228764, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.138 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b520aede-01bf-4b69-be96-3cc823695a75]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376710, 'tstamp': 376710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228765, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376713, 'tstamp': 376713}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228765, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.140 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.142 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.148 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.149 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.149 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.149 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.150 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.186 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.194 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.227 187212 INFO nova.virt.libvirt.driver [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance destroyed successfully.
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.228 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'resources' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.244 187212 DEBUG nova.virt.libvirt.vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:41Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.245 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.246 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.246 187212 DEBUG os_vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.249 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.249 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd596fdf6-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.253 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.256 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.258 187212 INFO os_vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01')
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.265 187212 DEBUG nova.virt.libvirt.driver [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start _get_guest_xml network_info=[{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.269 187212 WARNING nova.virt.libvirt.driver [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.277 187212 DEBUG nova.virt.libvirt.host [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.279 187212 DEBUG nova.virt.libvirt.host [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.284 187212 DEBUG nova.virt.libvirt.host [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.285 187212 DEBUG nova.virt.libvirt.host [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.285 187212 DEBUG nova.virt.libvirt.driver [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.285 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.285 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.287 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.287 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.287 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.287 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.304 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.362 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.363 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.363 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.364 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.366 187212 DEBUG nova.virt.libvirt.vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:41Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.366 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.367 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.368 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.387 187212 DEBUG nova.virt.libvirt.driver [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <uuid>39a36503-acd4-4199-89f3-2e714ef9e5c5</uuid>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <name>instance-00000042</name>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <nova:name>tempest-SecurityGroupsTestJSON-server-1919324581</nova:name>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:07:42</nova:creationTime>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:07:42 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:07:42 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:07:42 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:07:42 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:07:42 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:07:42 compute-0 nova_compute[187208]:         <nova:user uuid="8db061f8c48141d1ac1c3216db1cc7f8">tempest-SecurityGroupsTestJSON-549628149-project-member</nova:user>
Dec 05 12:07:42 compute-0 nova_compute[187208]:         <nova:project uuid="442a804e3368417d9de1636d533a25e0">tempest-SecurityGroupsTestJSON-549628149</nova:project>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:07:42 compute-0 nova_compute[187208]:         <nova:port uuid="d596fdf6-011f-43a4-bdb8-e76cc7302187">
Dec 05 12:07:42 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <system>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <entry name="serial">39a36503-acd4-4199-89f3-2e714ef9e5c5</entry>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <entry name="uuid">39a36503-acd4-4199-89f3-2e714ef9e5c5</entry>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     </system>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <os>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   </os>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <features>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   </features>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:20:58:3d"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <target dev="tapd596fdf6-01"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/console.log" append="off"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <video>
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     </video>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:07:42 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:07:42 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:07:42 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:07:42 compute-0 nova_compute[187208]: </domain>
Dec 05 12:07:42 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.388 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.457 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.458 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.518 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.520 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.534 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.597 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.598 187212 DEBUG nova.virt.disk.api [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Checking if we can resize image /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.599 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.663 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.664 187212 DEBUG nova.virt.disk.api [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Cannot resize image /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.664 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'migration_context' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.682 187212 DEBUG nova.virt.libvirt.vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:41Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.683 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.684 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.685 187212 DEBUG os_vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.685 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.686 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.687 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.691 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd596fdf6-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.692 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd596fdf6-01, col_values=(('external_ids', {'iface-id': 'd596fdf6-011f-43a4-bdb8-e76cc7302187', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:58:3d', 'vm-uuid': '39a36503-acd4-4199-89f3-2e714ef9e5c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 NetworkManager[55691]: <info>  [1764936462.7419] manager: (tapd596fdf6-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.742 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.747 187212 INFO os_vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01')
Dec 05 12:07:42 compute-0 kernel: tapd596fdf6-01: entered promiscuous mode
Dec 05 12:07:42 compute-0 systemd-udevd[228755]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:07:42 compute-0 NetworkManager[55691]: <info>  [1764936462.8246] manager: (tapd596fdf6-01): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00573|binding|INFO|Claiming lport d596fdf6-011f-43a4-bdb8-e76cc7302187 for this chassis.
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00574|binding|INFO|d596fdf6-011f-43a4-bdb8-e76cc7302187: Claiming fa:16:3e:20:58:3d 10.100.0.11
Dec 05 12:07:42 compute-0 NetworkManager[55691]: <info>  [1764936462.8393] device (tapd596fdf6-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:07:42 compute-0 NetworkManager[55691]: <info>  [1764936462.8434] device (tapd596fdf6-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.841 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:58:3d 10.100.0.11'], port_security=['fa:16:3e:20:58:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '39a36503-acd4-4199-89f3-2e714ef9e5c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ca5c04f9-329e-4501-a23a-78c17c54e4de fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d596fdf6-011f-43a4-bdb8-e76cc7302187) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.844 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d596fdf6-011f-43a4-bdb8-e76cc7302187 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 bound to our chassis
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.848 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00575|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 ovn-installed in OVS
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00576|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 up in Southbound
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.849 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.854 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00577|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00578|binding|INFO|Releasing lport 1b193bb7-c39e-445c-9a2c-dd8ee58553b9 from this chassis (sb_readonly=0)
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00579|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:07:42 compute-0 ovn_controller[95610]: 2025-12-05T12:07:42Z|00580|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.864 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1d409f7f-1748-41e1-8e23-c24570f7fd79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 systemd-machined[153543]: New machine qemu-74-instance-00000042.
Dec 05 12:07:42 compute-0 systemd[1]: Started Virtual Machine qemu-74-instance-00000042.
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.899 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d4bfe351-eb8b-4dda-888c-6fb10efc959b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.902 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e3de5e47-d2cb-4930-a3e4-a8261bc24802]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.931 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d62d6c-f5a3-478f-bed8-2d777286229d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.951 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4707a2-4008-4dab-8a20-d39f19c8967f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228824, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.977 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[52a8ba84-e2b1-429f-95f5-bbe2a57ccb19]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376710, 'tstamp': 376710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228827, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376713, 'tstamp': 376713}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228827, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.979 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.981 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 nova_compute[187208]: 2025-12-05 12:07:42.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.982 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.983 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.983 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.984 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.043 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Updating instance_info_cache with network_info: [{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.070 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.070 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance network_info: |[{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.071 187212 DEBUG oslo_concurrency.lockutils [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.071 187212 DEBUG nova.network.neutron [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Refreshing network info cache for port 29e412e9-d3cc-4af2-b85a-ab48fcad0372 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.075 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Start _get_guest_xml network_info=[{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.081 187212 WARNING nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.093 187212 DEBUG nova.virt.libvirt.host [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.094 187212 DEBUG nova.virt.libvirt.host [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.105 187212 DEBUG nova.virt.libvirt.host [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.106 187212 DEBUG nova.virt.libvirt.host [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.106 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.107 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.107 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.107 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.108 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.108 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.108 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.108 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.109 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.109 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.109 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.109 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.114 187212 DEBUG nova.virt.libvirt.vif [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1539570170',display_name='tempest-ServerActionsTestOtherB-server-1539570170',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1539570170',id=68,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-bfsh2n18',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsT
estOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:36Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=5659bd52-8c24-483d-80a4-8eb6b28e1349,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.114 187212 DEBUG nova.network.os_vif_util [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.115 187212 DEBUG nova.network.os_vif_util [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.116 187212 DEBUG nova.objects.instance [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.162 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <uuid>5659bd52-8c24-483d-80a4-8eb6b28e1349</uuid>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <name>instance-00000044</name>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestOtherB-server-1539570170</nova:name>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:07:43</nova:creationTime>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:07:43 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:07:43 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:07:43 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:07:43 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:07:43 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:07:43 compute-0 nova_compute[187208]:         <nova:user uuid="4ad1281afc874c0ca55d908d3a6e05a8">tempest-ServerActionsTestOtherB-1759520420-project-member</nova:user>
Dec 05 12:07:43 compute-0 nova_compute[187208]:         <nova:project uuid="58cbd93e463049988ccd6d013893e7d6">tempest-ServerActionsTestOtherB-1759520420</nova:project>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:07:43 compute-0 nova_compute[187208]:         <nova:port uuid="29e412e9-d3cc-4af2-b85a-ab48fcad0372">
Dec 05 12:07:43 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <system>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <entry name="serial">5659bd52-8c24-483d-80a4-8eb6b28e1349</entry>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <entry name="uuid">5659bd52-8c24-483d-80a4-8eb6b28e1349</entry>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     </system>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <os>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   </os>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <features>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   </features>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.config"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:68:32:38"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <target dev="tap29e412e9-d3"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/console.log" append="off"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <video>
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     </video>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:07:43 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:07:43 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:07:43 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:07:43 compute-0 nova_compute[187208]: </domain>
Dec 05 12:07:43 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.163 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Preparing to wait for external event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.163 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.163 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.163 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.164 187212 DEBUG nova.virt.libvirt.vif [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1539570170',display_name='tempest-ServerActionsTestOtherB-server-1539570170',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1539570170',id=68,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-bfsh2n18',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-Serv
erActionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:36Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=5659bd52-8c24-483d-80a4-8eb6b28e1349,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.164 187212 DEBUG nova.network.os_vif_util [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.165 187212 DEBUG nova.network.os_vif_util [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.165 187212 DEBUG os_vif [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.166 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.166 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.169 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29e412e9-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.170 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29e412e9-d3, col_values=(('external_ids', {'iface-id': '29e412e9-d3cc-4af2-b85a-ab48fcad0372', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:32:38', 'vm-uuid': '5659bd52-8c24-483d-80a4-8eb6b28e1349'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:43 compute-0 NetworkManager[55691]: <info>  [1764936463.1721] manager: (tap29e412e9-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.178 187212 INFO os_vif [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3')
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.290 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 39a36503-acd4-4199-89f3-2e714ef9e5c5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.290 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936463.2896805, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.291 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Resumed (Lifecycle Event)
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.293 187212 DEBUG nova.compute.manager [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.295 187212 INFO nova.virt.libvirt.driver [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance rebooted successfully.
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.296 187212 DEBUG nova.compute.manager [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.520 187212 DEBUG nova.compute.manager [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.520 187212 DEBUG nova.compute.manager [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-f7a6775e-6d9c-48e1-91d7-829a6f5f3742. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.521 187212 DEBUG oslo_concurrency.lockutils [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.521 187212 DEBUG oslo_concurrency.lockutils [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.521 187212 DEBUG nova.network.neutron [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.574 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.578 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.602 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.602 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.602 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No VIF found with MAC fa:16:3e:68:32:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.603 187212 INFO nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Using config drive
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.609 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.610 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936463.2913365, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.610 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Started (Lifecycle Event)
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.623 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.634 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:43 compute-0 nova_compute[187208]: 2025-12-05 12:07:43.637 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:44 compute-0 podman[228840]: 2025-12-05 12:07:44.210060978 +0000 UTC m=+0.060761214 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.687 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.783 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.784 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.784 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.785 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.785 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.785 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.785 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.786 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.786 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.786 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.880 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.881 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.881 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.881 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:07:44 compute-0 nova_compute[187208]: 2025-12-05 12:07:44.985 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.122 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.124 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.175 187212 INFO nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Creating config drive at /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.config
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.181 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkht8fwj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.202 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.209 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.277 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.278 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.313 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkht8fwj" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.350 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.359 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 NetworkManager[55691]: <info>  [1764936465.3792] manager: (tap29e412e9-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/240)
Dec 05 12:07:45 compute-0 kernel: tap29e412e9-d3: entered promiscuous mode
Dec 05 12:07:45 compute-0 ovn_controller[95610]: 2025-12-05T12:07:45Z|00581|binding|INFO|Claiming lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 for this chassis.
Dec 05 12:07:45 compute-0 ovn_controller[95610]: 2025-12-05T12:07:45Z|00582|binding|INFO|29e412e9-d3cc-4af2-b85a-ab48fcad0372: Claiming fa:16:3e:68:32:38 10.100.0.6
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.385 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:45 compute-0 ovn_controller[95610]: 2025-12-05T12:07:45Z|00583|binding|INFO|Setting lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 ovn-installed in OVS
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.403 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:45 compute-0 systemd-udevd[228896]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:07:45 compute-0 systemd-machined[153543]: New machine qemu-75-instance-00000044.
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.437 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.438 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 systemd[1]: Started Virtual Machine qemu-75-instance-00000044.
Dec 05 12:07:45 compute-0 NetworkManager[55691]: <info>  [1764936465.4475] device (tap29e412e9-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:07:45 compute-0 NetworkManager[55691]: <info>  [1764936465.4484] device (tap29e412e9-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.512 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.533 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.588 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:32:38 10.100.0.6'], port_security=['fa:16:3e:68:32:38 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5659bd52-8c24-483d-80a4-8eb6b28e1349', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df1c03c3-b3c9-47b6-a712-a13948dd510e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=29e412e9-d3cc-4af2-b85a-ab48fcad0372) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.590 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 29e412e9-d3cc-4af2-b85a-ab48fcad0372 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis
Dec 05 12:07:45 compute-0 ovn_controller[95610]: 2025-12-05T12:07:45Z|00584|binding|INFO|Setting lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 up in Southbound
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.592 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.608 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7038ee-9865-423e-ba3e-4d2d354e0e92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.608 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.609 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.635 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d116ef78-0b7b-4f66-8451-9a5896ff7ca3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.639 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6f73cc-1a61-4e90-a0db-9ec209715629]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.666 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.667 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1e022c-44f7-469b-8275-4b8ceacae915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.682 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.685 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c20b6f1-0456-4d8d-b26b-3b6a2fa60ab8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228921, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.704 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f63db98-e7a4-40df-9dbb-0058e4b9d3e9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228922, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228922, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.706 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.716 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.716 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.716 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.717 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.717 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.779 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.780 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.846 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.856 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.937 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:45 compute-0 nova_compute[187208]: 2025-12-05 12:07:45.938 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.009 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.016 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.064 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936466.0631733, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.065 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Started (Lifecycle Event)
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.090 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.090 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.154 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.162 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936466.0634668, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.162 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Paused (Lifecycle Event)
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.170 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.378 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.383 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.443 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.444 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4805MB free_disk=73.0350456237793GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.444 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.445 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.450 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.484 187212 DEBUG nova.network.neutron [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Updated VIF entry in instance network info cache for port 29e412e9-d3cc-4af2-b85a-ab48fcad0372. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.484 187212 DEBUG nova.network.neutron [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Updating instance_info_cache with network_info: [{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:46.622 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:46.622 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.627 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.653 187212 DEBUG oslo_concurrency.lockutils [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.684 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.684 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance e9f9bf08-7688-4213-91ff-74f2271ec71d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.684 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 854e3893-3908-4b4a-b29c-7fb4384e4f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.684 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 5d70ac2d-111f-4e1b-ac26-3e02849b0458 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 39a36503-acd4-4199-89f3-2e714ef9e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance f1e72d05-87e7-495d-9dbb-1a10b112c69f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 5659bd52-8c24-483d-80a4-8eb6b28e1349 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1408MB phys_disk=79GB used_disk=7GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.863 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.880 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.889 187212 DEBUG nova.compute.manager [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.890 187212 DEBUG nova.compute.manager [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing instance network info cache due to event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.890 187212 DEBUG oslo_concurrency.lockutils [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.890 187212 DEBUG oslo_concurrency.lockutils [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.890 187212 DEBUG nova.network.neutron [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:07:46 compute-0 nova_compute[187208]: 2025-12-05 12:07:46.922 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:47 compute-0 nova_compute[187208]: 2025-12-05 12:07:47.056 187212 DEBUG nova.network.neutron [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port f7a6775e-6d9c-48e1-91d7-829a6f5f3742. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:47 compute-0 nova_compute[187208]: 2025-12-05 12:07:47.056 187212 DEBUG nova.network.neutron [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:47 compute-0 ovn_controller[95610]: 2025-12-05T12:07:47Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:c5:99 10.100.0.8
Dec 05 12:07:47 compute-0 nova_compute[187208]: 2025-12-05 12:07:47.211 187212 DEBUG oslo_concurrency.lockutils [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:47 compute-0 nova_compute[187208]: 2025-12-05 12:07:47.423 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:47 compute-0 nova_compute[187208]: 2025-12-05 12:07:47.855 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:48 compute-0 nova_compute[187208]: 2025-12-05 12:07:48.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:48 compute-0 ovn_controller[95610]: 2025-12-05T12:07:48Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:01:99:b0 10.100.0.7
Dec 05 12:07:48 compute-0 ovn_controller[95610]: 2025-12-05T12:07:48Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:01:99:b0 10.100.0.7
Dec 05 12:07:48 compute-0 nova_compute[187208]: 2025-12-05 12:07:48.918 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:48 compute-0 nova_compute[187208]: 2025-12-05 12:07:48.919 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.040 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.041 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.041 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.041 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.041 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.042 187212 INFO nova.compute.manager [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Terminating instance
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.043 187212 DEBUG nova.compute.manager [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:07:50 compute-0 kernel: tapd596fdf6-01 (unregistering): left promiscuous mode
Dec 05 12:07:50 compute-0 NetworkManager[55691]: <info>  [1764936470.0608] device (tapd596fdf6-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.068 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 ovn_controller[95610]: 2025-12-05T12:07:50Z|00585|binding|INFO|Releasing lport d596fdf6-011f-43a4-bdb8-e76cc7302187 from this chassis (sb_readonly=0)
Dec 05 12:07:50 compute-0 ovn_controller[95610]: 2025-12-05T12:07:50Z|00586|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 down in Southbound
Dec 05 12:07:50 compute-0 ovn_controller[95610]: 2025-12-05T12:07:50Z|00587|binding|INFO|Removing iface tapd596fdf6-01 ovn-installed in OVS
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.080 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.091 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.098 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:58:3d 10.100.0.11'], port_security=['fa:16:3e:20:58:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '39a36503-acd4-4199-89f3-2e714ef9e5c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'af894ac8-cd98-4c47-9a74-1921c6ddcff3 ca5c04f9-329e-4501-a23a-78c17c54e4de fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d596fdf6-011f-43a4-bdb8-e76cc7302187) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.100 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d596fdf6-011f-43a4-bdb8-e76cc7302187 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 unbound from our chassis
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.102 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4
Dec 05 12:07:50 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000042.scope: Deactivated successfully.
Dec 05 12:07:50 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000042.scope: Consumed 7.339s CPU time.
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.118 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b7128b-5959-462a-beb2-32990b465c14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:50 compute-0 systemd-machined[153543]: Machine qemu-74-instance-00000042 terminated.
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.144 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8e393fd9-10bb-4b1b-bcca-654faef0aaf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.147 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ac807f-163e-4f19-91ab-26225337a4d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.175 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[72d6b17e-5a6b-443a-8b71-dedaf3db3c01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.192 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fda1c4d3-36ea-4dd4-8caa-d5089449f25f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228988, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.208 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd80af5-6812-46ae-9ea9-6ba94e8e5cce]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376710, 'tstamp': 376710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228989, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376713, 'tstamp': 376713}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228989, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.210 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.211 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.215 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.217 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.264 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.270 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.307 187212 INFO nova.virt.libvirt.driver [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance destroyed successfully.
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.309 187212 DEBUG nova.objects.instance [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'resources' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.328 187212 DEBUG nova.virt.libvirt.vif [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:43Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.328 187212 DEBUG nova.network.os_vif_util [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.329 187212 DEBUG nova.network.os_vif_util [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.330 187212 DEBUG os_vif [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.333 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd596fdf6-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.335 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.337 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.338 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.341 187212 INFO os_vif [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01')
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.342 187212 INFO nova.virt.libvirt.driver [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Deleting instance files /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5_del
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.342 187212 INFO nova.virt.libvirt.driver [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Deletion of /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5_del complete
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.402 187212 INFO nova.compute.manager [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.403 187212 DEBUG oslo.service.loopingcall [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.403 187212 DEBUG nova.compute.manager [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.403 187212 DEBUG nova.network.neutron [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.450 187212 DEBUG nova.network.neutron [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updated VIF entry in instance network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.450 187212 DEBUG nova.network.neutron [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:50 compute-0 nova_compute[187208]: 2025-12-05 12:07:50.480 187212 DEBUG oslo_concurrency.lockutils [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:07:52 compute-0 nova_compute[187208]: 2025-12-05 12:07:52.031 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:52 compute-0 podman[229007]: 2025-12-05 12:07:52.224598455 +0000 UTC m=+0.063975907 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm)
Dec 05 12:07:52 compute-0 nova_compute[187208]: 2025-12-05 12:07:52.547 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:52 compute-0 nova_compute[187208]: 2025-12-05 12:07:52.905 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:52 compute-0 nova_compute[187208]: 2025-12-05 12:07:52.998 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.201 187212 DEBUG nova.network.neutron [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.218 187212 INFO nova.compute.manager [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Took 2.81 seconds to deallocate network for instance.
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.281 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.281 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.406 187212 DEBUG nova.compute.manager [req-5cc4fc36-9d4a-4daa-aeb4-94654b1b00a1 req-57451ebf-db91-4dd7-9480-89a0975d2215 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-vif-deleted-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.528 187212 DEBUG nova.compute.provider_tree [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.554 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.554 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.556 187212 DEBUG nova.scheduler.client.report [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.605 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.608 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.647 187212 INFO nova.scheduler.client.report [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Deleted allocations for instance 39a36503-acd4-4199-89f3-2e714ef9e5c5
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.719 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.720 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.725 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.726 187212 INFO nova.compute.claims [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.740 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:53 compute-0 nova_compute[187208]: 2025-12-05 12:07:53.994 187212 DEBUG nova.compute.provider_tree [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.023 187212 DEBUG nova.scheduler.client.report [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.051 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.051 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.109 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.110 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.134 187212 INFO nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.171 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.300 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.303 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.304 187212 INFO nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Creating image(s)
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.305 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.305 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.306 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.319 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.382 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.383 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.383 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.395 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.457 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.459 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.499 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.501 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.501 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.568 187212 DEBUG nova.policy [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21ddc7a76417447daa2a5a26cdf17d53', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'feb2d7c8b49945a08355fc4f902f2786', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.572 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.573 187212 DEBUG nova.virt.disk.api [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Checking if we can resize image /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.573 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.635 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.636 187212 DEBUG nova.virt.disk.api [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Cannot resize image /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.637 187212 DEBUG nova.objects.instance [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lazy-loading 'migration_context' on Instance uuid 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.664 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.665 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Ensure instance console log exists: /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.665 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.666 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:54 compute-0 nova_compute[187208]: 2025-12-05 12:07:54.666 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:55 compute-0 nova_compute[187208]: 2025-12-05 12:07:55.337 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:55.624 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.711 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.711 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.711 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.712 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.712 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.713 187212 INFO nova.compute.manager [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Terminating instance
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.715 187212 DEBUG nova.compute.manager [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:07:56 compute-0 kernel: tapac02dd63-5a (unregistering): left promiscuous mode
Dec 05 12:07:56 compute-0 NetworkManager[55691]: <info>  [1764936476.7444] device (tapac02dd63-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:56 compute-0 ovn_controller[95610]: 2025-12-05T12:07:56Z|00588|binding|INFO|Releasing lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b from this chassis (sb_readonly=0)
Dec 05 12:07:56 compute-0 ovn_controller[95610]: 2025-12-05T12:07:56Z|00589|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b down in Southbound
Dec 05 12:07:56 compute-0 ovn_controller[95610]: 2025-12-05T12:07:56Z|00590|binding|INFO|Removing iface tapac02dd63-5a ovn-installed in OVS
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.760 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c5:99 10.100.0.8'], port_security=['fa:16:3e:6a:c5:99 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d62df5807554f499d26b5fc77ec8603', 'neutron:revision_number': '9', 'neutron:security_group_ids': '5a04f4af-e81b-4661-95ed-5737ffc98cae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a7d298f-265e-44c5-a73a-18dd9ed0b171, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:07:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.762 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b in datapath fc6ce614-d0f7-413f-bc3e-26f7271993d9 unbound from our chassis
Dec 05 12:07:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.764 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc6ce614-d0f7-413f-bc3e-26f7271993d9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:07:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.765 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d6427e64-d7a6-4166-b715-21e4c5283020]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.765 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 namespace which is not needed anymore
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.782 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:56 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Dec 05 12:07:56 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000003e.scope: Consumed 13.871s CPU time.
Dec 05 12:07:56 compute-0 systemd-machined[153543]: Machine qemu-72-instance-0000003e terminated.
Dec 05 12:07:56 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [NOTICE]   (228491) : haproxy version is 2.8.14-c23fe91
Dec 05 12:07:56 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [NOTICE]   (228491) : path to executable is /usr/sbin/haproxy
Dec 05 12:07:56 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [WARNING]  (228491) : Exiting Master process...
Dec 05 12:07:56 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [ALERT]    (228491) : Current worker (228493) exited with code 143 (Terminated)
Dec 05 12:07:56 compute-0 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [WARNING]  (228491) : All workers exited. Exiting... (0)
Dec 05 12:07:56 compute-0 systemd[1]: libpod-770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f.scope: Deactivated successfully.
Dec 05 12:07:56 compute-0 podman[229063]: 2025-12-05 12:07:56.903234872 +0000 UTC m=+0.045119500 container died 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 12:07:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-609c1c748c26ab3742ccbcbaed3a0fb9e3b7ac74e56bc02b438dfce85dc57371-merged.mount: Deactivated successfully.
Dec 05 12:07:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f-userdata-shm.mount: Deactivated successfully.
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.940 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:56 compute-0 podman[229063]: 2025-12-05 12:07:56.942285045 +0000 UTC m=+0.084169673 container cleanup 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:07:56 compute-0 systemd[1]: libpod-conmon-770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f.scope: Deactivated successfully.
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.982 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance destroyed successfully.
Dec 05 12:07:56 compute-0 nova_compute[187208]: 2025-12-05 12:07:56.982 187212 DEBUG nova.objects.instance [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'resources' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:07:57 compute-0 podman[229104]: 2025-12-05 12:07:57.008701333 +0000 UTC m=+0.044357308 container remove 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:07:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.013 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[787d7c9a-baf1-4ea8-bb7c-466918771aa3]: (4, ('Fri Dec  5 12:07:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 (770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f)\n770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f\nFri Dec  5 12:07:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 (770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f)\n770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.015 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[750700a1-6de8-4439-b210-a09396e1c933]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.016 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6ce614-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.018 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:57 compute-0 kernel: tapfc6ce614-d0: left promiscuous mode
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.033 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.038 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f4495e-d718-4412-b0b3-52d64bbbaaa2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.054 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5c465009-88ac-4c68-900d-bff315dc0917]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.055 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[230b68ef-1662-4dd2-9599-25c80f50a827]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.071 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[269156e2-a2eb-4bc4-b8f8-aaf8b6f31eab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382975, 'reachable_time': 40482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229128, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.073 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:07:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.074 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce07fff-8778-4b2f-9519-6c449419f01c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:07:57 compute-0 systemd[1]: run-netns-ovnmeta\x2dfc6ce614\x2dd0f7\x2d413f\x2dbc3e\x2d26f7271993d9.mount: Deactivated successfully.
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.447 187212 DEBUG nova.virt.libvirt.vif [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHKhL003clvQeWhyQnRnlaccZLUvEBLEhvImBOCB5geqDizgWJsGjayma/8q9qGL/NiGPTPxEZoxanWZnFRBuZklxJy5hDaSwVjbF4FtdnX9ysLeFgNsQAX0H4LK24ei2Q==',key_name='tempest-keypair-105541899',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.448 187212 DEBUG nova.network.os_vif_util [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.448 187212 DEBUG nova.network.os_vif_util [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.449 187212 DEBUG os_vif [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.450 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.451 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac02dd63-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.452 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.455 187212 INFO os_vif [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a')
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.456 187212 INFO nova.virt.libvirt.driver [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deleting instance files /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458_del
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.462 187212 INFO nova.virt.libvirt.driver [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deletion of /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458_del complete
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.608 187212 INFO nova.compute.manager [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Took 0.89 seconds to destroy the instance on the hypervisor.
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.609 187212 DEBUG oslo.service.loopingcall [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.609 187212 DEBUG nova.compute.manager [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.610 187212 DEBUG nova.network.neutron [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.828 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Successfully created port: b66066cc-97eb-4896-a98d-267498dedf74 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:07:57 compute-0 nova_compute[187208]: 2025-12-05 12:07:57.908 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:07:58 compute-0 nova_compute[187208]: 2025-12-05 12:07:58.896 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:58 compute-0 nova_compute[187208]: 2025-12-05 12:07:58.897 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.019 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.136 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.137 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.147 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.147 187212 INFO nova.compute.claims [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:07:59 compute-0 podman[229130]: 2025-12-05 12:07:59.221634499 +0000 UTC m=+0.067818709 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 12:07:59 compute-0 podman[229129]: 2025-12-05 12:07:59.239873388 +0000 UTC m=+0.080534568 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9)
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.465 187212 DEBUG nova.compute.provider_tree [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.566 187212 DEBUG nova.scheduler.client.report [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.593 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.456s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.594 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.705 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.706 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.752 187212 INFO nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.773 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.885 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.886 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.887 187212 INFO nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating image(s)
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.888 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.888 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.889 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.904 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.972 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.973 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.974 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:07:59 compute-0 nova_compute[187208]: 2025-12-05 12:07:59.987 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.079 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.081 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.120 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.121 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.122 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.190 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.191 187212 DEBUG nova.virt.disk.api [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Checking if we can resize image /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.191 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.253 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.254 187212 DEBUG nova.virt.disk.api [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Cannot resize image /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.255 187212 DEBUG nova.objects.instance [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'migration_context' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.340 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.341 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Ensure instance console log exists: /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.342 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.342 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.343 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:00 compute-0 nova_compute[187208]: 2025-12-05 12:08:00.584 187212 DEBUG nova.policy [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:08:01 compute-0 nova_compute[187208]: 2025-12-05 12:08:01.589 187212 DEBUG nova.network.neutron [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:01 compute-0 nova_compute[187208]: 2025-12-05 12:08:01.610 187212 INFO nova.compute.manager [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Took 4.00 seconds to deallocate network for instance.
Dec 05 12:08:01 compute-0 nova_compute[187208]: 2025-12-05 12:08:01.666 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:01 compute-0 nova_compute[187208]: 2025-12-05 12:08:01.666 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:01 compute-0 nova_compute[187208]: 2025-12-05 12:08:01.880 187212 DEBUG nova.compute.provider_tree [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:08:01 compute-0 nova_compute[187208]: 2025-12-05 12:08:01.903 187212 DEBUG nova.scheduler.client.report [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:08:01 compute-0 nova_compute[187208]: 2025-12-05 12:08:01.954 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:01 compute-0 nova_compute[187208]: 2025-12-05 12:08:01.982 187212 INFO nova.scheduler.client.report [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Deleted allocations for instance 5d70ac2d-111f-4e1b-ac26-3e02849b0458
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.044 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.725 187212 DEBUG nova.compute.manager [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.725 187212 DEBUG oslo_concurrency.lockutils [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.726 187212 DEBUG oslo_concurrency.lockutils [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.726 187212 DEBUG oslo_concurrency.lockutils [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.726 187212 DEBUG nova.compute.manager [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Processing event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.727 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance event wait completed in 16 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.731 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936482.7313843, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.732 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Resumed (Lifecycle Event)
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.736 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.741 187212 INFO nova.virt.libvirt.driver [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance spawned successfully.
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.742 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.758 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.766 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.771 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.772 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.772 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.773 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.773 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.773 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.797 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.812 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Successfully updated port: b66066cc-97eb-4896-a98d-267498dedf74 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.839 187212 INFO nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Took 25.94 seconds to spawn the instance on the hypervisor.
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.839 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.840 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.840 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquired lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.840 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.891 187212 DEBUG nova.compute.manager [req-0d9dddb7-eb24-43b4-93c9-b78e96c8b927 req-86e7b140-5813-423f-a933-72747bc196fa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-deleted-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.905 187212 INFO nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Took 26.73 seconds to build instance.
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.911 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:02 compute-0 nova_compute[187208]: 2025-12-05 12:08:02.923 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:03.014 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:03.016 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:04 compute-0 nova_compute[187208]: 2025-12-05 12:08:04.116 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:08:05 compute-0 podman[229187]: 2025-12-05 12:08:05.223548315 +0000 UTC m=+0.076440289 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:08:05 compute-0 podman[229188]: 2025-12-05 12:08:05.232874276 +0000 UTC m=+0.082662110 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:08:05 compute-0 nova_compute[187208]: 2025-12-05 12:08:05.304 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936470.3032813, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:05 compute-0 nova_compute[187208]: 2025-12-05 12:08:05.305 187212 INFO nova.compute.manager [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Stopped (Lifecycle Event)
Dec 05 12:08:05 compute-0 nova_compute[187208]: 2025-12-05 12:08:05.627 187212 DEBUG nova.compute.manager [None req-30ab2256-17eb-492e-958c-9ff045abc5f6 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:06 compute-0 nova_compute[187208]: 2025-12-05 12:08:06.057 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Successfully created port: 11c7fa90-6a48-487a-a375-5adf7f41cb90 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.197 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Updating instance_info_cache with network_info: [{"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.218 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Releasing lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.219 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance network_info: |[{"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.222 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start _get_guest_xml network_info=[{"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.226 187212 WARNING nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.235 187212 DEBUG nova.virt.libvirt.host [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.236 187212 DEBUG nova.virt.libvirt.host [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.241 187212 DEBUG nova.virt.libvirt.host [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.242 187212 DEBUG nova.virt.libvirt.host [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.242 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.242 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.266 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.267 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.268 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.268 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.269 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.269 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.270 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.270 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.271 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.271 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.284 187212 DEBUG nova.virt.libvirt.vif [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1497746963',display_name='tempest-ServerAddressesNegativeTestJSON-server-1497746963',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1497746963',id=69,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='feb2d7c8b49945a08355fc4f902f2786',ramdisk_id='',reservation_id='r-wvd553zf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-717599576',owner_us
er_name='tempest-ServerAddressesNegativeTestJSON-717599576-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:54Z,user_data=None,user_id='21ddc7a76417447daa2a5a26cdf17d53',uuid=3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.285 187212 DEBUG nova.network.os_vif_util [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converting VIF {"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.286 187212 DEBUG nova.network.os_vif_util [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.287 187212 DEBUG nova.objects.instance [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.303 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <uuid>3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf</uuid>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <name>instance-00000045</name>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerAddressesNegativeTestJSON-server-1497746963</nova:name>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:08:07</nova:creationTime>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:08:07 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:08:07 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:08:07 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:08:07 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:08:07 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:08:07 compute-0 nova_compute[187208]:         <nova:user uuid="21ddc7a76417447daa2a5a26cdf17d53">tempest-ServerAddressesNegativeTestJSON-717599576-project-member</nova:user>
Dec 05 12:08:07 compute-0 nova_compute[187208]:         <nova:project uuid="feb2d7c8b49945a08355fc4f902f2786">tempest-ServerAddressesNegativeTestJSON-717599576</nova:project>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:08:07 compute-0 nova_compute[187208]:         <nova:port uuid="b66066cc-97eb-4896-a98d-267498dedf74">
Dec 05 12:08:07 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <system>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <entry name="serial">3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf</entry>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <entry name="uuid">3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf</entry>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     </system>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <os>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   </os>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <features>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   </features>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.config"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:b2:b8:fe"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <target dev="tapb66066cc-97"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/console.log" append="off"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <video>
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     </video>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:08:07 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:08:07 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:08:07 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:08:07 compute-0 nova_compute[187208]: </domain>
Dec 05 12:08:07 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.303 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Preparing to wait for external event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.304 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.304 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.304 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.305 187212 DEBUG nova.virt.libvirt.vif [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1497746963',display_name='tempest-ServerAddressesNegativeTestJSON-server-1497746963',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1497746963',id=69,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='feb2d7c8b49945a08355fc4f902f2786',ramdisk_id='',reservation_id='r-wvd553zf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-717599576
',owner_user_name='tempest-ServerAddressesNegativeTestJSON-717599576-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:54Z,user_data=None,user_id='21ddc7a76417447daa2a5a26cdf17d53',uuid=3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.305 187212 DEBUG nova.network.os_vif_util [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converting VIF {"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.306 187212 DEBUG nova.network.os_vif_util [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.306 187212 DEBUG os_vif [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.307 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.307 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.308 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.311 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.312 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb66066cc-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.312 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb66066cc-97, col_values=(('external_ids', {'iface-id': 'b66066cc-97eb-4896-a98d-267498dedf74', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:b8:fe', 'vm-uuid': '3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.314 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:07 compute-0 NetworkManager[55691]: <info>  [1764936487.3150] manager: (tapb66066cc-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.319 187212 INFO os_vif [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97')
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.384 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.385 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.385 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] No VIF found with MAC fa:16:3e:b2:b8:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.386 187212 INFO nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Using config drive
Dec 05 12:08:07 compute-0 nova_compute[187208]: 2025-12-05 12:08:07.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.110 187212 INFO nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Creating config drive at /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.config
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.114 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxmh473oo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.191 187212 DEBUG nova.compute.manager [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-changed-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.192 187212 DEBUG nova.compute.manager [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Refreshing instance network info cache due to event network-changed-b66066cc-97eb-4896-a98d-267498dedf74. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.192 187212 DEBUG oslo_concurrency.lockutils [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.192 187212 DEBUG oslo_concurrency.lockutils [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.192 187212 DEBUG nova.network.neutron [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Refreshing network info cache for port b66066cc-97eb-4896-a98d-267498dedf74 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.197 187212 DEBUG nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.198 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.198 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.198 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.198 187212 DEBUG nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] No waiting events found dispatching network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.199 187212 WARNING nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received unexpected event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 for instance with vm_state active and task_state None.
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.199 187212 DEBUG nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.199 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.199 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.200 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.200 187212 DEBUG nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.200 187212 WARNING nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state deleted and task_state None.
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.242 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxmh473oo" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.292 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.293 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.293 187212 DEBUG nova.objects.instance [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:08 compute-0 kernel: tapb66066cc-97: entered promiscuous mode
Dec 05 12:08:08 compute-0 ovn_controller[95610]: 2025-12-05T12:08:08Z|00591|binding|INFO|Claiming lport b66066cc-97eb-4896-a98d-267498dedf74 for this chassis.
Dec 05 12:08:08 compute-0 NetworkManager[55691]: <info>  [1764936488.3295] manager: (tapb66066cc-97): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Dec 05 12:08:08 compute-0 ovn_controller[95610]: 2025-12-05T12:08:08Z|00592|binding|INFO|b66066cc-97eb-4896-a98d-267498dedf74: Claiming fa:16:3e:b2:b8:fe 10.100.0.8
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.328 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.335 187212 DEBUG nova.objects.instance [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.337 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:b8:fe 10.100.0.8'], port_security=['fa:16:3e:b2:b8:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'feb2d7c8b49945a08355fc4f902f2786', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fdf3db2a-0067-4a50-8487-b97fc3fdd122', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8cd27338-7640-4d03-958e-44ccc0e8c5fb, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b66066cc-97eb-4896-a98d-267498dedf74) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.339 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b66066cc-97eb-4896-a98d-267498dedf74 in datapath ba5c1b46-c606-429f-b268-8a88a7b3641a bound to our chassis
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.341 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ba5c1b46-c606-429f-b268-8a88a7b3641a
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.336 187212 INFO nova.compute.manager [None req-adb88aac-3418-476b-a369-a74b1f64653d 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Pausing
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.337 187212 DEBUG nova.objects.instance [None req-adb88aac-3418-476b-a369-a74b1f64653d 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'flavor' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.348 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:08 compute-0 ovn_controller[95610]: 2025-12-05T12:08:08Z|00593|binding|INFO|Setting lport b66066cc-97eb-4896-a98d-267498dedf74 ovn-installed in OVS
Dec 05 12:08:08 compute-0 ovn_controller[95610]: 2025-12-05T12:08:08Z|00594|binding|INFO|Setting lport b66066cc-97eb-4896-a98d-267498dedf74 up in Southbound
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.351 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.355 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c7810d5c-3258-4ebe-bf3b-2a2c3958814c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.356 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapba5c1b46-c1 in ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.358 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapba5c1b46-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.358 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[86f03fa9-97cd-42b1-b3c3-64de9a93bdc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.361 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[970eeed6-87ac-4a6c-b184-93d4d687f13e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.371 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:08:08 compute-0 systemd-udevd[229265]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.378 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[1007c90a-5216-4ff0-a488-31fe20b42d15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 NetworkManager[55691]: <info>  [1764936488.3883] device (tapb66066cc-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:08:08 compute-0 NetworkManager[55691]: <info>  [1764936488.3891] device (tapb66066cc-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.391 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936488.3902311, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.391 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Paused (Lifecycle Event)
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.393 187212 DEBUG nova.compute.manager [None req-adb88aac-3418-476b-a369-a74b1f64653d 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.393 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a66f6823-90dc-487d-8b2f-129ffaeb4844]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 systemd-machined[153543]: New machine qemu-76-instance-00000045.
Dec 05 12:08:08 compute-0 systemd[1]: Started Virtual Machine qemu-76-instance-00000045.
Dec 05 12:08:08 compute-0 podman[229246]: 2025-12-05 12:08:08.40966295 +0000 UTC m=+0.091447544 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.423 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[149a93e7-0508-46d8-ba0d-c6eee761a83d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.427 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.428 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9a60589f-64b1-4415-a01e-cbdb35d7299f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 NetworkManager[55691]: <info>  [1764936488.4297] manager: (tapba5c1b46-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Dec 05 12:08:08 compute-0 systemd-udevd[229274]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.434 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.463 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8d1974fd-1403-46fd-b19f-7afcccf0551b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.469 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cf506f8a-b53e-437b-b0cd-947cac4421af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.485 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] During sync_power_state the instance has a pending task (pausing). Skip.
Dec 05 12:08:08 compute-0 NetworkManager[55691]: <info>  [1764936488.4926] device (tapba5c1b46-c0): carrier: link connected
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.497 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0f65f2c3-4fb6-46e5-8a31-627bcd39ab54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.513 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[911e12a8-c5c6-4f8e-b04b-ee94bb86df4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba5c1b46-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:7b:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386905, 'reachable_time': 30980, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229305, 'error': None, 'target': 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.533 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7c01e9-32f2-4516-97b0-4dbe360a4e4d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:7ba4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386905, 'tstamp': 386905}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229306, 'error': None, 'target': 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.551 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f02c5b-3c79-46db-aa15-331774b9e78f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba5c1b46-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:7b:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386905, 'reachable_time': 30980, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229307, 'error': None, 'target': 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.582 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ff62f8e8-6314-4909-b1f9-e00c0a1c67a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.638 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6e50d870-4c75-458b-9ad3-d74578bd6a98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.640 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba5c1b46-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.641 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.642 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapba5c1b46-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:08 compute-0 kernel: tapba5c1b46-c0: entered promiscuous mode
Dec 05 12:08:08 compute-0 NetworkManager[55691]: <info>  [1764936488.6447] manager: (tapba5c1b46-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.644 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.650 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapba5c1b46-c0, col_values=(('external_ids', {'iface-id': 'cf881e66-1434-41ee-aff2-459b4b74bf50'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.652 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:08 compute-0 ovn_controller[95610]: 2025-12-05T12:08:08Z|00595|binding|INFO|Releasing lport cf881e66-1434-41ee-aff2-459b4b74bf50 from this chassis (sb_readonly=0)
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.657 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ba5c1b46-c606-429f-b268-8a88a7b3641a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ba5c1b46-c606-429f-b268-8a88a7b3641a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.658 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[97efd1a4-9fbc-49e2-a9e5-7e5204cd9803]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.659 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-ba5c1b46-c606-429f-b268-8a88a7b3641a
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/ba5c1b46-c606-429f-b268-8a88a7b3641a.pid.haproxy
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID ba5c1b46-c606-429f-b268-8a88a7b3641a
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:08:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.662 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'env', 'PROCESS_TAG=haproxy-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ba5c1b46-c606-429f-b268-8a88a7b3641a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.667 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.726 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936488.7263088, 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.727 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] VM Started (Lifecycle Event)
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.747 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.751 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936488.7264965, 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.751 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] VM Paused (Lifecycle Event)
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.770 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.773 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:08 compute-0 nova_compute[187208]: 2025-12-05 12:08:08.794 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:08:09 compute-0 podman[229346]: 2025-12-05 12:08:09.052256248 +0000 UTC m=+0.027606983 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:08:09 compute-0 podman[229346]: 2025-12-05 12:08:09.457454736 +0000 UTC m=+0.432805461 container create a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:08:09 compute-0 systemd[1]: Started libpod-conmon-a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5.scope.
Dec 05 12:08:09 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:08:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4532b7759c030458be51c068284cef48e94b6377a17ce91486e109cbe6a7f64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:08:09 compute-0 podman[229346]: 2025-12-05 12:08:09.567670694 +0000 UTC m=+0.543021439 container init a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 05 12:08:09 compute-0 podman[229346]: 2025-12-05 12:08:09.573210445 +0000 UTC m=+0.548561170 container start a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:08:09 compute-0 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [NOTICE]   (229365) : New worker (229367) forked
Dec 05 12:08:09 compute-0 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [NOTICE]   (229365) : Loading success.
Dec 05 12:08:09 compute-0 nova_compute[187208]: 2025-12-05 12:08:09.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:09 compute-0 nova_compute[187208]: 2025-12-05 12:08:09.939 187212 DEBUG nova.policy [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:08:10 compute-0 nova_compute[187208]: 2025-12-05 12:08:10.511 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Successfully updated port: 11c7fa90-6a48-487a-a375-5adf7f41cb90 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:08:10 compute-0 nova_compute[187208]: 2025-12-05 12:08:10.527 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:10 compute-0 nova_compute[187208]: 2025-12-05 12:08:10.527 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:10 compute-0 nova_compute[187208]: 2025-12-05 12:08:10.528 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:08:10 compute-0 nova_compute[187208]: 2025-12-05 12:08:10.826 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:08:11 compute-0 nova_compute[187208]: 2025-12-05 12:08:11.975 187212 DEBUG nova.network.neutron [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Updated VIF entry in instance network info cache for port b66066cc-97eb-4896-a98d-267498dedf74. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:08:11 compute-0 nova_compute[187208]: 2025-12-05 12:08:11.975 187212 DEBUG nova.network.neutron [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Updating instance_info_cache with network_info: [{"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:11 compute-0 nova_compute[187208]: 2025-12-05 12:08:11.980 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936476.9795291, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:11 compute-0 nova_compute[187208]: 2025-12-05 12:08:11.981 187212 INFO nova.compute.manager [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Stopped (Lifecycle Event)
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.003 187212 DEBUG nova.compute.manager [None req-4254625d-f9a2-4c87-898a-10f0443cfca9 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.004 187212 DEBUG oslo_concurrency.lockutils [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.211 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully created port: d35fce09-856e-4ebf-b944-0c0953a9492b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.395 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.418 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.419 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance network_info: |[{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.421 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start _get_guest_xml network_info=[{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.426 187212 WARNING nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.433 187212 DEBUG nova.virt.libvirt.host [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.477 187212 DEBUG nova.virt.libvirt.host [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.482 187212 DEBUG nova.virt.libvirt.host [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.483 187212 DEBUG nova.virt.libvirt.host [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.483 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.484 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.484 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.485 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.485 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.485 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.486 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.486 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.486 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.487 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.487 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.487 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.492 187212 DEBUG nova.virt.libvirt.vif [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1436335913',display_name='tempest-ServerRescueTestJSONUnderV235-server-1436335913',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1436335913',id=70,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e846fccb774e44f585d8847897bc4229',ramdisk_id='',reservation_id='r-230fx5t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1035500959',owner_user_name
='tempest-ServerRescueTestJSONUnderV235-1035500959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:59Z,user_data=None,user_id='6a2cefdbcaae4db3b3ece95c8227d77e',uuid=2e537618-f998-4c4d-8e1e-e9cc79219330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.492 187212 DEBUG nova.network.os_vif_util [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converting VIF {"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.493 187212 DEBUG nova.network.os_vif_util [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.494 187212 DEBUG nova.objects.instance [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.506 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <uuid>2e537618-f998-4c4d-8e1e-e9cc79219330</uuid>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <name>instance-00000046</name>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1436335913</nova:name>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:08:12</nova:creationTime>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:08:12 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:08:12 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:08:12 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:08:12 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:08:12 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:08:12 compute-0 nova_compute[187208]:         <nova:user uuid="6a2cefdbcaae4db3b3ece95c8227d77e">tempest-ServerRescueTestJSONUnderV235-1035500959-project-member</nova:user>
Dec 05 12:08:12 compute-0 nova_compute[187208]:         <nova:project uuid="e846fccb774e44f585d8847897bc4229">tempest-ServerRescueTestJSONUnderV235-1035500959</nova:project>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:08:12 compute-0 nova_compute[187208]:         <nova:port uuid="11c7fa90-6a48-487a-a375-5adf7f41cb90">
Dec 05 12:08:12 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <system>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <entry name="serial">2e537618-f998-4c4d-8e1e-e9cc79219330</entry>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <entry name="uuid">2e537618-f998-4c4d-8e1e-e9cc79219330</entry>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     </system>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <os>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   </os>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <features>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   </features>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:e4:ee:e4"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <target dev="tap11c7fa90-6a"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/console.log" append="off"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <video>
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     </video>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:08:12 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:08:12 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:08:12 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:08:12 compute-0 nova_compute[187208]: </domain>
Dec 05 12:08:12 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.508 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Preparing to wait for external event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.508 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.508 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.509 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.509 187212 DEBUG nova.virt.libvirt.vif [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1436335913',display_name='tempest-ServerRescueTestJSONUnderV235-server-1436335913',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1436335913',id=70,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e846fccb774e44f585d8847897bc4229',ramdisk_id='',reservation_id='r-230fx5t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1035500959',owner
_user_name='tempest-ServerRescueTestJSONUnderV235-1035500959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:59Z,user_data=None,user_id='6a2cefdbcaae4db3b3ece95c8227d77e',uuid=2e537618-f998-4c4d-8e1e-e9cc79219330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.510 187212 DEBUG nova.network.os_vif_util [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converting VIF {"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.510 187212 DEBUG nova.network.os_vif_util [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.511 187212 DEBUG os_vif [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.511 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.512 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.512 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.515 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11c7fa90-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.515 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap11c7fa90-6a, col_values=(('external_ids', {'iface-id': '11c7fa90-6a48-487a-a375-5adf7f41cb90', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:ee:e4', 'vm-uuid': '2e537618-f998-4c4d-8e1e-e9cc79219330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.516 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:12 compute-0 NetworkManager[55691]: <info>  [1764936492.5175] manager: (tap11c7fa90-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.523 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.524 187212 INFO os_vif [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a')
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.572 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.573 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.573 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No VIF found with MAC fa:16:3e:e4:ee:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.574 187212 INFO nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Using config drive
Dec 05 12:08:12 compute-0 nova_compute[187208]: 2025-12-05 12:08:12.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.295 187212 INFO nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating config drive at /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.301 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprmqzuqi_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:13 compute-0 ovn_controller[95610]: 2025-12-05T12:08:13Z|00596|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec 05 12:08:13 compute-0 ovn_controller[95610]: 2025-12-05T12:08:13Z|00597|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:08:13 compute-0 ovn_controller[95610]: 2025-12-05T12:08:13Z|00598|binding|INFO|Releasing lport cf881e66-1434-41ee-aff2-459b4b74bf50 from this chassis (sb_readonly=0)
Dec 05 12:08:13 compute-0 ovn_controller[95610]: 2025-12-05T12:08:13Z|00599|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.433 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprmqzuqi_" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:13 compute-0 kernel: tap11c7fa90-6a: entered promiscuous mode
Dec 05 12:08:13 compute-0 NetworkManager[55691]: <info>  [1764936493.4975] manager: (tap11c7fa90-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/246)
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.498 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:13 compute-0 ovn_controller[95610]: 2025-12-05T12:08:13Z|00600|binding|INFO|Claiming lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 for this chassis.
Dec 05 12:08:13 compute-0 ovn_controller[95610]: 2025-12-05T12:08:13Z|00601|binding|INFO|11c7fa90-6a48-487a-a375-5adf7f41cb90: Claiming fa:16:3e:e4:ee:e4 10.100.0.2
Dec 05 12:08:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:13.505 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:ee:e4 10.100.0.2'], port_security=['fa:16:3e:e4:ee:e4 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-034629ef-6cd1-463c-b963-3d0d9c530038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e846fccb774e44f585d8847897bc4229', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a77f7593-d6d1-44fb-8125-66cdfc38709c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4c9d894-0fc3-4aad-a4d5-6bee101a530c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=11c7fa90-6a48-487a-a375-5adf7f41cb90) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:13.507 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 11c7fa90-6a48-487a-a375-5adf7f41cb90 in datapath 034629ef-6cd1-463c-b963-3d0d9c530038 bound to our chassis
Dec 05 12:08:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:13.509 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 034629ef-6cd1-463c-b963-3d0d9c530038 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:08:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:13.510 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[34af6233-215e-4e65-b3f5-ea644449d93e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:13 compute-0 ovn_controller[95610]: 2025-12-05T12:08:13Z|00602|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 up in Southbound
Dec 05 12:08:13 compute-0 ovn_controller[95610]: 2025-12-05T12:08:13Z|00603|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 ovn-installed in OVS
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.513 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.515 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:13 compute-0 systemd-udevd[229395]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:08:13 compute-0 systemd-machined[153543]: New machine qemu-77-instance-00000046.
Dec 05 12:08:13 compute-0 NetworkManager[55691]: <info>  [1764936493.5438] device (tap11c7fa90-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:08:13 compute-0 NetworkManager[55691]: <info>  [1764936493.5448] device (tap11c7fa90-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:08:13 compute-0 systemd[1]: Started Virtual Machine qemu-77-instance-00000046.
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.781 187212 DEBUG nova.compute.manager [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.781 187212 DEBUG oslo_concurrency.lockutils [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.782 187212 DEBUG oslo_concurrency.lockutils [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.782 187212 DEBUG oslo_concurrency.lockutils [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.782 187212 DEBUG nova.compute.manager [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Processing event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.783 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.789 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936493.788678, 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.789 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] VM Resumed (Lifecycle Event)
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.792 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.796 187212 INFO nova.virt.libvirt.driver [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance spawned successfully.
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.797 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.808 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.821 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.825 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.826 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.826 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.827 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.827 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.828 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.849 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.885 187212 INFO nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Took 19.58 seconds to spawn the instance on the hypervisor.
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.886 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:13 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.978 187212 INFO nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Took 20.29 seconds to build instance.
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:13.999 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.289 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully updated port: d35fce09-856e-4ebf-b944-0c0953a9492b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.303 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.304 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.304 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.375 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936494.3747003, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.376 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Started (Lifecycle Event)
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.393 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.396 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936494.3748746, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.397 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Paused (Lifecycle Event)
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.411 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.414 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.431 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:08:14 compute-0 nova_compute[187208]: 2025-12-05 12:08:14.604 187212 WARNING nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it
Dec 05 12:08:15 compute-0 podman[229412]: 2025-12-05 12:08:15.208977336 +0000 UTC m=+0.058653414 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.698 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.699 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.699 187212 INFO nova.compute.manager [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Shelving
Dec 05 12:08:16 compute-0 kernel: tap29e412e9-d3 (unregistering): left promiscuous mode
Dec 05 12:08:16 compute-0 NetworkManager[55691]: <info>  [1764936496.7452] device (tap29e412e9-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 ovn_controller[95610]: 2025-12-05T12:08:16Z|00604|binding|INFO|Releasing lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 from this chassis (sb_readonly=0)
Dec 05 12:08:16 compute-0 ovn_controller[95610]: 2025-12-05T12:08:16Z|00605|binding|INFO|Setting lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 down in Southbound
Dec 05 12:08:16 compute-0 ovn_controller[95610]: 2025-12-05T12:08:16Z|00606|binding|INFO|Removing iface tap29e412e9-d3 ovn-installed in OVS
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.758 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.770 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:32:38 10.100.0.6'], port_security=['fa:16:3e:68:32:38 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5659bd52-8c24-483d-80a4-8eb6b28e1349', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'df1c03c3-b3c9-47b6-a712-a13948dd510e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=29e412e9-d3cc-4af2-b85a-ab48fcad0372) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.771 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 29e412e9-d3cc-4af2-b85a-ab48fcad0372 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.774 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.774 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.790 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f7d964fd-084c-4616-8cb6-e3be4ea26f7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:16 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000044.scope: Deactivated successfully.
Dec 05 12:08:16 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000044.scope: Consumed 6.328s CPU time.
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.821 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.821 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:16 compute-0 systemd-machined[153543]: Machine qemu-75-instance-00000044 terminated.
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.825 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b803acea-5d7d-4afd-ab13-5561f7053a9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.828 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5428dbc5-556e-47d5-b622-df5a7365febe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.838 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.849 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.859 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[12cc9e66-1e7c-4f8d-855b-dcf9d2a6f9ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.879 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa992cce-ae68-43c4-8f77-06d49adaa9e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229449, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.884 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.889 187212 DEBUG nova.virt.libvirt.vif [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.889 187212 DEBUG nova.network.os_vif_util [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.890 187212 DEBUG nova.network.os_vif_util [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.891 187212 DEBUG os_vif [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.892 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.893 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.893 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.896 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9cab09-8277-4c58-8b38-6e1571a20e1f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229450, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229450, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.898 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.902 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.904 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd35fce09-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.904 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd35fce09-85, col_values=(('external_ids', {'iface-id': 'd35fce09-856e-4ebf-b944-0c0953a9492b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:01:47', 'vm-uuid': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:16 compute-0 NetworkManager[55691]: <info>  [1764936496.9071] manager: (tapd35fce09-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.909 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.909 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.910 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.910 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.910 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.915 187212 INFO os_vif [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85')
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.916 187212 DEBUG nova.virt.libvirt.vif [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.917 187212 DEBUG nova.network.os_vif_util [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.918 187212 DEBUG nova.network.os_vif_util [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.921 187212 DEBUG nova.virt.libvirt.guest [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec 05 12:08:16 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:b8:01:47"/>
Dec 05 12:08:16 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:08:16 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:08:16 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:08:16 compute-0 nova_compute[187208]:   <target dev="tapd35fce09-85"/>
Dec 05 12:08:16 compute-0 nova_compute[187208]: </interface>
Dec 05 12:08:16 compute-0 nova_compute[187208]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.934 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.935 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:16 compute-0 systemd-udevd[229441]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:08:16 compute-0 NetworkManager[55691]: <info>  [1764936496.9408] manager: (tapd35fce09-85): new Tun device (/org/freedesktop/NetworkManager/Devices/248)
Dec 05 12:08:16 compute-0 kernel: tapd35fce09-85: entered promiscuous mode
Dec 05 12:08:16 compute-0 ovn_controller[95610]: 2025-12-05T12:08:16Z|00607|binding|INFO|Claiming lport d35fce09-856e-4ebf-b944-0c0953a9492b for this chassis.
Dec 05 12:08:16 compute-0 ovn_controller[95610]: 2025-12-05T12:08:16Z|00608|binding|INFO|d35fce09-856e-4ebf-b944-0c0953a9492b: Claiming fa:16:3e:b8:01:47 10.100.0.3
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.954 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.955 187212 INFO nova.compute.claims [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:08:16 compute-0 NetworkManager[55691]: <info>  [1764936496.9588] device (tapd35fce09-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:08:16 compute-0 NetworkManager[55691]: <info>  [1764936496.9601] device (tapd35fce09-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.954 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:01:47 10.100.0.3'], port_security=['fa:16:3e:b8:01:47 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d35fce09-856e-4ebf-b944-0c0953a9492b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.955 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d35fce09-856e-4ebf-b944-0c0953a9492b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.958 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:08:16 compute-0 ovn_controller[95610]: 2025-12-05T12:08:16Z|00609|binding|INFO|Setting lport d35fce09-856e-4ebf-b944-0c0953a9492b ovn-installed in OVS
Dec 05 12:08:16 compute-0 ovn_controller[95610]: 2025-12-05T12:08:16Z|00610|binding|INFO|Setting lport d35fce09-856e-4ebf-b944-0c0953a9492b up in Southbound
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.972 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.975 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c63a37e-de31-4db2-b877-c3dd918b8440]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:16 compute-0 nova_compute[187208]: 2025-12-05 12:08:16.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.005 187212 INFO nova.virt.libvirt.driver [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance destroyed successfully.
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.006 187212 DEBUG nova.objects.instance [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.008 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[07046d03-7d85-4006-8cc5-5917f0ec47fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.011 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a4f46fef-fed7-45ef-8aed-c0020aec7a7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.038 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c8ee53-da90-4abf-8c23-86149699d4f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.042 187212 DEBUG nova.virt.libvirt.driver [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.042 187212 DEBUG nova.virt.libvirt.driver [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.042 187212 DEBUG nova.virt.libvirt.driver [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:01:99:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.042 187212 DEBUG nova.virt.libvirt.driver [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:b8:01:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.056 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d20c361-8c6c-4a87-a7a3-48f0df83e14f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 35716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229484, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.066 187212 DEBUG nova.virt.libvirt.guest [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:08:17 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:08:17 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec 05 12:08:17 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:08:17</nova:creationTime>
Dec 05 12:08:17 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:08:17 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:08:17 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:08:17 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:08:17 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:08:17 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:08:17 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec 05 12:08:17 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec 05 12:08:17 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 12:08:17 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:08:17 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:08:17 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:08:17 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.072 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[706e7789-f569-464e-b29e-e7bfbfe80b88]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229485, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229485, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.073 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.075 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.079 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.079 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.080 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.080 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.095 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.188 187212 DEBUG nova.compute.provider_tree [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.204 187212 DEBUG nova.scheduler.client.report [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.232 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.232 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.289 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.290 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.316 187212 INFO nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.336 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.353 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Beginning cold snapshot process
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.446 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.447 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.447 187212 INFO nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Creating image(s)
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.448 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.448 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.448 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.463 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.507 187212 DEBUG nova.privsep.utils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.507 187212 DEBUG oslo_concurrency.processutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk /var/lib/nova/instances/snapshots/tmpwfyho5o3/f6ab3485ca4a4553bc3b5ba4601e8af8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.526 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.527 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.528 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.543 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.620 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.621 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.690 187212 DEBUG oslo_concurrency.processutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk /var/lib/nova/instances/snapshots/tmpwfyho5o3/f6ab3485ca4a4553bc3b5ba4601e8af8" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.691 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Snapshot extracted, beginning image upload
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.695 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk 1073741824" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.697 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.697 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.770 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.771 187212 DEBUG nova.virt.disk.api [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Checking if we can resize image /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.772 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.795 187212 DEBUG nova.policy [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.829 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.830 187212 DEBUG nova.virt.disk.api [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Cannot resize image /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.830 187212 DEBUG nova.objects.instance [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lazy-loading 'migration_context' on Instance uuid b235a96f-7a12-4bd2-8627-33b128346aa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.843 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.844 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Ensure instance console log exists: /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.844 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.845 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.845 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:17 compute-0 nova_compute[187208]: 2025-12-05 12:08:17.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:18 compute-0 nova_compute[187208]: 2025-12-05 12:08:18.616 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:18 compute-0 nova_compute[187208]: 2025-12-05 12:08:18.617 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:08:18 compute-0 nova_compute[187208]: 2025-12-05 12:08:18.617 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:18 compute-0 nova_compute[187208]: 2025-12-05 12:08:18.617 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:18 compute-0 nova_compute[187208]: 2025-12-05 12:08:18.617 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:08:18 compute-0 nova_compute[187208]: 2025-12-05 12:08:18.619 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Successfully created port: df4eecd2-b2e2-445a-acac-232f66123555 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:08:19 compute-0 ovn_controller[95610]: 2025-12-05T12:08:19Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b8:01:47 10.100.0.3
Dec 05 12:08:19 compute-0 ovn_controller[95610]: 2025-12-05T12:08:19Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:01:47 10.100.0.3
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.509 187212 DEBUG nova.compute.manager [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received event network-vif-unplugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.510 187212 DEBUG oslo_concurrency.lockutils [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.510 187212 DEBUG oslo_concurrency.lockutils [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.510 187212 DEBUG oslo_concurrency.lockutils [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.510 187212 DEBUG nova.compute.manager [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] No waiting events found dispatching network-vif-unplugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.511 187212 WARNING nova.compute.manager [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received unexpected event network-vif-unplugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 for instance with vm_state paused and task_state shelving_image_uploading.
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.803 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Snapshot image upload complete
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.804 187212 DEBUG nova.compute.manager [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.858 187212 INFO nova.compute.manager [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Shelve offloading
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.865 187212 INFO nova.virt.libvirt.driver [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance destroyed successfully.
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.865 187212 DEBUG nova.compute.manager [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.868 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.868 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:19 compute-0 nova_compute[187208]: 2025-12-05 12:08:19.869 187212 DEBUG nova.network.neutron [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.043 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Successfully updated port: df4eecd2-b2e2-445a-acac-232f66123555 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.061 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.061 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquired lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.061 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.063 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.063 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.063 187212 DEBUG nova.objects.instance [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.310 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.346 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.347 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.379 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.380 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.380 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.380 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] No waiting events found dispatching network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 WARNING nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received unexpected event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 for instance with vm_state active and task_state None.
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Processing event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.383 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.383 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.383 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.383 187212 WARNING nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state building and task_state spawning.
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-d35fce09-856e-4ebf-b944-0c0953a9492b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port d35fce09-856e-4ebf-b944-0c0953a9492b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.393 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.397 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936500.3971863, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.397 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Resumed (Lifecycle Event)
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.399 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.402 187212 INFO nova.virt.libvirt.driver [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance spawned successfully.
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.403 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.429 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.434 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.439 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.439 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.439 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.440 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.440 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.441 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.471 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.508 187212 INFO nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Took 20.62 seconds to spawn the instance on the hypervisor.
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.509 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.575 187212 INFO nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Took 21.48 seconds to build instance.
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.595 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.714 187212 DEBUG nova.objects.instance [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:20 compute-0 nova_compute[187208]: 2025-12-05 12:08:20.736 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.106 187212 DEBUG nova.policy [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.712 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Updating instance_info_cache with network_info: [{"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.740 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Releasing lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.740 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance network_info: |[{"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.742 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start _get_guest_xml network_info=[{"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.747 187212 WARNING nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.753 187212 DEBUG nova.network.neutron [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Updating instance_info_cache with network_info: [{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.754 187212 DEBUG nova.virt.libvirt.host [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.755 187212 DEBUG nova.virt.libvirt.host [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.758 187212 DEBUG nova.virt.libvirt.host [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.758 187212 DEBUG nova.virt.libvirt.host [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.758 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.759 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.759 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.759 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.761 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.761 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.761 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.765 187212 DEBUG nova.virt.libvirt.vif [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:08:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-959694714',display_name='tempest-ServerMetadataNegativeTestJSON-server-959694714',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-959694714',id=71,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf30ed1956544c7eae67c989042126e4',ramdisk_id='',reservation_id='r-h9tu7hr7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-91345283',owner_user_name=
'tempest-ServerMetadataNegativeTestJSON-91345283-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:08:17Z,user_data=None,user_id='132d581de02e49b9a4c99b9b831dd5b5',uuid=b235a96f-7a12-4bd2-8627-33b128346aa4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.765 187212 DEBUG nova.network.os_vif_util [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converting VIF {"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.766 187212 DEBUG nova.network.os_vif_util [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.767 187212 DEBUG nova.objects.instance [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lazy-loading 'pci_devices' on Instance uuid b235a96f-7a12-4bd2-8627-33b128346aa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.769 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.780 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <uuid>b235a96f-7a12-4bd2-8627-33b128346aa4</uuid>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <name>instance-00000047</name>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerMetadataNegativeTestJSON-server-959694714</nova:name>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:08:21</nova:creationTime>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:08:21 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:08:21 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:08:21 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:08:21 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:08:21 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:08:21 compute-0 nova_compute[187208]:         <nova:user uuid="132d581de02e49b9a4c99b9b831dd5b5">tempest-ServerMetadataNegativeTestJSON-91345283-project-member</nova:user>
Dec 05 12:08:21 compute-0 nova_compute[187208]:         <nova:project uuid="bf30ed1956544c7eae67c989042126e4">tempest-ServerMetadataNegativeTestJSON-91345283</nova:project>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:08:21 compute-0 nova_compute[187208]:         <nova:port uuid="df4eecd2-b2e2-445a-acac-232f66123555">
Dec 05 12:08:21 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <system>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <entry name="serial">b235a96f-7a12-4bd2-8627-33b128346aa4</entry>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <entry name="uuid">b235a96f-7a12-4bd2-8627-33b128346aa4</entry>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     </system>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <os>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   </os>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <features>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   </features>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.config"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:40:3b:49"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <target dev="tapdf4eecd2-b2"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/console.log" append="off"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <video>
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     </video>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:08:21 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:08:21 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:08:21 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:08:21 compute-0 nova_compute[187208]: </domain>
Dec 05 12:08:21 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.781 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Preparing to wait for external event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.781 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.781 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.781 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.782 187212 DEBUG nova.virt.libvirt.vif [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:08:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-959694714',display_name='tempest-ServerMetadataNegativeTestJSON-server-959694714',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-959694714',id=71,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf30ed1956544c7eae67c989042126e4',ramdisk_id='',reservation_id='r-h9tu7hr7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-91345283',owner_user_name='tempest-ServerMetadataNegativeTestJSON-91345283-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:08:17Z,user_data=None,user_id='132d581de02e49b9a4c99b9b831dd5b5',uuid=b235a96f-7a12-4bd2-8627-33b128346aa4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.782 187212 DEBUG nova.network.os_vif_util [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converting VIF {"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.783 187212 DEBUG nova.network.os_vif_util [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.783 187212 DEBUG os_vif [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.784 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.784 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.788 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.788 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf4eecd2-b2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.788 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf4eecd2-b2, col_values=(('external_ids', {'iface-id': 'df4eecd2-b2e2-445a-acac-232f66123555', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:3b:49', 'vm-uuid': 'b235a96f-7a12-4bd2-8627-33b128346aa4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:21 compute-0 NetworkManager[55691]: <info>  [1764936501.7915] manager: (tapdf4eecd2-b2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.798 187212 INFO os_vif [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2')
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.850 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.850 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.851 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] No VIF found with MAC fa:16:3e:40:3b:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:21 compute-0 nova_compute[187208]: 2025-12-05 12:08:21.852 187212 INFO nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Using config drive
Dec 05 12:08:22 compute-0 ovn_controller[95610]: 2025-12-05T12:08:22Z|00611|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec 05 12:08:22 compute-0 ovn_controller[95610]: 2025-12-05T12:08:22Z|00612|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:08:22 compute-0 ovn_controller[95610]: 2025-12-05T12:08:22Z|00613|binding|INFO|Releasing lport cf881e66-1434-41ee-aff2-459b4b74bf50 from this chassis (sb_readonly=0)
Dec 05 12:08:22 compute-0 ovn_controller[95610]: 2025-12-05T12:08:22Z|00614|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.390 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.483 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.483 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.484 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.484 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.484 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.485 187212 INFO nova.compute.manager [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Terminating instance
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.486 187212 DEBUG nova.compute.manager [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:08:22 compute-0 kernel: tapb66066cc-97 (unregistering): left promiscuous mode
Dec 05 12:08:22 compute-0 NetworkManager[55691]: <info>  [1764936502.5138] device (tapb66066cc-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.574 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:22 compute-0 ovn_controller[95610]: 2025-12-05T12:08:22Z|00615|binding|INFO|Releasing lport b66066cc-97eb-4896-a98d-267498dedf74 from this chassis (sb_readonly=0)
Dec 05 12:08:22 compute-0 ovn_controller[95610]: 2025-12-05T12:08:22Z|00616|binding|INFO|Setting lport b66066cc-97eb-4896-a98d-267498dedf74 down in Southbound
Dec 05 12:08:22 compute-0 ovn_controller[95610]: 2025-12-05T12:08:22Z|00617|binding|INFO|Removing iface tapb66066cc-97 ovn-installed in OVS
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.588 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:b8:fe 10.100.0.8'], port_security=['fa:16:3e:b2:b8:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'feb2d7c8b49945a08355fc4f902f2786', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fdf3db2a-0067-4a50-8487-b97fc3fdd122', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8cd27338-7640-4d03-958e-44ccc0e8c5fb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b66066cc-97eb-4896-a98d-267498dedf74) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.589 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b66066cc-97eb-4896-a98d-267498dedf74 in datapath ba5c1b46-c606-429f-b268-8a88a7b3641a unbound from our chassis
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.589 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.591 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ba5c1b46-c606-429f-b268-8a88a7b3641a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:08:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.592 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a10c32-6f74-442c-a9b9-dad1f7ffe488]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.593 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a namespace which is not needed anymore
Dec 05 12:08:22 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000045.scope: Deactivated successfully.
Dec 05 12:08:22 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000045.scope: Consumed 8.913s CPU time.
Dec 05 12:08:22 compute-0 systemd-machined[153543]: Machine qemu-76-instance-00000045 terminated.
Dec 05 12:08:22 compute-0 podman[229514]: 2025-12-05 12:08:22.673913196 +0000 UTC m=+0.099442277 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 05 12:08:22 compute-0 NetworkManager[55691]: <info>  [1764936502.7110] manager: (tapb66066cc-97): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Dec 05 12:08:22 compute-0 systemd-udevd[229521]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.777 187212 INFO nova.virt.libvirt.driver [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance destroyed successfully.
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.778 187212 DEBUG nova.objects.instance [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lazy-loading 'resources' on Instance uuid 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.793 187212 DEBUG nova.virt.libvirt.vif [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1497746963',display_name='tempest-ServerAddressesNegativeTestJSON-server-1497746963',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1497746963',id=69,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='feb2d7c8b49945a08355fc4f902f2786',ramdisk_id='',reservation_id='r-wvd553zf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-717599576',owner_user_name='tempest-ServerAddressesNegativeTestJSON-717599576-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:13Z,user_data=None,user_id='21ddc7a76417447daa2a5a26cdf17d53',uuid=3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.793 187212 DEBUG nova.network.os_vif_util [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converting VIF {"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.794 187212 DEBUG nova.network.os_vif_util [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.794 187212 DEBUG os_vif [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.796 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.796 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb66066cc-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.800 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.809 187212 INFO os_vif [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97')
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.810 187212 INFO nova.virt.libvirt.driver [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Deleting instance files /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf_del
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.811 187212 INFO nova.virt.libvirt.driver [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Deletion of /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf_del complete
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.856 187212 INFO nova.compute.manager [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Took 0.37 seconds to destroy the instance on the hypervisor.
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.857 187212 DEBUG oslo.service.loopingcall [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.857 187212 DEBUG nova.compute.manager [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.858 187212 DEBUG nova.network.neutron [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.891 187212 INFO nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Creating config drive at /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.config
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.896 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2bfra8dh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.924 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully created port: af04237a-1f79-4f68-a18e-1ceb4911605b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:08:22 compute-0 nova_compute[187208]: 2025-12-05 12:08:22.932 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:22 compute-0 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [NOTICE]   (229365) : haproxy version is 2.8.14-c23fe91
Dec 05 12:08:22 compute-0 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [NOTICE]   (229365) : path to executable is /usr/sbin/haproxy
Dec 05 12:08:22 compute-0 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [WARNING]  (229365) : Exiting Master process...
Dec 05 12:08:22 compute-0 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [ALERT]    (229365) : Current worker (229367) exited with code 143 (Terminated)
Dec 05 12:08:22 compute-0 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [WARNING]  (229365) : All workers exited. Exiting... (0)
Dec 05 12:08:22 compute-0 systemd[1]: libpod-a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5.scope: Deactivated successfully.
Dec 05 12:08:22 compute-0 podman[229557]: 2025-12-05 12:08:22.944444566 +0000 UTC m=+0.239881422 container died a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.028 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2bfra8dh" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.087 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port d35fce09-856e-4ebf-b944-0c0953a9492b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.087 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:23 compute-0 kernel: tapdf4eecd2-b2: entered promiscuous mode
Dec 05 12:08:23 compute-0 NetworkManager[55691]: <info>  [1764936503.0976] manager: (tapdf4eecd2-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/251)
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.101 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:23 compute-0 ovn_controller[95610]: 2025-12-05T12:08:23Z|00618|binding|INFO|Claiming lport df4eecd2-b2e2-445a-acac-232f66123555 for this chassis.
Dec 05 12:08:23 compute-0 ovn_controller[95610]: 2025-12-05T12:08:23Z|00619|binding|INFO|df4eecd2-b2e2-445a-acac-232f66123555: Claiming fa:16:3e:40:3b:49 10.100.0.11
Dec 05 12:08:23 compute-0 NetworkManager[55691]: <info>  [1764936503.1100] device (tapdf4eecd2-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:08:23 compute-0 NetworkManager[55691]: <info>  [1764936503.1115] device (tapdf4eecd2-b2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:08:23 compute-0 ovn_controller[95610]: 2025-12-05T12:08:23Z|00620|binding|INFO|Setting lport df4eecd2-b2e2-445a-acac-232f66123555 ovn-installed in OVS
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.122 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.124 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:23 compute-0 systemd-machined[153543]: New machine qemu-78-instance-00000047.
Dec 05 12:08:23 compute-0 systemd[1]: Started Virtual Machine qemu-78-instance-00000047.
Dec 05 12:08:23 compute-0 ovn_controller[95610]: 2025-12-05T12:08:23Z|00621|binding|INFO|Setting lport df4eecd2-b2e2-445a-acac-232f66123555 up in Southbound
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.179 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:3b:49 10.100.0.11'], port_security=['fa:16:3e:40:3b:49 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf30ed1956544c7eae67c989042126e4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c4ee2104-41f1-480e-ab3a-db882b9c2d98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bb90128-3616-41a6-a999-156ce64fbcf7, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=df4eecd2-b2e2-445a-acac-232f66123555) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.202 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5-userdata-shm.mount: Deactivated successfully.
Dec 05 12:08:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-c4532b7759c030458be51c068284cef48e94b6377a17ce91486e109cbe6a7f64-merged.mount: Deactivated successfully.
Dec 05 12:08:23 compute-0 podman[229557]: 2025-12-05 12:08:23.332483297 +0000 UTC m=+0.627920143 container cleanup a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:08:23 compute-0 systemd[1]: libpod-conmon-a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5.scope: Deactivated successfully.
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.541 187212 DEBUG nova.network.neutron [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.558 187212 INFO nova.compute.manager [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Took 0.70 seconds to deallocate network for instance.
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.600 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.600 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:23 compute-0 podman[229629]: 2025-12-05 12:08:23.655398847 +0000 UTC m=+0.296014991 container remove a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.662 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d43d98ce-6f1a-4bc9-b456-621a19ac3e04]: (4, ('Fri Dec  5 12:08:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a (a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5)\na66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5\nFri Dec  5 12:08:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a (a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5)\na66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.667 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[88ccd2b5-da90-4d5a-b525-264d03ac3fe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.668 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba5c1b46-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.700 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:23 compute-0 kernel: tapba5c1b46-c0: left promiscuous mode
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.721 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.724 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.729 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a0be8750-06d0-4fdf-b697-3b5259fa6b0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.743 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f64e9cb-346b-4319-8f15-c291a8cfbbd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.744 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[254951da-7c25-4279-b2f5-e023d326aba6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.761 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[538f8118-7705-4c28-9355-720f91e8a10e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386898, 'reachable_time': 35191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229647, 'error': None, 'target': 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 systemd[1]: run-netns-ovnmeta\x2dba5c1b46\x2dc606\x2d429f\x2db268\x2d8a88a7b3641a.mount: Deactivated successfully.
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.766 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.766 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a95a30ea-d5bf-435e-b58d-c18fb195322b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.767 104471 INFO neutron.agent.ovn.metadata.agent [-] Port df4eecd2-b2e2-445a-acac-232f66123555 in datapath 02d8cc87-efdf-4db2-b7ab-393e2480966a unbound from our chassis
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.770 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02d8cc87-efdf-4db2-b7ab-393e2480966a
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.771 187212 DEBUG nova.compute.provider_tree [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.781 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cecd87c7-ed40-48ba-8440-0c66d3cc5949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.782 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap02d8cc87-e1 in ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.784 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap02d8cc87-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.784 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91af8f47-5d2a-4a62-8afb-85ffbc7ed9b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.785 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5cfe9c1f-0bd3-44e9-aa77-86a35850a6a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.784 187212 DEBUG nova.scheduler.client.report [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.801 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[5187c169-b2ea-4c8d-b692-1b92ce38ad4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.806 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.827 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f40dd680-b448-4af6-b5c1-a237e078ce68]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.829 187212 INFO nova.scheduler.client.report [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Deleted allocations for instance 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.856 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4a0780af-45c9-4d41-9552-2b13fda9c371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.861 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7baddfa4-e467-4d8d-864e-cfdc8649f2ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 NetworkManager[55691]: <info>  [1764936503.8633] manager: (tap02d8cc87-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/252)
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.894 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.895 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a454ff82-0af6-4ceb-a689-094a9f685687]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.900 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a63bb927-13e1-43ed-8441-f2e638e53575]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 NetworkManager[55691]: <info>  [1764936503.9229] device (tap02d8cc87-e0): carrier: link connected
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.926 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1624e3e2-c040-42e3-9055-5bd203a22b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.944 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0a0ae1-77c9-4f99-8a6b-4eb6ca20b572]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02d8cc87-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:79:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388448, 'reachable_time': 24812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229683, 'error': None, 'target': 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.961 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac907d3-af80-4a06-abdf-ec956fea284c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:795a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388448, 'tstamp': 388448}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229685, 'error': None, 'target': 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.976 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[36e27eb3-e62f-4a94-b27f-571766fba41b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02d8cc87-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:79:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388448, 'reachable_time': 27146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229686, 'error': None, 'target': 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.982 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936503.9821005, b235a96f-7a12-4bd2-8627-33b128346aa4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:23 compute-0 nova_compute[187208]: 2025-12-05 12:08:23.983 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] VM Started (Lifecycle Event)
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.004 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.009 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936503.9822052, b235a96f-7a12-4bd2-8627-33b128346aa4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.010 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] VM Paused (Lifecycle Event)
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.017 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[837b553a-49c7-4f9d-ba9e-c6494e537b50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.037 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.041 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.069 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.073 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d8e6db9-b36f-4000-8b61-30ee798359ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.075 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02d8cc87-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.075 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.076 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02d8cc87-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:24 compute-0 NetworkManager[55691]: <info>  [1764936504.0792] manager: (tap02d8cc87-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Dec 05 12:08:24 compute-0 kernel: tap02d8cc87-e0: entered promiscuous mode
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.084 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.088 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02d8cc87-e0, col_values=(('external_ids', {'iface-id': '0dffa729-6b55-4e58-afef-f1cdc22c22fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.089 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:24 compute-0 ovn_controller[95610]: 2025-12-05T12:08:24Z|00622|binding|INFO|Releasing lport 0dffa729-6b55-4e58-afef-f1cdc22c22fb from this chassis (sb_readonly=0)
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.106 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.107 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.108 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02d8cc87-efdf-4db2-b7ab-393e2480966a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02d8cc87-efdf-4db2-b7ab-393e2480966a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.110 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb2f136-ff40-46b9-8250-07a167e8dbd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.111 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-02d8cc87-efdf-4db2-b7ab-393e2480966a
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/02d8cc87-efdf-4db2-b7ab-393e2480966a.pid.haproxy
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 02d8cc87-efdf-4db2-b7ab-393e2480966a
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:08:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.112 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'env', 'PROCESS_TAG=haproxy-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/02d8cc87-efdf-4db2-b7ab-393e2480966a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.190 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.190 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 WARNING nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b for instance with vm_state active and task_state None.
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 WARNING nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b for instance with vm_state active and task_state None.
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-changed-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Refreshing instance network info cache due to event network-changed-df4eecd2-b2e2-445a-acac-232f66123555. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG nova.network.neutron [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Refreshing network info cache for port df4eecd2-b2e2-445a-acac-232f66123555 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.444 187212 INFO nova.virt.libvirt.driver [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance destroyed successfully.
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.445 187212 DEBUG nova.objects.instance [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'resources' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.459 187212 DEBUG nova.virt.libvirt.vif [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1539570170',display_name='tempest-ServerActionsTestOtherB-server-1539570170',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1539570170',id=68,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-bfsh2n18',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member',shelved_at='2025-12-05T12:08:19.804531',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='4f3e32d3-f28d-4124-97de-ec6d4f73bf1d'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:17Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=5659bd52-8c24-483d-80a4-8eb6b28e1349,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.460 187212 DEBUG nova.network.os_vif_util [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.461 187212 DEBUG nova.network.os_vif_util [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.461 187212 DEBUG os_vif [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.464 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.464 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29e412e9-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.467 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.471 187212 INFO os_vif [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3')
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.472 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Deleting instance files /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349_del
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.473 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Deletion of /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349_del complete
Dec 05 12:08:24 compute-0 podman[229721]: 2025-12-05 12:08:24.539977045 +0000 UTC m=+0.066084348 container create 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.564 187212 INFO nova.scheduler.client.report [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Deleted allocations for instance 5659bd52-8c24-483d-80a4-8eb6b28e1349
Dec 05 12:08:24 compute-0 systemd[1]: Started libpod-conmon-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1.scope.
Dec 05 12:08:24 compute-0 podman[229721]: 2025-12-05 12:08:24.507478202 +0000 UTC m=+0.033585525 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.606 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.606 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:24 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:08:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3936c809ee6a8001892b1f5e8b230731bdba206d1aa032e18836cd92f8d64675/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:08:24 compute-0 podman[229721]: 2025-12-05 12:08:24.633142979 +0000 UTC m=+0.159250302 container init 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:08:24 compute-0 podman[229721]: 2025-12-05 12:08:24.638812103 +0000 UTC m=+0.164919406 container start 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:08:24 compute-0 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [NOTICE]   (229739) : New worker (229741) forked
Dec 05 12:08:24 compute-0 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [NOTICE]   (229739) : Loading success.
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.761 187212 DEBUG nova.compute.provider_tree [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.779 187212 DEBUG nova.scheduler.client.report [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.808 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:24 compute-0 nova_compute[187208]: 2025-12-05 12:08:24.862 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 8.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:25 compute-0 nova_compute[187208]: 2025-12-05 12:08:25.090 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully updated port: af04237a-1f79-4f68-a18e-1ceb4911605b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:08:25 compute-0 nova_compute[187208]: 2025-12-05 12:08:25.104 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:25 compute-0 nova_compute[187208]: 2025-12-05 12:08:25.105 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:25 compute-0 nova_compute[187208]: 2025-12-05 12:08:25.105 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:08:25 compute-0 nova_compute[187208]: 2025-12-05 12:08:25.381 187212 WARNING nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it
Dec 05 12:08:25 compute-0 nova_compute[187208]: 2025-12-05 12:08:25.382 187212 WARNING nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it
Dec 05 12:08:25 compute-0 nova_compute[187208]: 2025-12-05 12:08:25.739 187212 DEBUG nova.network.neutron [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Updated VIF entry in instance network info cache for port df4eecd2-b2e2-445a-acac-232f66123555. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:08:25 compute-0 nova_compute[187208]: 2025-12-05 12:08:25.739 187212 DEBUG nova.network.neutron [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Updating instance_info_cache with network_info: [{"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:25 compute-0 nova_compute[187208]: 2025-12-05 12:08:25.760 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.565 187212 INFO nova.compute.manager [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Rescuing
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.566 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.566 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.567 187212 DEBUG nova.network.neutron [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.972 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-unplugged-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.972 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.972 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] No waiting events found dispatching network-vif-unplugged-b66066cc-97eb-4896-a98d-267498dedf74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 WARNING nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received unexpected event network-vif-unplugged-b66066cc-97eb-4896-a98d-267498dedf74 for instance with vm_state deleted and task_state None.
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] No waiting events found dispatching network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 WARNING nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received unexpected event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 for instance with vm_state deleted and task_state None.
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-af04237a-1f79-4f68-a18e-1ceb4911605b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:08:27 compute-0 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.866 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.928 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.929 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.929 187212 DEBUG nova.network.neutron [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port af04237a-1f79-4f68-a18e-1ceb4911605b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.932 187212 DEBUG nova.virt.libvirt.vif [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.932 187212 DEBUG nova.network.os_vif_util [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.933 187212 DEBUG nova.network.os_vif_util [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.933 187212 DEBUG os_vif [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.934 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.934 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.934 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.937 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf04237a-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.937 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf04237a-1f, col_values=(('external_ids', {'iface-id': 'af04237a-1f79-4f68-a18e-1ceb4911605b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:f6:34', 'vm-uuid': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.939 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:28 compute-0 NetworkManager[55691]: <info>  [1764936508.9399] manager: (tapaf04237a-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.953 187212 INFO os_vif [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f')
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.956 187212 DEBUG nova.virt.libvirt.vif [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.956 187212 DEBUG nova.network.os_vif_util [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.956 187212 DEBUG nova.network.os_vif_util [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.962 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.963 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.963 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.963 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.964 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Processing event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.965 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-deleted-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.966 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.966 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.966 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.966 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.967 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] No waiting events found dispatching network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.967 187212 WARNING nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received unexpected event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 for instance with vm_state building and task_state spawning.
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.968 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.971 187212 DEBUG nova.virt.libvirt.guest [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec 05 12:08:28 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:54:f6:34"/>
Dec 05 12:08:28 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:08:28 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:08:28 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:08:28 compute-0 nova_compute[187208]:   <target dev="tapaf04237a-1f"/>
Dec 05 12:08:28 compute-0 nova_compute[187208]: </interface>
Dec 05 12:08:28 compute-0 nova_compute[187208]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.975 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936508.9725707, b235a96f-7a12-4bd2-8627-33b128346aa4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.975 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] VM Resumed (Lifecycle Event)
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.979 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:08:28 compute-0 kernel: tapaf04237a-1f: entered promiscuous mode
Dec 05 12:08:28 compute-0 NetworkManager[55691]: <info>  [1764936508.9860] manager: (tapaf04237a-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/255)
Dec 05 12:08:28 compute-0 nova_compute[187208]: 2025-12-05 12:08:28.986 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:28 compute-0 ovn_controller[95610]: 2025-12-05T12:08:28Z|00623|binding|INFO|Claiming lport af04237a-1f79-4f68-a18e-1ceb4911605b for this chassis.
Dec 05 12:08:28 compute-0 ovn_controller[95610]: 2025-12-05T12:08:28Z|00624|binding|INFO|af04237a-1f79-4f68-a18e-1ceb4911605b: Claiming fa:16:3e:54:f6:34 10.100.0.10
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:28.997 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:f6:34 10.100.0.10'], port_security=['fa:16:3e:54:f6:34 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=af04237a-1f79-4f68-a18e-1ceb4911605b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:28.998 104471 INFO neutron.agent.ovn.metadata.agent [-] Port af04237a-1f79-4f68-a18e-1ceb4911605b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.001 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.002 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:29 compute-0 ovn_controller[95610]: 2025-12-05T12:08:29Z|00625|binding|INFO|Setting lport af04237a-1f79-4f68-a18e-1ceb4911605b ovn-installed in OVS
Dec 05 12:08:29 compute-0 ovn_controller[95610]: 2025-12-05T12:08:29Z|00626|binding|INFO|Setting lport af04237a-1f79-4f68-a18e-1ceb4911605b up in Southbound
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.009 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.011 187212 INFO nova.virt.libvirt.driver [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance spawned successfully.
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.011 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.025 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.021 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cbaefc7a-aaa8-4cce-a688-f95fb3d3f28c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:29 compute-0 systemd-udevd[229758]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:08:29 compute-0 NetworkManager[55691]: <info>  [1764936509.0539] device (tapaf04237a-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:08:29 compute-0 NetworkManager[55691]: <info>  [1764936509.0545] device (tapaf04237a-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.054 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.055 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.055 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.055 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.056 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.056 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.061 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[621d4b56-19b2-41a3-b17a-e1824a3d3efb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.064 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b653d1b9-7752-44a1-be63-a2def887c341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.079 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.093 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9d2caf07-adcf-4db2-979c-d5fe2348b753]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.111 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8517e25-df0c-40fb-bcc1-d78b1cbdae20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229764, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.123 187212 INFO nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Took 11.68 seconds to spawn the instance on the hypervisor.
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.123 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.125 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[130d708a-c998-4c7c-970b-701a127451ed]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229765, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229765, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.127 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.130 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.131 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.131 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:29 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.132 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.134 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.134 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.134 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:01:99:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.135 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:b8:01:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.138 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:54:f6:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.190 187212 DEBUG nova.virt.libvirt.guest [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:08:29 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:08:29 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec 05 12:08:29 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:08:29</nova:creationTime>
Dec 05 12:08:29 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:08:29 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:08:29 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:08:29 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:08:29 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:08:29 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec 05 12:08:29 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec 05 12:08:29 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec 05 12:08:29 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:08:29 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:08:29 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:08:29 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:08:29 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.246 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.279 187212 INFO nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Took 12.39 seconds to build instance.
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.329 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.573 187212 DEBUG nova.network.neutron [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:29 compute-0 nova_compute[187208]: 2025-12-05 12:08:29.594 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:30 compute-0 podman[229767]: 2025-12-05 12:08:30.208665851 +0000 UTC m=+0.054256725 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:08:30 compute-0 podman[229766]: 2025-12-05 12:08:30.225727176 +0000 UTC m=+0.075344207 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 05 12:08:30 compute-0 nova_compute[187208]: 2025-12-05 12:08:30.261 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:08:31 compute-0 ovn_controller[95610]: 2025-12-05T12:08:31Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:54:f6:34 10.100.0.10
Dec 05 12:08:31 compute-0 ovn_controller[95610]: 2025-12-05T12:08:31Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:54:f6:34 10.100.0.10
Dec 05 12:08:32 compute-0 nova_compute[187208]: 2025-12-05 12:08:32.003 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936497.0025473, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:32 compute-0 nova_compute[187208]: 2025-12-05 12:08:32.004 187212 INFO nova.compute.manager [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Stopped (Lifecycle Event)
Dec 05 12:08:32 compute-0 nova_compute[187208]: 2025-12-05 12:08:32.401 187212 DEBUG nova.compute.manager [None req-3ff18a4c-6715-4e1d-89df-9cc7019e3e42 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:32 compute-0 nova_compute[187208]: 2025-12-05 12:08:32.925 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.270 187212 DEBUG nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.270 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.270 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 DEBUG nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 WARNING nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b for instance with vm_state active and task_state None.
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 DEBUG nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.272 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.272 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.272 187212 DEBUG nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.272 187212 WARNING nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b for instance with vm_state active and task_state None.
Dec 05 12:08:33 compute-0 ovn_controller[95610]: 2025-12-05T12:08:33Z|00627|binding|INFO|Releasing lport 0dffa729-6b55-4e58-afef-f1cdc22c22fb from this chassis (sb_readonly=0)
Dec 05 12:08:33 compute-0 ovn_controller[95610]: 2025-12-05T12:08:33Z|00628|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec 05 12:08:33 compute-0 ovn_controller[95610]: 2025-12-05T12:08:33Z|00629|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:08:33 compute-0 ovn_controller[95610]: 2025-12-05T12:08:33Z|00630|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.449 187212 DEBUG nova.network.neutron [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port af04237a-1f79-4f68-a18e-1ceb4911605b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.450 187212 DEBUG nova.network.neutron [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.467 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.479 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:33 compute-0 nova_compute[187208]: 2025-12-05 12:08:33.939 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:34 compute-0 nova_compute[187208]: 2025-12-05 12:08:34.285 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.237 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.238 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.238 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.238 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.238 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.240 187212 INFO nova.compute.manager [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Terminating instance
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.241 187212 DEBUG nova.compute.manager [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:08:35 compute-0 kernel: tap48b30c48-78 (unregistering): left promiscuous mode
Dec 05 12:08:35 compute-0 NetworkManager[55691]: <info>  [1764936515.2827] device (tap48b30c48-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:08:35 compute-0 ovn_controller[95610]: 2025-12-05T12:08:35Z|00631|binding|INFO|Releasing lport 48b30c48-7858-408b-aeab-df46f6277546 from this chassis (sb_readonly=0)
Dec 05 12:08:35 compute-0 ovn_controller[95610]: 2025-12-05T12:08:35Z|00632|binding|INFO|Setting lport 48b30c48-7858-408b-aeab-df46f6277546 down in Southbound
Dec 05 12:08:35 compute-0 ovn_controller[95610]: 2025-12-05T12:08:35Z|00633|binding|INFO|Removing iface tap48b30c48-78 ovn-installed in OVS
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.297 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.308 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bb:58 10.100.0.8'], port_security=['fa:16:3e:62:bb:58 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '57e94004-ae40-473b-8b25-6fa2c9e8cf2d 994c2a79-1398-403d-88c3-e4993363396a fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=48b30c48-7858-408b-aeab-df46f6277546) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.309 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 48b30c48-7858-408b-aeab-df46f6277546 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 unbound from our chassis
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.311 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd355bd0-560e-4b18-a504-3a5134c930f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac3f7af-e35f-4656-80e9-2723304e8825]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.317 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 namespace which is not needed anymore
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:35 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Dec 05 12:08:35 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000003f.scope: Consumed 18.388s CPU time.
Dec 05 12:08:35 compute-0 systemd-machined[153543]: Machine qemu-67-instance-0000003f terminated.
Dec 05 12:08:35 compute-0 podman[229838]: 2025-12-05 12:08:35.368639224 +0000 UTC m=+0.065813061 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.381 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-08b15784-5374-4fb3-9f63-82412f709db4" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.382 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-08b15784-5374-4fb3-9f63-82412f709db4" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.383 187212 DEBUG nova.objects.instance [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:35 compute-0 podman[229841]: 2025-12-05 12:08:35.405076201 +0000 UTC m=+0.088338164 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.451 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000047', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'bf30ed1956544c7eae67c989042126e4', 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'hostId': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:08:35 compute-0 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [NOTICE]   (226767) : haproxy version is 2.8.14-c23fe91
Dec 05 12:08:35 compute-0 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [NOTICE]   (226767) : path to executable is /usr/sbin/haproxy
Dec 05 12:08:35 compute-0 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [WARNING]  (226767) : Exiting Master process...
Dec 05 12:08:35 compute-0 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [WARNING]  (226767) : Exiting Master process...
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.458 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'name': 'tempest-ServerActionsTestOtherB-server-63085993', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000041', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58cbd93e463049988ccd6d013893e7d6', 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'hostId': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:08:35 compute-0 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [ALERT]    (226767) : Current worker (226769) exited with code 143 (Terminated)
Dec 05 12:08:35 compute-0 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [WARNING]  (226767) : All workers exited. Exiting... (0)
Dec 05 12:08:35 compute-0 systemd[1]: libpod-3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3.scope: Deactivated successfully.
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:35 compute-0 podman[229904]: 2025-12-05 12:08:35.467820692 +0000 UTC m=+0.056002546 container died 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.480 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3-userdata-shm.mount: Deactivated successfully.
Dec 05 12:08:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-243fd1c5de96a14826aa2f40632d2c7e7d72bd7fcfdbb36dbcf9215a94d0a31f-merged.mount: Deactivated successfully.
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.510 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '442a804e3368417d9de1636d533a25e0', 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'hostId': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.513 187212 INFO nova.virt.libvirt.driver [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance destroyed successfully.
Dec 05 12:08:35 compute-0 podman[229904]: 2025-12-05 12:08:35.513872858 +0000 UTC m=+0.102054712 container cleanup 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.513 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000046', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e846fccb774e44f585d8847897bc4229', 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'hostId': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.514 187212 DEBUG nova.objects.instance [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'resources' on Instance uuid e9f9bf08-7688-4213-91ff-74f2271ec71d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.516 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000043', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98681240c47b41cba28d91e1c11fd71f', 'user_id': '242b773b0af24caf814e2a84178332d5', 'hostId': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.518 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000036', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58cbd93e463049988ccd6d013893e7d6', 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'hostId': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.518 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 12:08:35 compute-0 systemd[1]: libpod-conmon-3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3.scope: Deactivated successfully.
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.532 187212 DEBUG nova.virt.libvirt.vif [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1685847021',display_name='tempest-SecurityGroupsTestJSON-server-1685847021',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1685847021',id=63,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-52243xe8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:34Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=e9f9bf08-7688-4213-91ff-74f2271ec71d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.532 187212 DEBUG nova.network.os_vif_util [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.533 187212 DEBUG nova.network.os_vif_util [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.533 187212 DEBUG os_vif [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.535 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.536 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48b30c48-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.539 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.542 187212 INFO os_vif [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78')
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.542 187212 INFO nova.virt.libvirt.driver [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Deleting instance files /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d_del
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.543 187212 INFO nova.virt.libvirt.driver [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Deletion of /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d_del complete
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.558 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.559 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 podman[229948]: 2025-12-05 12:08:35.586428824 +0000 UTC m=+0.047210651 container remove 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.587 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.588 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.589 12 DEBUG ceilometer.compute.pollsters [-] Instance e9f9bf08-7688-4213-91ff-74f2271ec71d was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.596 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1806fd-f66f-41e4-96fc-305c12657505]: (4, ('Fri Dec  5 12:08:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 (3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3)\n3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3\nFri Dec  5 12:08:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 (3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3)\n3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.600 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cd3ad137-9fc9-4892-a8b9-5dcbd06dda2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.602 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:35 compute-0 kernel: tapdd355bd0-50: left promiscuous mode
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.617 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.623 187212 INFO nova.compute.manager [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.624 187212 DEBUG oslo.service.loopingcall [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.624 187212 DEBUG nova.compute.manager [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:08:35 compute-0 nova_compute[187208]: 2025-12-05 12:08:35.624 187212 DEBUG nova.network.neutron [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.628 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.628 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.631 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[13177d45-fa8b-4396-960b-56d16eccfb44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.653 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[04b58368-8ba2-4ece-8498-7cbe0d1a446e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.655 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb817bd-dd57-40ad-8881-b7c674b6a788]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.671 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9470fe06-033b-403d-9b20-2595ecbd9799]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376688, 'reachable_time': 30255, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229963, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:35 compute-0 systemd[1]: run-netns-ovnmeta\x2ddd355bd0\x2d560e\x2d4b18\x2da504\x2d3a5134c930f4.mount: Deactivated successfully.
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.676 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:08:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.676 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0d946dc6-48b5-4cb6-b42b-8212fa1f1d02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.683 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.bytes volume: 73019392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.683 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.711 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.bytes volume: 73105408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.712 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb8c4880-5e08-48cc-839e-eadb52b740a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '201d729a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '05ce5119b35f1c15bb70015d35f675c20943ac38e5cf80130865442d6302287e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 
'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '201d7f38-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '3aa8da13f6537be734610b3d00dc2921f099831d6f3e9617ca06c0a3f381c053'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2021e53c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': 'cdb4184af8911c1a5542c1eec99f9823ffcda3da0be45878e44ae03ce76c5525'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2021efa0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '778c5a04b03139cd59cf0fe5f3d3dcc3ae64241e98ec4611cae116dfaabeedd4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 
'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2028091c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '8d8e161b8695456bfeacc512e972dcaade5f950cb44253ae1483b96b2f349f33'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, '
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '20281498-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '7762c0161d44fe9d1305999163af85e9750b34b55f6fedfa77316f16557d8198'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73019392, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20306ec2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'bbeacdc1c1b0e83b93167431b711a16d24286dab58d99c959efe67e2669833f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 
'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '20307b4c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '1c051f7bd09d67bec0a5abf502ad90ad0e675311b4a9b1980bab797ccc224975'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73105408, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2034c256-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '4e66922adfdad750824001caf1945d0587a03653a24e6dd7d7e009b201d979a3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2034cf8a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '2cebb1a0dd1d1841251a31b13c309ac0bb9971bd520c5a822a442eabb59c3199'}]}, 'timestamp': '2025-12-05 12:08:35.712361', '_unique_id': '77b5c73fee9f49bb9b3b3dd3377796f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.714 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.718 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b235a96f-7a12-4bd2-8627-33b128346aa4 / tapdf4eecd2-b2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.719 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.721 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 854e3893-3908-4b4a-b29c-7fb4384e4f0c / tap1b4ab157-dd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.721 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.722 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.728 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2e537618-f998-4c4d-8e1e-e9cc79219330 / tap11c7fa90-6a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.728 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.733 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f1e72d05-87e7-495d-9dbb-1a10b112c69f / tapf7a6775e-6d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.734 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f1e72d05-87e7-495d-9dbb-1a10b112c69f / tapd35fce09-85 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.734 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f1e72d05-87e7-495d-9dbb-1a10b112c69f / tapaf04237a-1f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.735 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets volume: 32 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.735 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.735 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.737 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets volume: 34 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d19fad0-6fd5-45bb-ac70-dff93151e07f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '2035e71c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'fee47db44f162a3fd08d596d7a7da0a9bd15309cc9cc58761439137ea80f0418'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 
'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '20363852-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '7650bdac89fdfe95a9ca31feb8adf3d90c788f43b59e262eb3d3373f64389259'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '20375dcc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'ff58d63d451aa2a6aea7ecd244a2521212558ab1433273cec29e43b41b4e2ef8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 32, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '203854a2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'e7b7b9d594a828eb42e56c55333d138255e3df2bc1215e6c97d9c0e52951bba8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '20385f38-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '0e78845ca2b91f4041690b7f9bb60230c99b79c7cd9bb4a3d93056d25e43d2e1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 10, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata':
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]:  {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '20386758-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'c38f797055d222468e7d6a1bd0477b8fe880f865345f42f162f568e8caf73a99'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 34, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '2038bd5c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '8352f53fc8d6862c0549cd85ee8fab362a3de0ec4a26b683636f094674a7c6c2'}]}, 'timestamp': '2025-12-05 12:08:35.738134', '_unique_id': '8492777c5f844d69be0bc228e331b9d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>]
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.741 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.741 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.742 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.742 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.742 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.743 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '684fa6c5-8297-4948-bcde-04cf6db486b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '20392b48-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'ad1af126ea2fc9873f86bfb456f70274b7e13298888d39c7f4e917e70cfe9f36'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '20393840-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': 'a1c3000ab6f020c599d36fef60df1ee4ac76d5e3a370c1a6ee8133a5663e01b9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '20395fc8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'eb91623c92f4fa75292aed4f68de5727e9d1f2d7b1a283a99b83b6b893e643f6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '20396bda-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '19e36f7c7da8813e5065095a9e2b1806c11e286b67045fa3e7e5f2492351b9ee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '20397742-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '2c78b419613ff1f67308bfb72e4b4ac5f634ff2c9f0567c2df73914c832e25be'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 35.740572', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '2039828c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '15dce7b4e31b4b1d6f045064b5145f2095712f63e6ff46939cf934a684593742'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '20398eee-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': 'e4006fb7bfa503ba471a6f283fdaab4d05eb2a395d4212cc5311b725ec284013'}]}, 'timestamp': '2025-12-05 12:08:35.743459', '_unique_id': 'f9c2a1c40a394b70bcbe506f45760e59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.763 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/cpu volume: 6460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:35 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:35 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.800 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/cpu volume: 11640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.802 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.817 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/cpu volume: 11750000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.834 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/cpu volume: 11840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.849 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/cpu volume: 12410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef8736be-1abc-444c-bcfa-be4358f61eb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6460000000, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '203cab92-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.384245744, 'message_signature': 'e93ab0bd6cfadb4de6f6f58576f0d3b11233ae6c19c59fc87d32b5985197b768'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11640000000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 
'854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '204251dc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.421296659, 'message_signature': '1cc5efe385103b4e43035e373c362774b0201907338d7cacb7164ef298890f97'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11750000000, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2044ffe0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.438771116, 'message_signature': '0ed32c2ee2dcfb7421c7ce31f2a6b9c213fad6ef526909cab2588c44d801e4e9'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11840000000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '20478756-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.455383268, 'message_signature': 'b4c82fcb10738963592b05ad7ff1a3824922fcb36782a8b76da6cca44bc57ae2'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12410000000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 
'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2049d646-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.47061859, 'message_signature': 'b3ff0d3e0e75285aa1014e19563464cb66b0a88674e5a38594fff850f39a9476'}]}, 'timestamp': '2025-12-05 12:08:35.850313', '_unique_id': '0cd0abfbdb7c41e4a9ff95d34c4f0c21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.852 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.852 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.852 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.853 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.853 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.853 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.854 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.854 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.854 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56f2ab59-ec64-445d-8489-d9d0fe0f68cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '204a3af0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'f9c283d1980d97f2ba9e9db6ada088314ced6e1c1e2ec2d49111a92242ec5d1c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '204a441e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '06399d1f669ba9bc57eccd4f5738767c8d68c86e520c88ed0e4e702d2b1085c8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '204a666a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '8f22784105ec603a24910391a628597212ef4df43abb4c61504686a61f2945a6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '204a6ebc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '79248c156ec8964ae54c3c511d267e3d9ffeecf82e6282127b31a0874d32f1d8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '204a7ca4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'bf0a14baf20553b5ec460431995f1be90fd75428710e85fc641ce833c7ce3679'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:35.852
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 407', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '204a8780-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'ca7ceb7123b4d4815b4cea0e4bcc8c8a0e488f46b1b3541763dc5f72a4d56e94'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '204a91da-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '112eb29c08536e7732860c4f8cfc807c64e9529c9ea332d92925ac5986a25d85'}]}, 'timestamp': '2025-12-05 12:08:35.854927', '_unique_id': '3723bffae24345ed991a5fbd1b99bcca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.856 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.856 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.latency volume: 171140698 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.856 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.latency volume: 581557 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.857 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.latency volume: 175918589 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.857 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.latency volume: 18556500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.858 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.858 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.latency volume: 220016827 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.858 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.latency volume: 27449958 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.859 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.latency volume: 437318709 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.859 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.latency volume: 27936399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.860 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.latency volume: 227447368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.860 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.latency volume: 33644734 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0da2d582-4f3e-40c6-8f84-1ea8704be4b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 171140698, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204ae2b6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '48d0bd905c78517fda7a8d275cab62bc19e210c806f91c74a172fb9880fe1b8d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 581557, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 
'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204aee5a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '66eb2e2d42804a7920d2a8bc223343aa1620f89abe6c91c14a0c81d00a6dbdfe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 175918589, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204af850-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '9c7a42f37aed46bc0673069f64d1a5fbbd9564a1298c524781036fda334cb6da'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18556500, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204b01e2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '91933af07e9b3aaf4cbef7f065057a3c1d8775822ae61dd185921c550040ccdc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 220016827, 'user_id': 
'6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204b2a6e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': 'd991d3cc90d90a55a7847003cd93bc73bba4804b182bedfa123adc3c4437bfd6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27449958, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: ': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204b3392-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '9dfe42c9ad59b8b605138fd3e6f6e813eca9be246731e8b40bd50080695f8915'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 437318709, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204b3d38-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'dd69db352f1997eb9eb68126ee2cad6e698203eb621e93fb11ef72b592ec6209'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27936399, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204b633a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '0150a2b2d56408697c93ad7f9f1f2285b0ca5fe89b20bec99315dfce5eec6562'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 227447368, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204b7460-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '2b1d378e9b884b882bf5c781e5c5ea7ae07bc41c5cc3627eb5dfd7283339da06'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33644734, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204b7c76-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': 'efcbd0729d7831da5bfe630800050c1fa5d463864ae71efd9069baf8d6e0f284'}]}, 'timestamp': '2025-12-05 12:08:35.860906', '_unique_id': '7146063ded1d47dd9bb4f70cf3235dd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.875 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.875 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.888 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.888 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.889 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.898 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.899 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.910 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.911 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.922 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.922 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79d023e4-f7dd-4af2-946d-d2f9a2b938a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204dc49a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': '92652fdd6e1cc11a801964655f929d22359fdf6c3eae6714c6f637b813829ff2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 
'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204dd2a0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': 'd1ee9675c2f2047665e7ca41987050a5c649745ddd8a896b1b81928da07e766b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204fbc28-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': '1adac6ec33343fb0b8982e381b7353606d8f53004e3b32c7939e5c21dfd85dcc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204fc830-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': 'b62fea3cc7bd5c0b25669c8c564c5ab964178925558ca02234d1799684c41ba3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 
'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20515bfa-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': 'cbbe61c7b289ae1487bdd5a5fc7e73dacc34e9aa138aac6a17a7d72e778fdd7a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 'disk_name': 'sda'}, 'message_id': '205167d0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': 'c6f0e2bc36a206294f2ebd9c161fca9053b7abd685319ebb7a7fea185950dc6b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20533178-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '603113f12d9024a24a7a58811110ed0004407db8e02c9e9b49a6ccbd2d11d86e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.862932', 
'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2053412c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '58ec954ce9ac1cebba4f3d1d8b3847be9c91168e6625731636cab3fdf6179340'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2054ef04-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': 'ee38ed3c6485eb9ce7538ff885ffdc5e61e5d43b1ac49ec78ce814909675c436'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2054fefe-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': '9ecdd67f6991a70735ea5c187e8e66e83fdf2c9917d1ad18c2c7324dc65a060a'}]}, 'timestamp': '2025-12-05 12:08:35.923281', '_unique_id': 'b1c184d9c202449c95fc8eaec021659b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.925 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.925 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.926 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.926 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.927 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.928 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.928 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.928 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.928 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.929 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.929 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04286a0d-52dc-4cc0-8c14-5d96855cf353', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20556736-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': 'f59782f43fb77a4ef85e71fff0bf05ece2bbdf90f90960d7a26d742f67ed1790'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 
'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205572bc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': 'ec99e3110d61c4cd918aed58bade52dabdaa87d57696ca054f38e898ccaab1a3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20557e10-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': '9acbd53a2456f715d9dce89498efbe3c355929f72dabe73fe0450071a38da7a5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2055884c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': '72285ce06828400754c2b30c5b1b358187046ede9a857e5cb22847ad55218f0c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 
'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2055c9e2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': '74540af1f3ca813e05c212029cae5e0aed51c2ff3c4bdcdedf2b253b126dfb3c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2055d400-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': '07a44bfa568a74087e52955ea718199fb42d6402f22fd2eeb1d880054616f34f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2055dd7e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': 'fbe6c3f59a441a4fe15b33b75b1e0a134d0c8dbc1d114275dbee6c58d8438e9b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': 
'2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2055e5f8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '7284fcb8448c66e7efd087dda5df246788e600dbf54e3c1ebd370b53aeacb7f5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2055f020-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': 'b9ce7650f177220b91b2b725b814307ad47ea147e4f08c374a5525c476c9aa67'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2055f926-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': '90470fe4264809190868691f32b92dd46a0c702791d72b9dc5600f08f70c0189'}]}, 'timestamp': '2025-12-05 12:08:35.929636', '_unique_id': 'd6f92181b81d4177a2fb968c33478cfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.931 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>]
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.latency volume: 2084425918 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.933 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.934 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.934 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.latency volume: 4916924415 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.934 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.935 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.latency volume: 3430287673 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.935 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.935 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.latency volume: 5304399030 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.935 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58abde05-b4cb-4413-a588-a3e332eed4fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20566cee-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '1369d420eb253d8392cf4c73bb1f9697cb1131b3994c0fea05c81f5fb4615bab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 
'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2056791e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': 'f9050353c1c5447f38d42b8b9b266ab0ecebe0506ca1218d3442c79679c6754c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2084425918, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20568468-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': 'dd24d851a33264abf124d0da7eed6fb19b80236b7646d2b01378c39f23f5dae8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '20568eae-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '3244ca63c622fb4efd93365c7cd3b914d62660748a3e244daaeb7ce875c4571f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4916924415, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 
'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2056c752-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '42b2adafda3796ebf21b481569438a79366f2ac56ddc46048ff1b0c55904f30a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2056d2c4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': 'f0798bfeb17a6b605fed10cbd8e1a5b1ffe7fb9795cc2e6085b332949d9f3e2a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3430287673, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2056dd1e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'eea674d79e02c3c92517a94538cdb469911e9b4cd01abdee14e2a6c90aa10366'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2056e7a0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '24b86b97a654f4245bac81bae5715f4df9704a84feb534e04a2fc1b7dee33dbf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5304399030, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2056f0e2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '0a45b017384bf5d5027f773acfc56649187e128b86acbb16087bb45b4cd628e1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205720bc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '1030c451b7804b01ac3c1efb249357bbd23b339c4b675ac3b143e7a90accede0'}]}, 'timestamp': '2025-12-05 12:08:35.937433', '_unique_id': 'de7bef9a548c43699ded393a4874595d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.940 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.940 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.940 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.942 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.942 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.943 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.943 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.943 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.943 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.bytes.delta volume: 336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '267e36dd-81fc-4788-8704-a6e8bc056d12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '2057b0f4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': '971b4475ebffc5e1c472af3e9cd8d33bbea60e3c3eb93eba8bf425ee937b4335'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '2057bd38-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': 'a664120d8af0734f28f344f2509adae6543b108c94837f7d6c5b9024177f07bc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205805ea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '2142aa7d0503c58b5a91aee9ff022885d13ded73871b39988f363f2830a4e38d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205815d0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '3c2b1f839ea7dfaa45325c3ffd0a1eb4d1ea049d2ed3bc71c354b27b932ad69d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '20581f12-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'ab15cfafa366d8099a2e7232d9cb674db5fbbb28331e08632301cd8013ed785d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-AttachInterfa
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: cesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '2058287c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '77bb4fdd8860f27a1abf3b261c8fec205e540a9cd41e4ad1ffb2c9717786bc17'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 336, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '20583326-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '22535c6c6f5cd684cdd8554d4cd60cbfa85b47c2b0d357031474c61f2fd96650'}]}, 'timestamp': '2025-12-05 12:08:35.944281', '_unique_id': 'a2af0191fe2e4dd49a37d478def0556b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.946 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.946 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.947 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.947 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.requests volume: 328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.947 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.948 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.948 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.requests volume: 301 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.requests volume: 331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0e6b3ef-2e84-484f-a91e-21464dd84cdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2058a6f8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': 'd2d70ad0041fe27eebcf42a64b0ee1b16acc5436c55182091f81e8ee4c61cb26'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 
'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2058b2c4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '9fb1058908573a1a77250c2850d00cd02779bc7f0ef39608db9cf3315ed25920'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 328, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2058bbd4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '509486d564bf737ad45e63554ed5cd16d49d14a0620a494a77a6d01b633ef78b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2058c502-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '9ae2d3ec74e2fb7ffacf13b11d9906c1b19e957081782903346b831054953f77'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 301, 'user_id': 
'6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2058f0a4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '1701c5906a0c49da91a2569597c4bcc4c73b66fb4bed09bb4aa8dfd9ce6e1ac4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_t
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: ype': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2058faea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '7bf52aecf7d0f2c1fc1bae683b2a311e21ff294a4c5d2a448ae4a17fae09495e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20590288-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'eb1ac22815b449aa8f18d861bcfd21b2c7c9bf7ccf0c9fa8c9420516caaf0068'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205909e0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '21c2c74e246b7228e25947699e01373a1fc840524508492d0153e63f78ab6792'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 331, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059135e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '070e19d9813ea5234bd6d760f6810df19a27a3fbee764a0e2827a6e7d946fab9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '20591c0a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '93e717662e00d3b9adf0db352a766db14ad643603230426d90f2be99501cb8dd'}]}, 'timestamp': '2025-12-05 12:08:35.950186', '_unique_id': 'b9448f6c49ce4f87a2d0883a2235e0fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.952 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.952 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.952 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.953 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.bytes volume: 29936128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.953 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.954 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.954 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.bytes volume: 30329344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.bytes volume: 30759424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.bytes volume: 30104064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2dafa48-713c-47d7-bee7-e96133a0d018', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059851e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '35de9dc9562f0e7382b0b6d71a1793af53ee8f848db09627ff817388d93f4a40'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 
'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205991f8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '0e954dfc42a65855e0bff8f460764b91dc1c55422c0d737b3935626725824a62'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29936128, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20599cd4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '332f3d3a1ae17de52ab362b901fb9d3fb9229e6fa318793742cc894c802a2666'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2059a7ba-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': 'c974c083f834b369d57909248254314af6eb2c3fd844c60ec5a9e16cb5dc3db4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30329344, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': 
None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059da3c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': 'd85a3dcb9be9a48daa1165c58d878068dd50751fa4962d1d5fb5d62b100db564'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memo
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: ry_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2059e5cc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '068edc69447c3c99b61675e7330bd60d5c0be5994900c1b32c797ed36e7ce84f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30759424, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059ed9c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '2b2a9aef1dbeda1390f8aabb59aae4f63593e95b79c35a765b915f65bf60c677'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2059f4ea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'a34d97ca2c13ea596343fc6be1f7dc9ab9b49fffc30677eabfc0060b6249b06a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30104064, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059fc56-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': 'c422954a43d226c8c1c1a2e78a055c2f69371900bb07384129f873adb656011c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205a039a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': 'bf0dcc722b4da9ad6a0ad8446911b01987b5001eeff1b823b88a7cf29918f20e'}]}, 'timestamp': '2025-12-05 12:08:35.956138', '_unique_id': '7b201022e817404b9a3994a28a92505e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.958 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.958 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.959 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.960 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.960 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.960 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.960 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.961 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0790796b-5252-4eab-91cb-fd0b4ad48ccf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '205a6704-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'd2cf64aeb8c5c7633ba9547e6340bbaf25def0e409c069314f6479d736e3b64c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '205a7320-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '192258da666b7b1d29133d7a0f6aaa6daab73a1a9c0f45a8a4b7cd08a4580d9d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205aa660-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'a95cfb0768f494fb3aed5dbe0272bd47fd71a7b3525d55eaf84eb8ab8a48a0d8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205ab0ba-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'aabd79c1a173e35cc32a2360cddfcf6fd5365d972296ba4230eb7e2a93dfe5c2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '205aba10-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'c1c4e6e9dd37d69e84480309cba23f12f9150ff7bb8d667ef9def6abce837523'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 35.958330', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '205ac370-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '1724bb5978c7a383b5cdb3e61e73cce15369eb354c22ee372e164e57c6010295'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '205acde8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': 'ef4e98ff93bfbd694cde8ab8279fe9002343e628f01e1afb5c7e0b0a012c44e4'}]}, 'timestamp': '2025-12-05 12:08:35.961331', '_unique_id': '19f8ef7eb173439f9710c538e4c426a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.963 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.963 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.963 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>]
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.964 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.964 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.964 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.965 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.966 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.bytes volume: 1200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.966 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.966 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.966 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.967 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b0ec32b-d5d5-4a0d-9208-e7412f7a909e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '205b4ade-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': '574ada1abe9f51229118763463cbaaaf4076d155adeb445d5f30ea0a8dc1f0fb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '205b5632-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '612dd2e38eb6ffd1ecb689269ffd61510c26f2283529256d3237d23945b6dbb1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1200, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205b9368-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'df09b6e9d306f0124b4101ede5785fc5282812225331af07dc1fe8bceada39ba'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205b9ffc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '1589adabc2ef5bb5ec90ec3a47dd986c42f7ca71c7616a077fba504ca75f658b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '205baa7e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '7813e7de478ff46b4c5cbdc3b40fb1b731e8e1e7ab0066ef7d7f716f36f60cf0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-Atta
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: chInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '205bb69a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '042b05d07848fbf1de813ac97a4b87baf31160b3df51ba801bc5a95d32b4efec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '205bc112-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': 'd4b8aa1c4f3150ca766a6395e61630273acc548732263351e88c7416669a96f9'}]}, 'timestamp': '2025-12-05 12:08:35.967568', '_unique_id': '759b6577b8bd4a0ea287dd271c2599c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.969 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.970 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.970 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.971 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.971 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.971 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.972 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.972 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.972 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'adf4f7b3-9be8-406e-a73f-22afad0cf83c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '205c320a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'df92b9c81b2d9badb310091f08d890b7b0e749bec11069092b5c1348cc6bc0f7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '205c3e62-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': 'f49a18cc7586efe503938f351b10395c8138f918b57cf87fdc6759b8b05597a0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205c670c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '9e252fab699c90c422b937a6c9fcd02739889e0446c985824a810e09c3b183c4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205c71de-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '8347b558746a4dd282e0fbaff7313bcb9563e7f364d1d7ec133899c8aabbfa5c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '205c7d6e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '22a033789af5b904890514ed34a073963ef9e39d600f113818048063929931c6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '205c87d2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'ccf2975fc094577103fc8cc89e1cd2b1d58778e61ba6ca471598a406151f6135'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '205c9254-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '6d40919750b9d2cf2d708d21862354ca76ffcbd7e31ed378cac8ad9656104c46'}]}, 'timestamp': '2025-12-05 12:08:35.972920', '_unique_id': '80c8856510b24661bcd86802c0b0fc15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.975 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.975 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.975 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.975 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.requests volume: 1074 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.976 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.977 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.977 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.requests volume: 1094 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.977 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.977 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.requests volume: 1111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.978 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.978 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.requests volume: 1087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.978 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a8f07f6-13f7-4036-9643-5a9ba7ced4d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205cfc08-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': 'b669f3d2848f8e8d3719fae78ff0d9a2865ab0667241a0c90304f18a56bd6cfe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 
'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d061c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '99911f554e89d012dd1a8307fab2034bb13588e684a253b33cfed396d753761c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1074, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205d0f40-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': 'c864d285e42cc563b5600b165684654a82b1ea53e4b85ba0c0334dc1783cf6e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d19fe-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '997d7f7322e4677b26988a9f7cb117cdb9358c574d6f26ec9c8bd6c627d51ead'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1094, 'user_id': 
'6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205d4adc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '3e14b7c441e3c121da6309a36c5aecf1536e020168da300cbca767c792cd10c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d54a0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '83d166e19e2ccfc7b83ff5f16a6068f082cb7b27660e3326c28d69f7b8604c85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1111, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205d5e28-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'd7f3c55e30f867a065831382445a2037be5c916623922971ddb35277aeb678b8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d6df0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '699e0cf9cec708635e0769c755e5f7a4b6e71ccc7ae288213b0587b0fb0e7699'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1087, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205d793a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': 'ed29f945406ed6e3f24a0eb3b7ec3c2ed4f8d4023acb030d3ab8fa689481e82e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d829a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '6c5bf30981946e5806d2396508ec938fe52b9a20af598012f2061ecea0581b90'}]}, 'timestamp': '2025-12-05 12:08:35.979080', '_unique_id': '2143db660d7b40a0b17a869adf270ee9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.981 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.981 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>]
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.982 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.982 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.982 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.983 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.984 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.984 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.984 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.985 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.985 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.985 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.985 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba73126b-a954-411a-82e8-60b1d010d68e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e0e54-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': '5011df8c6adfd1c420bc0d0969327ff0715110167741fd6d79b06dab2d1c8aee'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 
'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e1ab6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': '741270a0c279cc60dbf919aea9cb0ffaed05d7334c91082199c3fc5a4f56ed87'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e2538-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': 'f08dfae0e5e54b88bf8e9fd9cccaf748bd12620ea14a021c30fe8fd53fef6f4e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e33b6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': 'd992f150d429946fef18d21ad61eb98c68de807f399a4a0007e3099470e04243'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 
'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e68d6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': '87a74484e968a10538e982b8bc675374fb808b61e1b9d63aada9f0d9cb81920e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ep
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: hemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e73d0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': 'c72c8ba2939b708a38e95ef53b1c7ac8f6e121fb4533611cb7ae692a3a60bcac'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e7d44-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '490166e199e878f4f2104aa48cf3ae43792a41669495580db642a023d459be02'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': 
'2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e8654-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '2bfcd6db82821a3d1ccc855b34979fcd759c2fc04df6f7fd5e97c8e0dc417075'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e9004-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': '5c2b1c974cdef080e5a4de4c4be5460b0619cf1e631d14e8c83940c2056bac56'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e9b1c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': 'f66664f5c35e34c257144634e918fcf7706cb3dd6de8075d738b2543fc3c562c'}]}, 'timestamp': '2025-12-05 12:08:35.986251', '_unique_id': '22f74bf530c34cd09c45f8659f2149cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.988 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.988 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.988 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b235a96f-7a12-4bd2-8627-33b128346aa4: ceilometer.compute.pollsters.NoVolumeException
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.989 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/memory.usage volume: 42.27734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.990 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.990 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.990 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/memory.usage volume: 44.1328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.990 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/memory.usage volume: 42.59765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efa74d28-4b72-4c89-940d-56fd02f2da77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.27734375, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'timestamp': '2025-12-05T12:08:35.988879', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '205f1826-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.421296659, 'message_signature': '216d89c1585e5c1aff73768c8f6fbf64fed1ccca34fa6066d8c317c478a1b876'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 
'2e537618-f998-4c4d-8e1e-e9cc79219330', 'timestamp': '2025-12-05T12:08:35.988879', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '205f4bc0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.438771116, 'message_signature': '60ab1233c983f133ba2f4f14c812034982a66cee15c222225f2127743a804c6e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.1328125, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'timestamp': '2025-12-05T12:08:35.988879', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '205f55ac-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.455383268, 'message_signature': '2688011807e02f9dec6d304e80e2446fbd09de606d1149c38f2ff1c1f884641b'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.59765625, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'timestamp': '2025-12-05T12:08:35.988879', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '205f5fd4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.47061859, 'message_signature': '4f17be869f03b2133ffd16c549f569ba0e8f379cfef090b5e31779ad031e179c'}]}, 'timestamp': '2025-12-05 12:08:35.991266', '_unique_id': '0e7943bad0a14df19eee7701771f7fdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.993 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.993 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.994 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.994 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.995 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.995 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.995 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.995 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4c0c452-d841-412e-9c0e-217b9902f0d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '205fbb00-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'd91ffbc6ddb3c08d5d4154548cdc9d13a96d735ae84b0bdd059a58ff1e50e339'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '205fc686-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '90d30b7a924313553fc2c252b4cea96df66130235cb08e501c364a88b9a1c63c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205ff16a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'eeb56b26c46c0332f81297033af7057213c1808966e8395feca3790b1968d2a1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205ffe80-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'ba0c1cee425abbf57f08fd31b53e55955a5c194dc384d1d3400ceb3c56a95061'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '2060090c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'fad65b0f0d71bcdb342bd7f019a0f4d99988d98a4f17f4e96aef39f30551c215'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-AttachInterfa
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: cesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '20601352-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '91da91625f745b556188f54e38fffa9182760ef139ae308e326416dbafbddf0f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '20601db6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '751ab3db8877f86fa692b43e0d0fd5e92be0240b48fe967eed72f79edc009ec5'}]}, 'timestamp': '2025-12-05 12:08:35.996168', '_unique_id': 'cddf595005d34c5694baeb1462c6e04d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.998 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.998 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.998 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.999 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.999 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.999 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.999 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.000 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.000 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c26dd6a-b2c9-42ad-9285-6579de3bd5b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '20607be4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': '7a06ae9dfc0dcb1af5f6e801cc7e26a14274f50ee8abe1cc864951e400c68692'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '206086a2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '3f32cdd01a0e8c3bad1662f1f92153785286100b0464107e4ca67d8c7f2cecad'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '2060aa56-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '2075f4b56a3dbfda0d71173ff0357ac91b1d5fb824eb78f79727c9fba508a7d4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '2060b4c4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '24e0fe6c185ac8eb0181284d1e2131e6f35cb8320407f9109c22a919490a3854'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '2060bfdc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'a416d1207db9b4c0a3c6afef0370c034930f7001c42f47d0c21664618a250a3c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 11, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata':
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]:  {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '2060c9be-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '43665c3fe8afb1c17b5e3756bacae5779c5458614b575fde2c961b138abe4608'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '2060d418-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '05848860ebcb94e989e57091f2b188b0c0ebcc639e24a939e8d0c6223e56f565'}]}, 'timestamp': '2025-12-05 12:08:36.000802', '_unique_id': 'a2a21b98096a4f5d874b033382fe0b33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.002 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.002 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.bytes volume: 1514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.003 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.003 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.bytes volume: 532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.004 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes volume: 4447 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.004 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.004 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes volume: 1346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.004 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.bytes volume: 4531 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4433084-a900-4741-b5d7-47eea4b354a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '2061274c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'fe0b1aa6fa68533275b4e3174802b4f61a923cc0eec1cd76da31f850de5d8fa4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1514, 'user_id': 
'4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '2061328c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '4481ee7984dc8efdb1a24f5ca762f7bed7f1ffe916abf47d8576ca6e176bfbf5'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 532, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '2061519a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '4c48751a996f5842690ad3048be2af1627a010ccf21402d481014076e841e75a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4447, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '20615d20-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'dd7865b7c3700ed2fd68616fb60574b10f4ec8982735251dbbfc86a8c40a4f60'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '2061672a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '3145e2675519b223dc27630afb0e272709b4de1cb5208ee5eb9b987bb96cf0d1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1346, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-Atta
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: chInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '2061713e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'c99a09c46f3943103555977f1cc9d529fed04688226d2c43275f4ecbf914ceee'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4531, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '20617c74-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': 'c43ffa0cbaa4887153978df74d5fdcddb7dc60e4c57dee55bce7366ab0a6c4be'}]}, 'timestamp': '2025-12-05 12:08:36.005135', '_unique_id': '1e724de2ee974b3ca320dcb37a537b9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:08:36 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:08:36 compute-0 nova_compute[187208]: 2025-12-05 12:08:36.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:36 compute-0 nova_compute[187208]: 2025-12-05 12:08:36.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 12:08:36 compute-0 nova_compute[187208]: 2025-12-05 12:08:36.338 187212 DEBUG nova.objects.instance [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:36 compute-0 nova_compute[187208]: 2025-12-05 12:08:36.356 187212 DEBUG nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.015 187212 DEBUG nova.policy [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.082 187212 DEBUG nova.network.neutron [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.097 187212 INFO nova.compute.manager [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Took 1.47 seconds to deallocate network for instance.
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.144 187212 DEBUG nova.compute.manager [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-unplugged-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.145 187212 DEBUG oslo_concurrency.lockutils [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.145 187212 DEBUG oslo_concurrency.lockutils [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.145 187212 DEBUG oslo_concurrency.lockutils [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.145 187212 DEBUG nova.compute.manager [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] No waiting events found dispatching network-vif-unplugged-48b30c48-7858-408b-aeab-df46f6277546 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.146 187212 DEBUG nova.compute.manager [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-unplugged-48b30c48-7858-408b-aeab-df46f6277546 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.147 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.147 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.413 187212 DEBUG nova.compute.provider_tree [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.431 187212 DEBUG nova.scheduler.client.report [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.454 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.493 187212 INFO nova.scheduler.client.report [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Deleted allocations for instance e9f9bf08-7688-4213-91ff-74f2271ec71d
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.578 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.733 187212 DEBUG nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully updated port: 08b15784-5374-4fb3-9f63-82412f709db4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.768 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936502.7674022, 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.769 187212 INFO nova.compute.manager [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] VM Stopped (Lifecycle Event)
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.880 187212 DEBUG nova.compute.manager [None req-4cb947f3-3cf3-4528-88b6-811cf3053d24 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.883 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.883 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.883 187212 DEBUG nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:08:37 compute-0 nova_compute[187208]: 2025-12-05 12:08:37.928 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:38 compute-0 nova_compute[187208]: 2025-12-05 12:08:38.134 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:38 compute-0 nova_compute[187208]: 2025-12-05 12:08:38.135 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:38 compute-0 nova_compute[187208]: 2025-12-05 12:08:38.135 187212 INFO nova.compute.manager [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Shelving
Dec 05 12:08:38 compute-0 nova_compute[187208]: 2025-12-05 12:08:38.156 187212 DEBUG nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:08:38 compute-0 nova_compute[187208]: 2025-12-05 12:08:38.218 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:38 compute-0 nova_compute[187208]: 2025-12-05 12:08:38.247 187212 WARNING nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it
Dec 05 12:08:38 compute-0 nova_compute[187208]: 2025-12-05 12:08:38.247 187212 WARNING nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it
Dec 05 12:08:38 compute-0 nova_compute[187208]: 2025-12-05 12:08:38.247 187212 WARNING nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it
Dec 05 12:08:39 compute-0 podman[229969]: 2025-12-05 12:08:39.217052908 +0000 UTC m=+0.063834443 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.305 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:08:40 compute-0 kernel: tap2e9efd6c-74 (unregistering): left promiscuous mode
Dec 05 12:08:40 compute-0 NetworkManager[55691]: <info>  [1764936520.4103] device (tap2e9efd6c-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00634|binding|INFO|Releasing lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 from this chassis (sb_readonly=0)
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00635|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 down in Southbound
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.420 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00636|binding|INFO|Removing iface tap2e9efd6c-74 ovn-installed in OVS
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.422 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.428 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.429 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.431 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.433 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[834d9d90-b444-4de5-900e-e561485d8934]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.479 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[53c7de55-198e-4eef-9a7a-bff5ab5e1112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.483 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[68eec4d1-8d1a-455d-b455-67716c03913d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000036.scope: Deactivated successfully.
Dec 05 12:08:40 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000036.scope: Consumed 20.652s CPU time.
Dec 05 12:08:40 compute-0 systemd-machined[153543]: Machine qemu-58-instance-00000036 terminated.
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.512 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f10650-8575-47e9-8711-ec7af9ac216d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.531 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ef876a1f-349e-4ce2-ad7f-71212ee58877]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230002, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.548 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4791c766-b010-4514-8b24-dc5e9014286c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230003, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230003, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.550 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.552 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.556 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.557 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.558 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.558 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.558 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:40 compute-0 kernel: tap2e9efd6c-74: entered promiscuous mode
Dec 05 12:08:40 compute-0 NetworkManager[55691]: <info>  [1764936520.6411] manager: (tap2e9efd6c-74): new Tun device (/org/freedesktop/NetworkManager/Devices/256)
Dec 05 12:08:40 compute-0 systemd-udevd[229993]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:08:40 compute-0 kernel: tap2e9efd6c-74 (unregistering): left promiscuous mode
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00637|binding|INFO|Claiming lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 for this chassis.
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00638|binding|INFO|2e9efd6c-740c-405b-b9f0-bd46434070a7: Claiming fa:16:3e:ab:5e:ef 10.100.0.5
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.654 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.655 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.658 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00639|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 ovn-installed in OVS
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00640|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 up in Southbound
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.665 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00641|binding|INFO|Releasing lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 from this chassis (sb_readonly=1)
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.666 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00642|if_status|INFO|Dropped 2 log messages in last 127 seconds (most recently, 127 seconds ago) due to excessive rate
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00643|if_status|INFO|Not setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 down as sb is readonly
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00644|binding|INFO|Removing iface tap2e9efd6c-74 ovn-installed in OVS
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00645|binding|INFO|Releasing lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 from this chassis (sb_readonly=0)
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.670 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_controller[95610]: 2025-12-05T12:08:40Z|00646|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 down in Southbound
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.672 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b04e03cb-6b39-4e1b-be8e-fe97b97bc772]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.678 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.681 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.701 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e52e01-c6e8-4b20-9b83-acc814b03003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.704 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[08618253-02b7-439f-a0b1-73ddc9714ba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.734 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6d1807-61ab-4856-aa80-43d16fb5085d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.755 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2c110524-8b21-464e-b0c4-a9de111a2f3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230028, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.776 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c995eccd-ec0a-4a0d-a60a-633f91d1adbe]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230029, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230029, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.776 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.777 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.777 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.777 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.777 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] No waiting events found dispatching network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 WARNING nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received unexpected event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 for instance with vm_state deleted and task_state None.
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-deleted-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-08b15784-5374-4fb3-9f63-82412f709db4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-08b15784-5374-4fb3-9f63-82412f709db4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.778 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.785 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.785 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.786 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.786 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.787 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.790 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.806 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3f6450fd-1b47-41af-8ccf-2aebbdf5744b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.837 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b10c3be7-6027-4df4-bea0-d50b39f3a5b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.840 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ec4cdf-b4f7-49a7-a3e9-928ea84649ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.874 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c6aeebcf-ab64-45a4-9671-62d260e19938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.892 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c57083f-917d-4b7a-a1f4-aa772b7ed735]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230036, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.911 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4825dded-fc87-4a5f-9724-47b4ab2bff9a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230037, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230037, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.914 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 nova_compute[187208]: 2025-12-05 12:08:40.919 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.920 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.920 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.921 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.921 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.074 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.099 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.100 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.172 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance shutdown successfully after 3 seconds.
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.180 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance destroyed successfully.
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.180 187212 DEBUG nova.objects.instance [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'numa_topology' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.535 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Beginning cold snapshot process
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.695 187212 DEBUG nova.privsep.utils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:08:41 compute-0 nova_compute[187208]: 2025-12-05 12:08:41.696 187212 DEBUG oslo_concurrency.processutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk /var/lib/nova/instances/snapshots/tmpg0u6kc0n/89f598cb738c44249d2624dce0df0c86 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:42 compute-0 nova_compute[187208]: 2025-12-05 12:08:42.063 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:42 compute-0 nova_compute[187208]: 2025-12-05 12:08:42.100 187212 DEBUG oslo_concurrency.processutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk /var/lib/nova/instances/snapshots/tmpg0u6kc0n/89f598cb738c44249d2624dce0df0c86" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:42 compute-0 nova_compute[187208]: 2025-12-05 12:08:42.100 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot extracted, beginning image upload
Dec 05 12:08:42 compute-0 nova_compute[187208]: 2025-12-05 12:08:42.665 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:42 compute-0 kernel: tap11c7fa90-6a (unregistering): left promiscuous mode
Dec 05 12:08:42 compute-0 NetworkManager[55691]: <info>  [1764936522.7469] device (tap11c7fa90-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:08:42 compute-0 ovn_controller[95610]: 2025-12-05T12:08:42Z|00647|binding|INFO|Releasing lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 from this chassis (sb_readonly=0)
Dec 05 12:08:42 compute-0 ovn_controller[95610]: 2025-12-05T12:08:42Z|00648|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 down in Southbound
Dec 05 12:08:42 compute-0 ovn_controller[95610]: 2025-12-05T12:08:42Z|00649|binding|INFO|Removing iface tap11c7fa90-6a ovn-installed in OVS
Dec 05 12:08:42 compute-0 nova_compute[187208]: 2025-12-05 12:08:42.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:42 compute-0 nova_compute[187208]: 2025-12-05 12:08:42.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:42 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000046.scope: Deactivated successfully.
Dec 05 12:08:42 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000046.scope: Consumed 13.788s CPU time.
Dec 05 12:08:42 compute-0 systemd-machined[153543]: Machine qemu-77-instance-00000046 terminated.
Dec 05 12:08:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:42.837 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:ee:e4 10.100.0.2'], port_security=['fa:16:3e:e4:ee:e4 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-034629ef-6cd1-463c-b963-3d0d9c530038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e846fccb774e44f585d8847897bc4229', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a77f7593-d6d1-44fb-8125-66cdfc38709c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4c9d894-0fc3-4aad-a4d5-6bee101a530c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=11c7fa90-6a48-487a-a375-5adf7f41cb90) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:42.838 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 11c7fa90-6a48-487a-a375-5adf7f41cb90 in datapath 034629ef-6cd1-463c-b963-3d0d9c530038 unbound from our chassis
Dec 05 12:08:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:42.840 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 034629ef-6cd1-463c-b963-3d0d9c530038 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:08:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:42.841 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2293cec9-e60a-4a6b-9cf1-8f162a56ca62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:42 compute-0 ovn_controller[95610]: 2025-12-05T12:08:42Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:3b:49 10.100.0.11
Dec 05 12:08:42 compute-0 ovn_controller[95610]: 2025-12-05T12:08:42Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:3b:49 10.100.0.11
Dec 05 12:08:42 compute-0 nova_compute[187208]: 2025-12-05 12:08:42.930 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.321 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance shutdown successfully after 13 seconds.
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.326 187212 INFO nova.virt.libvirt.driver [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance destroyed successfully.
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.327 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.467 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Attempting rescue
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.469 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.473 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.473 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating image(s)
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.474 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.474 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.475 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.475 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.499 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.500 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.510 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.567 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.568 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.604 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.606 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.607 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'migration_context' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.626 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.628 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start _get_guest_xml network_info=[{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "vif_mac": "fa:16:3e:e4:ee:e4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a6987852-063f-405d-a848-6b382694811e', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.629 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'resources' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.665 187212 WARNING nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.678 187212 DEBUG nova.virt.libvirt.host [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.679 187212 DEBUG nova.virt.libvirt.host [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.684 187212 DEBUG nova.virt.libvirt.host [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.685 187212 DEBUG nova.virt.libvirt.host [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.686 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.686 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.686 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.688 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.688 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.688 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.688 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.689 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.707 187212 DEBUG nova.virt.libvirt.vif [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1436335913',display_name='tempest-ServerRescueTestJSONUnderV235-server-1436335913',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1436335913',id=70,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e846fccb774e44f585d8847897bc4229',ramdisk_id='',reservation_id='r-230fx5t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1035500959',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1035500959-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:08:20Z,user_data=None,user_id='6a2cefdbcaae4db3b3ece95c8227d77e',uuid=2e537618-f998-4c4d-8e1e-e9cc79219330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "vif_mac": "fa:16:3e:e4:ee:e4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.707 187212 DEBUG nova.network.os_vif_util [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converting VIF {"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "vif_mac": "fa:16:3e:e4:ee:e4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.708 187212 DEBUG nova.network.os_vif_util [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.709 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.740 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <uuid>2e537618-f998-4c4d-8e1e-e9cc79219330</uuid>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <name>instance-00000046</name>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1436335913</nova:name>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:08:43</nova:creationTime>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:08:43 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:08:43 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:08:43 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:08:43 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:08:43 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:08:43 compute-0 nova_compute[187208]:         <nova:user uuid="6a2cefdbcaae4db3b3ece95c8227d77e">tempest-ServerRescueTestJSONUnderV235-1035500959-project-member</nova:user>
Dec 05 12:08:43 compute-0 nova_compute[187208]:         <nova:project uuid="e846fccb774e44f585d8847897bc4229">tempest-ServerRescueTestJSONUnderV235-1035500959</nova:project>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:08:43 compute-0 nova_compute[187208]:         <nova:port uuid="11c7fa90-6a48-487a-a375-5adf7f41cb90">
Dec 05 12:08:43 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <system>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <entry name="serial">2e537618-f998-4c4d-8e1e-e9cc79219330</entry>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <entry name="uuid">2e537618-f998-4c4d-8e1e-e9cc79219330</entry>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </system>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <os>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   </os>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <features>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   </features>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <target dev="vdb" bus="virtio"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config.rescue"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:e4:ee:e4"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <target dev="tap11c7fa90-6a"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/console.log" append="off"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <video>
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </video>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:08:43 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:08:43 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:08:43 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:08:43 compute-0 nova_compute[187208]: </domain>
Dec 05 12:08:43 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.748 187212 INFO nova.virt.libvirt.driver [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance destroyed successfully.
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.827 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.828 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.828 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.828 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No VIF found with MAC fa:16:3e:e4:ee:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.829 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Using config drive
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.849 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:43 compute-0 nova_compute[187208]: 2025-12-05 12:08:43.908 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'keypairs' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.098 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.098 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.220 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.279 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.281 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.342 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.349 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.391 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating config drive at /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config.rescue
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.398 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphwzmm4kj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.421 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.422 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.505 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.510 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.530 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphwzmm4kj" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.577 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.577 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 kernel: tap11c7fa90-6a: entered promiscuous mode
Dec 05 12:08:44 compute-0 NetworkManager[55691]: <info>  [1764936524.5950] manager: (tap11c7fa90-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Dec 05 12:08:44 compute-0 ovn_controller[95610]: 2025-12-05T12:08:44Z|00650|binding|INFO|Claiming lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 for this chassis.
Dec 05 12:08:44 compute-0 ovn_controller[95610]: 2025-12-05T12:08:44Z|00651|binding|INFO|11c7fa90-6a48-487a-a375-5adf7f41cb90: Claiming fa:16:3e:e4:ee:e4 10.100.0.2
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.599 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:44 compute-0 ovn_controller[95610]: 2025-12-05T12:08:44Z|00652|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 ovn-installed in OVS
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:44 compute-0 systemd-udevd[230119]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.624 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:44 compute-0 systemd-machined[153543]: New machine qemu-79-instance-00000046.
Dec 05 12:08:44 compute-0 NetworkManager[55691]: <info>  [1764936524.6326] device (tap11c7fa90-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:08:44 compute-0 NetworkManager[55691]: <info>  [1764936524.6334] device (tap11c7fa90-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:08:44 compute-0 systemd[1]: Started Virtual Machine qemu-79-instance-00000046.
Dec 05 12:08:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:44.639 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:ee:e4 10.100.0.2'], port_security=['fa:16:3e:e4:ee:e4 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-034629ef-6cd1-463c-b963-3d0d9c530038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e846fccb774e44f585d8847897bc4229', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a77f7593-d6d1-44fb-8125-66cdfc38709c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4c9d894-0fc3-4aad-a4d5-6bee101a530c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=11c7fa90-6a48-487a-a375-5adf7f41cb90) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:44 compute-0 ovn_controller[95610]: 2025-12-05T12:08:44Z|00653|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 up in Southbound
Dec 05 12:08:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:44.640 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 11c7fa90-6a48-487a-a375-5adf7f41cb90 in datapath 034629ef-6cd1-463c-b963-3d0d9c530038 bound to our chassis
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.640 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.641 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:44.642 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 034629ef-6cd1-463c-b963-3d0d9c530038 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:08:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:44.643 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4e60e849-3ad1-4ab6-b53e-b2ae9c101880]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.662 187212 DEBUG nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.700 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.700 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.760 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.769 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.826 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.827 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.897 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.904 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.972 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:44 compute-0 nova_compute[187208]: 2025-12-05 12:08:44.973 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.029 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.238 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.240 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5072MB free_disk=72.9872817993164GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.241 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.241 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.251 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 2e537618-f998-4c4d-8e1e-e9cc79219330 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.253 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936525.250553, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.253 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Resumed (Lifecycle Event)
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.259 187212 DEBUG nova.compute.manager [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:45 compute-0 nova_compute[187208]: 2025-12-05 12:08:45.539 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:46 compute-0 podman[230159]: 2025-12-05 12:08:46.204443017 +0000 UTC m=+0.058894937 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.036 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.037 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.037 187212 DEBUG nova.network.neutron [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port 08b15784-5374-4fb3-9f63-82412f709db4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.041 187212 DEBUG nova.virt.libvirt.vif [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.042 187212 DEBUG nova.network.os_vif_util [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.043 187212 DEBUG nova.network.os_vif_util [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.043 187212 DEBUG os_vif [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.044 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.044 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.044 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.049 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.049 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.050 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b15784-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.051 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap08b15784-53, col_values=(('external_ids', {'iface-id': '08b15784-5374-4fb3-9f63-82412f709db4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:21:db', 'vm-uuid': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:47 compute-0 NetworkManager[55691]: <info>  [1764936527.0536] manager: (tap08b15784-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.055 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.063 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.065 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.066 187212 INFO os_vif [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53')
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.067 187212 DEBUG nova.virt.libvirt.vif [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.067 187212 DEBUG nova.network.os_vif_util [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.068 187212 DEBUG nova.network.os_vif_util [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.070 187212 DEBUG nova.virt.libvirt.guest [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec 05 12:08:47 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:d1:21:db"/>
Dec 05 12:08:47 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:08:47 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:08:47 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:08:47 compute-0 nova_compute[187208]:   <target dev="tap08b15784-53"/>
Dec 05 12:08:47 compute-0 nova_compute[187208]: </interface>
Dec 05 12:08:47 compute-0 nova_compute[187208]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 05 12:08:47 compute-0 kernel: tap08b15784-53: entered promiscuous mode
Dec 05 12:08:47 compute-0 NetworkManager[55691]: <info>  [1764936527.0821] manager: (tap08b15784-53): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Dec 05 12:08:47 compute-0 ovn_controller[95610]: 2025-12-05T12:08:47Z|00654|binding|INFO|Claiming lport 08b15784-5374-4fb3-9f63-82412f709db4 for this chassis.
Dec 05 12:08:47 compute-0 ovn_controller[95610]: 2025-12-05T12:08:47Z|00655|binding|INFO|08b15784-5374-4fb3-9f63-82412f709db4: Claiming fa:16:3e:d1:21:db 10.100.0.14
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.089 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:47 compute-0 NetworkManager[55691]: <info>  [1764936527.1026] device (tap08b15784-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:08:47 compute-0 NetworkManager[55691]: <info>  [1764936527.1041] device (tap08b15784-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:08:47 compute-0 ovn_controller[95610]: 2025-12-05T12:08:47Z|00656|binding|INFO|Setting lport 08b15784-5374-4fb3-9f63-82412f709db4 ovn-installed in OVS
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:47 compute-0 ovn_controller[95610]: 2025-12-05T12:08:47Z|00657|binding|INFO|Setting lport 08b15784-5374-4fb3-9f63-82412f709db4 up in Southbound
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.737 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:21:db 10.100.0.14'], port_security=['fa:16:3e:d1:21:db 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-274660127', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-274660127', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=08b15784-5374-4fb3-9f63-82412f709db4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.738 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 08b15784-5374-4fb3-9f63-82412f709db4 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.740 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.754 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3d438086-9f38-4fce-9c45-85146370195b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.783 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c21076cf-5751-4e8a-94c8-c85aa2cbc623]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.786 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7b54973e-e10c-4438-a847-b44fc03a7e12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.797 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936525.2527544, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.797 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Started (Lifecycle Event)
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.816 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2d44534f-bb66-49c0-b90f-5c0a85fdfd27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.829 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 854e3893-3908-4b4a-b29c-7fb4384e4f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance f1e72d05-87e7-495d-9dbb-1a10b112c69f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 2e537618-f998-4c4d-8e1e-e9cc79219330 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b235a96f-7a12-4bd2-8627-33b128346aa4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.831 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1152MB phys_disk=79GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.832 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2a4702-c894-4a37-8900-9f9448d5765a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 10, 'rx_bytes': 784, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 10, 'rx_bytes': 784, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230201, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.846 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c389e3a-e388-4609-ae74-269697792674]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230202, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230202, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.848 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.899 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.902 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.903 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.903 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.904 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.904 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:08:47 compute-0 nova_compute[187208]: 2025-12-05 12:08:47.932 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.096 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:08:48 compute-0 ovn_controller[95610]: 2025-12-05T12:08:48Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:21:db 10.100.0.14
Dec 05 12:08:48 compute-0 ovn_controller[95610]: 2025-12-05T12:08:48Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:21:db 10.100.0.14
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.790 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:48.792 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:08:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:48.793 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.793 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.794 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:01:99:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.794 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:b8:01:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.794 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:54:f6:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.795 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:d1:21:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.828 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.829 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.834 187212 DEBUG nova.virt.libvirt.guest [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:08:48 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:08:48 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec 05 12:08:48 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:08:48</nova:creationTime>
Dec 05 12:08:48 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:08:48 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:08:48 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:08:48 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:08:48 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:08:48 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec 05 12:08:48 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec 05 12:08:48 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec 05 12:08:48 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     <nova:port uuid="08b15784-5374-4fb3-9f63-82412f709db4">
Dec 05 12:08:48 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:08:48 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:08:48 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:08:48 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:08:48 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:08:48 compute-0 nova_compute[187208]: 2025-12-05 12:08:48.838 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:49 compute-0 nova_compute[187208]: 2025-12-05 12:08:49.221 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:08:49 compute-0 nova_compute[187208]: 2025-12-05 12:08:49.222 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:49 compute-0 nova_compute[187208]: 2025-12-05 12:08:49.232 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-08b15784-5374-4fb3-9f63-82412f709db4" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 13.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:50 compute-0 nova_compute[187208]: 2025-12-05 12:08:50.513 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936515.5109434, e9f9bf08-7688-4213-91ff-74f2271ec71d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:50 compute-0 nova_compute[187208]: 2025-12-05 12:08:50.514 187212 INFO nova.compute.manager [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] VM Stopped (Lifecycle Event)
Dec 05 12:08:50 compute-0 nova_compute[187208]: 2025-12-05 12:08:50.532 187212 DEBUG nova.compute.manager [None req-f26b3fca-2f1d-423e-8432-748c57f477f2 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:50 compute-0 nova_compute[187208]: 2025-12-05 12:08:50.874 187212 DEBUG nova.compute.manager [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:50 compute-0 nova_compute[187208]: 2025-12-05 12:08:50.875 187212 DEBUG oslo_concurrency.lockutils [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:50 compute-0 nova_compute[187208]: 2025-12-05 12:08:50.875 187212 DEBUG oslo_concurrency.lockutils [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:50 compute-0 nova_compute[187208]: 2025-12-05 12:08:50.875 187212 DEBUG oslo_concurrency.lockutils [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:50 compute-0 nova_compute[187208]: 2025-12-05 12:08:50.876 187212 DEBUG nova.compute.manager [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:50 compute-0 nova_compute[187208]: 2025-12-05 12:08:50.876 187212 WARNING nova.compute.manager [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state active and task_state shelving_image_uploading.
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.282 187212 DEBUG nova.network.neutron [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port 08b15784-5374-4fb3-9f63-82412f709db4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.283 187212 DEBUG nova.network.neutron [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.300 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.832 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot image upload complete
Dec 05 12:08:51 compute-0 nova_compute[187208]: 2025-12-05 12:08:51.833 187212 DEBUG nova.compute.manager [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:52 compute-0 nova_compute[187208]: 2025-12-05 12:08:52.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:52 compute-0 nova_compute[187208]: 2025-12-05 12:08:52.098 187212 INFO nova.compute.manager [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Shelve offloading
Dec 05 12:08:52 compute-0 nova_compute[187208]: 2025-12-05 12:08:52.106 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance destroyed successfully.
Dec 05 12:08:52 compute-0 nova_compute[187208]: 2025-12-05 12:08:52.107 187212 DEBUG nova.compute.manager [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:52 compute-0 nova_compute[187208]: 2025-12-05 12:08:52.109 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:08:52 compute-0 nova_compute[187208]: 2025-12-05 12:08:52.110 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:08:52 compute-0 nova_compute[187208]: 2025-12-05 12:08:52.110 187212 DEBUG nova.network.neutron [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:08:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:08:52.794 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:52 compute-0 nova_compute[187208]: 2025-12-05 12:08:52.936 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:53 compute-0 podman[230207]: 2025-12-05 12:08:53.211249949 +0000 UTC m=+0.066897136 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.327 187212 DEBUG nova.network.neutron [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.348 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.980 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.981 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.982 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.982 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.982 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.983 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.983 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.983 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.984 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.984 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.984 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.985 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.985 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.985 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.986 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.986 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.986 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.987 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.987 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.987 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.988 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.988 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.988 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.989 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.989 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.989 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.990 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.990 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.990 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.991 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.991 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.991 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.992 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.992 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.992 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.993 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state rescued and task_state None.
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.993 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.993 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.994 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.994 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.994 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.994 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state rescued and task_state None.
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.995 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.995 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.995 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.996 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.996 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:08:54 compute-0 nova_compute[187208]: 2025-12-05 12:08:54.996 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state rescued and task_state None.
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.698 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936520.6973228, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.699 187212 INFO nova.compute.manager [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Stopped (Lifecycle Event)
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.716 187212 DEBUG nova.compute.manager [None req-5c938c21-26de-4056-811d-29ebb3301ac8 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.719 187212 DEBUG nova.compute.manager [None req-5c938c21-26de-4056-811d-29ebb3301ac8 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.772 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.775 187212 INFO nova.compute.manager [None req-5c938c21-26de-4056-811d-29ebb3301ac8 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (shelving_offloading). Skip.
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.838 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance destroyed successfully.
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.838 187212 DEBUG nova.objects.instance [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'resources' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.852 187212 DEBUG nova.virt.libvirt.vif [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCLIKVsEL0lmma4upWYe8NiCB7ZJxacCmu4vu1RJu3M5/Fu5S7w/HUSIKvvOTrl/9nUJ4pE5tXIAyPQxQDsptmV5i8IinhFeAgIm0GlEBvfbCuuhpWud8F+u8GsIwgaqpQ==',key_name='tempest-keypair-776546213',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member',shelved_at='2025-12-05T12:08:51.833545',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='4d0314d0-2208-4446-8d20-5c2197f0bd9d'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.853 187212 DEBUG nova.network.os_vif_util [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.853 187212 DEBUG nova.network.os_vif_util [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.854 187212 DEBUG os_vif [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.856 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.857 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e9efd6c-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.858 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.862 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.863 187212 INFO os_vif [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74')
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.864 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deleting instance files /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54_del
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.870 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deletion of /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54_del complete
Dec 05 12:08:55 compute-0 nova_compute[187208]: 2025-12-05 12:08:55.964 187212 INFO nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Deleted allocations for instance 24358eea-14fb-4863-a6c4-aadcdb495f54
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.020 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.021 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.057 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.077 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.078 187212 DEBUG nova.compute.provider_tree [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.099 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.179 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.266 187212 DEBUG nova.compute.provider_tree [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.349 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.371 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:56 compute-0 nova_compute[187208]: 2025-12-05 12:08:56.419 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 18.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:08:57 compute-0 nova_compute[187208]: 2025-12-05 12:08:57.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.416 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.416 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.416 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.416 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 WARNING nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state rescued and task_state None.
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 WARNING nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 for instance with vm_state active and task_state None.
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.419 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.419 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.419 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.419 187212 WARNING nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 for instance with vm_state active and task_state None.
Dec 05 12:09:00 compute-0 ovn_controller[95610]: 2025-12-05T12:09:00Z|00658|binding|INFO|Releasing lport 0dffa729-6b55-4e58-afef-f1cdc22c22fb from this chassis (sb_readonly=0)
Dec 05 12:09:00 compute-0 ovn_controller[95610]: 2025-12-05T12:09:00Z|00659|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec 05 12:09:00 compute-0 ovn_controller[95610]: 2025-12-05T12:09:00Z|00660|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.846 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:00 compute-0 nova_compute[187208]: 2025-12-05 12:09:00.859 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:01 compute-0 podman[230236]: 2025-12-05 12:09:01.203316189 +0000 UTC m=+0.052806373 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:09:01 compute-0 podman[230235]: 2025-12-05 12:09:01.209698532 +0000 UTC m=+0.062405998 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, io.openshift.expose-services=, release=1755695350)
Dec 05 12:09:02 compute-0 nova_compute[187208]: 2025-12-05 12:09:02.939 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:03.016 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:03 compute-0 nova_compute[187208]: 2025-12-05 12:09:03.981 187212 DEBUG nova.compute.manager [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:03 compute-0 nova_compute[187208]: 2025-12-05 12:09:03.982 187212 DEBUG nova.compute.manager [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:09:03 compute-0 nova_compute[187208]: 2025-12-05 12:09:03.982 187212 DEBUG oslo_concurrency.lockutils [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:03 compute-0 nova_compute[187208]: 2025-12-05 12:09:03.983 187212 DEBUG oslo_concurrency.lockutils [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:03 compute-0 nova_compute[187208]: 2025-12-05 12:09:03.983 187212 DEBUG nova.network.neutron [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.100 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.101 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.121 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.187 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.188 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.199 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.200 187212 INFO nova.compute.claims [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.407 187212 DEBUG nova.compute.provider_tree [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.428 187212 DEBUG nova.scheduler.client.report [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.452 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.452 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.524 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.525 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.543 187212 INFO nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.569 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.673 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.674 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.674 187212 INFO nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Creating image(s)
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.675 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.675 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.676 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.689 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.747 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.748 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.749 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.761 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.816 187212 DEBUG nova.policy [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f805540d6084f53aa7bd5a66912be58', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1bdbd9c8684c4b9b97e00725e41037eb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.824 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.824 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.861 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.863 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.863 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.920 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.921 187212 DEBUG nova.virt.disk.api [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Checking if we can resize image /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.922 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.980 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-d35fce09-856e-4ebf-b944-0c0953a9492b" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.980 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-d35fce09-856e-4ebf-b944-0c0953a9492b" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.988 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.988 187212 DEBUG nova.virt.disk.api [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Cannot resize image /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.989 187212 DEBUG nova.objects.instance [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lazy-loading 'migration_context' on Instance uuid dbbad270-1e3c-41e1-9173-c1b9df0ab2dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:04 compute-0 nova_compute[187208]: 2025-12-05 12:09:04.998 187212 DEBUG nova.objects.instance [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.000 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.001 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Ensure instance console log exists: /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.001 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.002 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.002 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.018 187212 DEBUG nova.virt.libvirt.vif [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.019 187212 DEBUG nova.network.os_vif_util [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.019 187212 DEBUG nova.network.os_vif_util [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.023 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.025 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.030 187212 DEBUG nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Attempting to detach device tapd35fce09-85 from instance f1e72d05-87e7-495d-9dbb-1a10b112c69f from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.030 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:b8:01:47"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <target dev="tapd35fce09-85"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]: </interface>
Dec 05 12:09:05 compute-0 nova_compute[187208]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.036 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.040 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface>not found in domain: <domain type='kvm' id='73'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <name>instance-00000043</name>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <uuid>f1e72d05-87e7-495d-9dbb-1a10b112c69f</uuid>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:08:48</nova:creationTime>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="08b15784-5374-4fb3-9f63-82412f709db4">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:09:05 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <memory unit='KiB'>131072</memory>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <vcpu placement='static'>1</vcpu>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <resource>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <partition>/machine</partition>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </resource>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <sysinfo type='smbios'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <system>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='manufacturer'>RDO</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='serial'>f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='uuid'>f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='family'>Virtual Machine</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </system>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <os>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <boot dev='hd'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <smbios mode='sysinfo'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </os>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <features>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <vmcoreinfo state='on'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </features>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <vendor>AMD</vendor>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='x2apic'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc-deadline'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='hypervisor'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc_adjust'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='spec-ctrl'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='stibp'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='ssbd'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='cmp_legacy'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='overflow-recov'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='succor'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='ibrs'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='amd-ssbd'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='virt-ssbd'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='lbrv'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='tsc-scale'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='vmcb-clean'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='flushbyasid'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='pause-filter'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='pfthreshold'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='xsaves'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='svm'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='topoext'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='npt'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='nrip-save'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <clock offset='utc'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <timer name='hpet' present='no'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <on_poweroff>destroy</on_poweroff>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <on_reboot>restart</on_reboot>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <on_crash>destroy</on_crash>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <disk type='file' device='disk'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk' index='2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <backingStore type='file' index='3'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:         <format type='raw'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:         <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:         <backingStore/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       </backingStore>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='vda' bus='virtio'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='virtio-disk0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <disk type='file' device='cdrom'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config' index='1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <backingStore/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='sda' bus='sata'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <readonly/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='sata0-0-0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pcie.0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='1' port='0x10'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='2' port='0x11'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='3' port='0x12'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='4' port='0x13'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='5' port='0x14'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.5'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='6' port='0x15'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.6'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='7' port='0x16'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.7'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='8' port='0x17'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.8'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='9' port='0x18'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.9'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='10' port='0x19'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.10'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='11' port='0x1a'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.11'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='12' port='0x1b'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.12'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='13' port='0x1c'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.13'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='14' port='0x1d'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.14'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='15' port='0x1e'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.15'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='16' port='0x1f'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.16'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='17' port='0x20'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.17'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='18' port='0x21'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.18'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='19' port='0x22'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.19'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='20' port='0x23'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.20'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='21' port='0x24'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.21'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='22' port='0x25'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.22'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='23' port='0x26'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.23'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='24' port='0x27'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.24'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='25' port='0x28'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.25'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-pci-bridge'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.26'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='usb'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='sata' index='0'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='ide'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:01:99:b0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='tapf7a6775e-6d'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='net0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:b8:01:47'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='tapd35fce09-85'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='net1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:54:f6:34'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='tapaf04237a-1f'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='net2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:d1:21:db'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='tap08b15784-53'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='net3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <serial type='pty'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <source path='/dev/pts/4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log' append='off'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target type='isa-serial' port='0'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:         <model name='isa-serial'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       </target>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <console type='pty' tty='/dev/pts/4'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <source path='/dev/pts/4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log' append='off'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target type='serial' port='0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </console>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <input type='tablet' bus='usb'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='input0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='usb' bus='0' port='1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </input>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <input type='mouse' bus='ps2'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='input1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </input>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <input type='keyboard' bus='ps2'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='input2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </input>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <graphics type='vnc' port='5904' autoport='yes' listen='::0'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <listen type='address' address='::0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </graphics>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <audio id='1' type='none'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <video>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='video0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </video>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <watchdog model='itco' action='reset'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='watchdog0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </watchdog>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <memballoon model='virtio'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <stats period='10'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='balloon0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <rng model='virtio'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <backend model='random'>/dev/urandom</backend>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='rng0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <label>system_u:system_r:svirt_t:s0:c138,c973</label>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c138,c973</imagelabel>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <label>+107:+107</label>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <imagelabel>+107:+107</imagelabel>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:09:05 compute-0 nova_compute[187208]: </domain>
Dec 05 12:09:05 compute-0 nova_compute[187208]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.041 187212 INFO nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tapd35fce09-85 from instance f1e72d05-87e7-495d-9dbb-1a10b112c69f from the persistent domain config.
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.041 187212 DEBUG nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] (1/8): Attempting to detach device tapd35fce09-85 with device alias net1 from instance f1e72d05-87e7-495d-9dbb-1a10b112c69f from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.041 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:b8:01:47"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <target dev="tapd35fce09-85"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]: </interface>
Dec 05 12:09:05 compute-0 nova_compute[187208]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 12:09:05 compute-0 kernel: tapd35fce09-85 (unregistering): left promiscuous mode
Dec 05 12:09:05 compute-0 NetworkManager[55691]: <info>  [1764936545.1032] device (tapd35fce09-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:09:05 compute-0 ovn_controller[95610]: 2025-12-05T12:09:05Z|00661|binding|INFO|Releasing lport d35fce09-856e-4ebf-b944-0c0953a9492b from this chassis (sb_readonly=0)
Dec 05 12:09:05 compute-0 ovn_controller[95610]: 2025-12-05T12:09:05Z|00662|binding|INFO|Setting lport d35fce09-856e-4ebf-b944-0c0953a9492b down in Southbound
Dec 05 12:09:05 compute-0 ovn_controller[95610]: 2025-12-05T12:09:05Z|00663|binding|INFO|Removing iface tapd35fce09-85 ovn-installed in OVS
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.135 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.145 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Received event <DeviceRemovedEvent: 1764936545.1446915, f1e72d05-87e7-495d-9dbb-1a10b112c69f => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.147 187212 DEBUG nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Start waiting for the detach event from libvirt for device tapd35fce09-85 with device alias net1 for instance f1e72d05-87e7-495d-9dbb-1a10b112c69f _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.147 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.151 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface>not found in domain: <domain type='kvm' id='73'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <name>instance-00000043</name>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <uuid>f1e72d05-87e7-495d-9dbb-1a10b112c69f</uuid>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:08:48</nova:creationTime>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="08b15784-5374-4fb3-9f63-82412f709db4">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:09:05 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <memory unit='KiB'>131072</memory>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <vcpu placement='static'>1</vcpu>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <resource>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <partition>/machine</partition>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </resource>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <sysinfo type='smbios'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <system>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='manufacturer'>RDO</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='serial'>f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='uuid'>f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <entry name='family'>Virtual Machine</entry>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </system>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <os>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <boot dev='hd'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <smbios mode='sysinfo'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </os>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <features>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <vmcoreinfo state='on'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </features>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <vendor>AMD</vendor>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='x2apic'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc-deadline'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='hypervisor'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc_adjust'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='spec-ctrl'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='stibp'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='ssbd'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='cmp_legacy'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='overflow-recov'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='succor'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='ibrs'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='amd-ssbd'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='virt-ssbd'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='lbrv'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='tsc-scale'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='vmcb-clean'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='flushbyasid'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='pause-filter'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='pfthreshold'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='xsaves'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='svm'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='require' name='topoext'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='npt'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <feature policy='disable' name='nrip-save'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <clock offset='utc'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <timer name='hpet' present='no'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <on_poweroff>destroy</on_poweroff>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <on_reboot>restart</on_reboot>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <on_crash>destroy</on_crash>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <disk type='file' device='disk'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk' index='2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <backingStore type='file' index='3'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:         <format type='raw'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:         <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:         <backingStore/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       </backingStore>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='vda' bus='virtio'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='virtio-disk0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <disk type='file' device='cdrom'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config' index='1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <backingStore/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='sda' bus='sata'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <readonly/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='sata0-0-0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pcie.0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='1' port='0x10'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='2' port='0x11'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='3' port='0x12'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='4' port='0x13'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='5' port='0x14'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.5'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='6' port='0x15'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.6'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='7' port='0x16'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.7'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='8' port='0x17'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.8'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='9' port='0x18'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.9'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='10' port='0x19'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.10'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='11' port='0x1a'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.11'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='12' port='0x1b'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.12'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='13' port='0x1c'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.13'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='14' port='0x1d'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.14'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='15' port='0x1e'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.15'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='16' port='0x1f'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.16'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='17' port='0x20'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.17'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='18' port='0x21'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.18'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='19' port='0x22'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.19'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='20' port='0x23'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.20'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='21' port='0x24'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.21'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='22' port='0x25'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.22'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='23' port='0x26'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.23'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='24' port='0x27'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.24'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target chassis='25' port='0x28'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.25'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model name='pcie-pci-bridge'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='pci.26'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='usb'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <controller type='sata' index='0'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='ide'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:01:99:b0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='tapf7a6775e-6d'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='net0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:54:f6:34'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='tapaf04237a-1f'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='net2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:d1:21:db'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target dev='tap08b15784-53'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='net3'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <serial type='pty'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <source path='/dev/pts/4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log' append='off'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target type='isa-serial' port='0'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:         <model name='isa-serial'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       </target>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <console type='pty' tty='/dev/pts/4'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <source path='/dev/pts/4'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log' append='off'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <target type='serial' port='0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </console>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <input type='tablet' bus='usb'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='input0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='usb' bus='0' port='1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </input>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <input type='mouse' bus='ps2'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='input1'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </input>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <input type='keyboard' bus='ps2'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='input2'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </input>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <graphics type='vnc' port='5904' autoport='yes' listen='::0'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <listen type='address' address='::0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </graphics>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <audio id='1' type='none'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <video>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='video0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </video>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <watchdog model='itco' action='reset'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='watchdog0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </watchdog>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <memballoon model='virtio'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <stats period='10'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='balloon0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <rng model='virtio'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <backend model='random'>/dev/urandom</backend>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <alias name='rng0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <label>system_u:system_r:svirt_t:s0:c138,c973</label>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c138,c973</imagelabel>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <label>+107:+107</label>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <imagelabel>+107:+107</imagelabel>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:09:05 compute-0 nova_compute[187208]: </domain>
Dec 05 12:09:05 compute-0 nova_compute[187208]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.151 187212 INFO nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tapd35fce09-85 from instance f1e72d05-87e7-495d-9dbb-1a10b112c69f from the live domain config.
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.152 187212 DEBUG nova.virt.libvirt.vif [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.152 187212 DEBUG nova.network.os_vif_util [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.153 187212 DEBUG nova.network.os_vif_util [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.153 187212 DEBUG os_vif [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.155 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.155 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd35fce09-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.156 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.161 187212 INFO os_vif [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85')
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.162 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:09:05</nova:creationTime>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     <nova:port uuid="08b15784-5374-4fb3-9f63-82412f709db4">
Dec 05 12:09:05 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:09:05 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:09:05 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:09:05 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:09:05 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.216 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:01:47 10.100.0.3'], port_security=['fa:16:3e:b8:01:47 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d35fce09-856e-4ebf-b944-0c0953a9492b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.217 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d35fce09-856e-4ebf-b944-0c0953a9492b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.220 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.235 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d869dfb1-086b-4ab7-8725-0b60458d6585]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.263 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[29df3787-cf94-4b95-b554-e4ec87fd7c68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.268 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[905dcdd7-aa67-483e-8f71-91169d3cf6d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.303 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0570f9ea-1ebf-4865-8be4-6cf6d8a2d058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.319 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[87fb661e-e645-48db-ae94-b8494bf7284a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 12, 'rx_bytes': 868, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 12, 'rx_bytes': 868, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230300, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.341 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f08b6aa6-0197-44c3-8eca-0054f1988f1a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230301, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230301, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.343 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:05 compute-0 nova_compute[187208]: 2025-12-05 12:09:05.346 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.346 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.346 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.347 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.347 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:06 compute-0 podman[230302]: 2025-12-05 12:09:06.201292387 +0000 UTC m=+0.053076801 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:09:06 compute-0 podman[230303]: 2025-12-05 12:09:06.245133012 +0000 UTC m=+0.084528851 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Dec 05 12:09:06 compute-0 nova_compute[187208]: 2025-12-05 12:09:06.725 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Successfully created port: cf99cdda-7071-4c18-8462-3a556234d81d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:09:07 compute-0 nova_compute[187208]: 2025-12-05 12:09:07.981 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.011 187212 DEBUG nova.network.neutron [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.012 187212 DEBUG nova.network.neutron [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.120 187212 DEBUG oslo_concurrency.lockutils [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.380 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.380 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.380 187212 DEBUG nova.network.neutron [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.411 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.412 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.412 187212 INFO nova.compute.manager [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Unshelving
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.494 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Successfully updated port: cf99cdda-7071-4c18-8462-3a556234d81d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.512 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.513 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquired lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.513 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.522 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.523 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.527 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_requests' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.551 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'numa_topology' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.570 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.570 187212 INFO nova.compute.claims [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.676 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.787 187212 DEBUG nova.compute.provider_tree [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.802 187212 DEBUG nova.scheduler.client.report [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:08 compute-0 nova_compute[187208]: 2025-12-05 12:09:08.822 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.438 187212 INFO nova.network.neutron [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating port 2e9efd6c-740c-405b-b9f0-bd46434070a7 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.483 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Updating instance_info_cache with network_info: [{"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.517 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Releasing lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.518 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance network_info: |[{"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.521 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start _get_guest_xml network_info=[{"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.527 187212 WARNING nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.531 187212 DEBUG nova.virt.libvirt.host [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.531 187212 DEBUG nova.virt.libvirt.host [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.535 187212 DEBUG nova.virt.libvirt.host [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.536 187212 DEBUG nova.virt.libvirt.host [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.537 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.538 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.539 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.539 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.539 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.539 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.541 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.545 187212 DEBUG nova.virt.libvirt.vif [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1390207148',display_name='tempest-ServerMetadataTestJSON-server-1390207148',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1390207148',id=72,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bdbd9c8684c4b9b97e00725e41037eb',ramdisk_id='',reservation_id='r-8eo91pmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-355236921',owner_user_name='tempest-ServerMetadataTestJSON-355236921-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:04Z,user_data=None,user_id='4f805540d6084f53aa7bd5a66912be58',uuid=dbbad270-1e3c-41e1-9173-c1b9df0ab2dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.546 187212 DEBUG nova.network.os_vif_util [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converting VIF {"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.547 187212 DEBUG nova.network.os_vif_util [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.547 187212 DEBUG nova.objects.instance [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lazy-loading 'pci_devices' on Instance uuid dbbad270-1e3c-41e1-9173-c1b9df0ab2dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.563 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <uuid>dbbad270-1e3c-41e1-9173-c1b9df0ab2dd</uuid>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <name>instance-00000048</name>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerMetadataTestJSON-server-1390207148</nova:name>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:09:09</nova:creationTime>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:09:09 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:09:09 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:09:09 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:09:09 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:09:09 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:09:09 compute-0 nova_compute[187208]:         <nova:user uuid="4f805540d6084f53aa7bd5a66912be58">tempest-ServerMetadataTestJSON-355236921-project-member</nova:user>
Dec 05 12:09:09 compute-0 nova_compute[187208]:         <nova:project uuid="1bdbd9c8684c4b9b97e00725e41037eb">tempest-ServerMetadataTestJSON-355236921</nova:project>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:09:09 compute-0 nova_compute[187208]:         <nova:port uuid="cf99cdda-7071-4c18-8462-3a556234d81d">
Dec 05 12:09:09 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <system>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <entry name="serial">dbbad270-1e3c-41e1-9173-c1b9df0ab2dd</entry>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <entry name="uuid">dbbad270-1e3c-41e1-9173-c1b9df0ab2dd</entry>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     </system>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <os>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   </os>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <features>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   </features>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.config"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:60:68:ad"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <target dev="tapcf99cdda-70"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/console.log" append="off"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <video>
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     </video>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:09:09 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:09:09 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:09:09 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:09:09 compute-0 nova_compute[187208]: </domain>
Dec 05 12:09:09 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.564 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Preparing to wait for external event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.564 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.565 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.565 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.566 187212 DEBUG nova.virt.libvirt.vif [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1390207148',display_name='tempest-ServerMetadataTestJSON-server-1390207148',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1390207148',id=72,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bdbd9c8684c4b9b97e00725e41037eb',ramdisk_id='',reservation_id='r-8eo91pmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-355236921',owner_user_name='tempest-ServerMetadataTestJSON-355236921-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:04Z,user_data=None,user_id='4f805540d6084f53aa7bd5a66912be58',uuid=dbbad270-1e3c-41e1-9173-c1b9df0ab2dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.566 187212 DEBUG nova.network.os_vif_util [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converting VIF {"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.567 187212 DEBUG nova.network.os_vif_util [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.567 187212 DEBUG os_vif [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.567 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.568 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.568 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.573 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.574 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcf99cdda-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.574 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcf99cdda-70, col_values=(('external_ids', {'iface-id': 'cf99cdda-7071-4c18-8462-3a556234d81d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:68:ad', 'vm-uuid': 'dbbad270-1e3c-41e1-9173-c1b9df0ab2dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:09 compute-0 NetworkManager[55691]: <info>  [1764936549.5774] manager: (tapcf99cdda-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.582 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.584 187212 INFO os_vif [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70')
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.659 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.659 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.660 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] No VIF found with MAC fa:16:3e:60:68:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:09:09 compute-0 nova_compute[187208]: 2025-12-05 12:09:09.660 187212 INFO nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Using config drive
Dec 05 12:09:09 compute-0 podman[230355]: 2025-12-05 12:09:09.690161756 +0000 UTC m=+0.066311169 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 12:09:11 compute-0 nova_compute[187208]: 2025-12-05 12:09:11.474 187212 INFO nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Creating config drive at /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.config
Dec 05 12:09:11 compute-0 nova_compute[187208]: 2025-12-05 12:09:11.479 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5l9s9dwu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:11 compute-0 nova_compute[187208]: 2025-12-05 12:09:11.608 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5l9s9dwu" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:11 compute-0 kernel: tapcf99cdda-70: entered promiscuous mode
Dec 05 12:09:11 compute-0 NetworkManager[55691]: <info>  [1764936551.6797] manager: (tapcf99cdda-70): new Tun device (/org/freedesktop/NetworkManager/Devices/261)
Dec 05 12:09:11 compute-0 ovn_controller[95610]: 2025-12-05T12:09:11Z|00664|binding|INFO|Claiming lport cf99cdda-7071-4c18-8462-3a556234d81d for this chassis.
Dec 05 12:09:11 compute-0 ovn_controller[95610]: 2025-12-05T12:09:11Z|00665|binding|INFO|cf99cdda-7071-4c18-8462-3a556234d81d: Claiming fa:16:3e:60:68:ad 10.100.0.4
Dec 05 12:09:11 compute-0 nova_compute[187208]: 2025-12-05 12:09:11.680 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.688 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:68:ad 10.100.0.4'], port_security=['fa:16:3e:60:68:ad 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dbbad270-1e3c-41e1-9173-c1b9df0ab2dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bdbd9c8684c4b9b97e00725e41037eb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d91f504-323f-40f6-96ee-8e841aa785bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd596033-693a-40ca-949c-841d866181bd, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=cf99cdda-7071-4c18-8462-3a556234d81d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.689 104471 INFO neutron.agent.ovn.metadata.agent [-] Port cf99cdda-7071-4c18-8462-3a556234d81d in datapath d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 bound to our chassis
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.691 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22
Dec 05 12:09:11 compute-0 nova_compute[187208]: 2025-12-05 12:09:11.695 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:11 compute-0 ovn_controller[95610]: 2025-12-05T12:09:11Z|00666|binding|INFO|Setting lport cf99cdda-7071-4c18-8462-3a556234d81d ovn-installed in OVS
Dec 05 12:09:11 compute-0 ovn_controller[95610]: 2025-12-05T12:09:11Z|00667|binding|INFO|Setting lport cf99cdda-7071-4c18-8462-3a556234d81d up in Southbound
Dec 05 12:09:11 compute-0 nova_compute[187208]: 2025-12-05 12:09:11.699 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.702 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c9300f9f-6a5b-4543-b835-c2fc98d5e57a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.703 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd5794fbb-c1 in ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.705 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd5794fbb-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.705 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca3547f-8479-48e4-a53f-9c7db733cbdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 systemd-udevd[230394]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.707 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f207c841-7baf-4e47-b436-0d412d33d5c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 systemd-machined[153543]: New machine qemu-80-instance-00000048.
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.721 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0d16414f-662d-42cc-8f3e-202703c9c6c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 NetworkManager[55691]: <info>  [1764936551.7257] device (tapcf99cdda-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:09:11 compute-0 NetworkManager[55691]: <info>  [1764936551.7268] device (tapcf99cdda-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:09:11 compute-0 systemd[1]: Started Virtual Machine qemu-80-instance-00000048.
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.740 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3171669-0cd5-4223-8d8a-b14d37242f4d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.779 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e316cc-da70-4c16-8f8f-6f0123c15695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 NetworkManager[55691]: <info>  [1764936551.7872] manager: (tapd5794fbb-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/262)
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.786 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[131ee832-c341-4bfc-8177-27e41ab163e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.817 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9725e671-52ad-4f71-abe2-1d8069d174dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.821 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[297396c9-8ee8-4398-8338-0e8387ec7843]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 NetworkManager[55691]: <info>  [1764936551.8441] device (tapd5794fbb-c0): carrier: link connected
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.848 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e53a3cda-9c50-4d9b-877a-d19cde7c566d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.864 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9547a7da-6803-402a-996e-9d2008ed683e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5794fbb-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:2f:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393240, 'reachable_time': 34098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230427, 'error': None, 'target': 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.879 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a84495b8-65d7-472f-8248-9ba2b2cf53f6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:2f2a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 393240, 'tstamp': 393240}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230428, 'error': None, 'target': 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.897 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[42fba68c-9267-4c51-a149-f70e62462ad7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5794fbb-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:2f:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393240, 'reachable_time': 34098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230429, 'error': None, 'target': 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.930 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd2441e-a421-4fa1-bb24-60aeb06ce054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.995 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce8487c-f9a7-4d6b-ab67-f7d6c07327f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.997 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5794fbb-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.997 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.997 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5794fbb-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:12 compute-0 kernel: tapd5794fbb-c0: entered promiscuous mode
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:11.999 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:12 compute-0 NetworkManager[55691]: <info>  [1764936552.0011] manager: (tapd5794fbb-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.002 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd5794fbb-c0, col_values=(('external_ids', {'iface-id': 'da9adcd8-f2a5-4ff7-962a-717d700ad7b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:12 compute-0 ovn_controller[95610]: 2025-12-05T12:09:12Z|00668|binding|INFO|Releasing lport da9adcd8-f2a5-4ff7-962a-717d700ad7b5 from this chassis (sb_readonly=0)
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.003 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.017 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.018 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f958c892-c100-4fde-8caa-f87eb2178987]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.018 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22.pid.haproxy
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:09:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.019 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'env', 'PROCESS_TAG=haproxy-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.302 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936552.3021243, dbbad270-1e3c-41e1-9173-c1b9df0ab2dd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.303 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] VM Started (Lifecycle Event)
Dec 05 12:09:12 compute-0 podman[230467]: 2025-12-05 12:09:12.460873714 +0000 UTC m=+0.069047198 container create 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:09:12 compute-0 systemd[1]: Started libpod-conmon-34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85.scope.
Dec 05 12:09:12 compute-0 podman[230467]: 2025-12-05 12:09:12.41917767 +0000 UTC m=+0.027351194 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.521 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.527 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936552.3024373, dbbad270-1e3c-41e1-9173-c1b9df0ab2dd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.527 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] VM Paused (Lifecycle Event)
Dec 05 12:09:12 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:09:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58192d43ddea201cbf99dd4a079d9f14871a3e9699282f5a030bf56f666b8ee1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:09:12 compute-0 podman[230467]: 2025-12-05 12:09:12.566874089 +0000 UTC m=+0.175047593 container init 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:09:12 compute-0 podman[230467]: 2025-12-05 12:09:12.574040834 +0000 UTC m=+0.182214318 container start 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:09:12 compute-0 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [NOTICE]   (230487) : New worker (230489) forked
Dec 05 12:09:12 compute-0 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [NOTICE]   (230487) : Loading success.
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.666 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.692 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.692 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.692 187212 DEBUG nova.network.neutron [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.781 187212 INFO nova.network.neutron [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Port d35fce09-856e-4ebf-b944-0c0953a9492b from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.809 187212 DEBUG nova.compute.manager [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.809 187212 DEBUG nova.compute.manager [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.809 187212 DEBUG oslo_concurrency.lockutils [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.809 187212 DEBUG oslo_concurrency.lockutils [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.810 187212 DEBUG nova.network.neutron [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.835 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.941 187212 DEBUG nova.compute.manager [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-changed-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.941 187212 DEBUG nova.compute.manager [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Refreshing instance network info cache due to event network-changed-cf99cdda-7071-4c18-8462-3a556234d81d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.942 187212 DEBUG oslo_concurrency.lockutils [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.942 187212 DEBUG oslo_concurrency.lockutils [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.942 187212 DEBUG nova.network.neutron [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Refreshing network info cache for port cf99cdda-7071-4c18-8462-3a556234d81d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:09:12 compute-0 nova_compute[187208]: 2025-12-05 12:09:12.983 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:14 compute-0 nova_compute[187208]: 2025-12-05 12:09:14.215 187212 DEBUG nova.network.neutron [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:09:14 compute-0 nova_compute[187208]: 2025-12-05 12:09:14.216 187212 DEBUG nova.network.neutron [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:14 compute-0 nova_compute[187208]: 2025-12-05 12:09:14.235 187212 DEBUG oslo_concurrency.lockutils [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.420 104471 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 142f48b6-9a20-4cd8-b984-7849deca313b with type ""
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.422 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:21:db 10.100.0.14'], port_security=['fa:16:3e:d1:21:db 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-274660127', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-274660127', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=08b15784-5374-4fb3-9f63-82412f709db4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.424 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 08b15784-5374-4fb3-9f63-82412f709db4 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.426 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:09:14 compute-0 ovn_controller[95610]: 2025-12-05T12:09:14Z|00669|binding|INFO|Removing iface tap08b15784-53 ovn-installed in OVS
Dec 05 12:09:14 compute-0 ovn_controller[95610]: 2025-12-05T12:09:14Z|00670|binding|INFO|Removing lport 08b15784-5374-4fb3-9f63-82412f709db4 ovn-installed in OVS
Dec 05 12:09:14 compute-0 nova_compute[187208]: 2025-12-05 12:09:14.438 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.442 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92a6b6a8-87a3-4e69-bb8d-0eb474f23746]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:14 compute-0 nova_compute[187208]: 2025-12-05 12:09:14.449 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.473 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[617e49b8-a5ed-4a06-b8a6-d0acf1f77a40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.479 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd1c8d4-183c-4b66-b8f3-d01b51b8ed03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.513 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e5259931-f2be-47ab-8f9d-9b3b1db9718b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.534 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1f15a823-8b25-4d21-96b6-9b8865eb4522]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 14, 'rx_bytes': 868, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 14, 'rx_bytes': 868, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230503, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.553 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[71e0cf1f-20b9-45c0-a603-ce68116fced2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230504, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230504, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.555 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:14 compute-0 nova_compute[187208]: 2025-12-05 12:09:14.557 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:14 compute-0 nova_compute[187208]: 2025-12-05 12:09:14.559 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.559 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.560 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.560 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.561 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:14 compute-0 nova_compute[187208]: 2025-12-05 12:09:14.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.220 187212 DEBUG nova.network.neutron [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.407 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.409 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.409 187212 INFO nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating image(s)
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.410 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.410 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.411 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.411 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.415 187212 DEBUG nova.network.neutron [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Updated VIF entry in instance network info cache for port cf99cdda-7071-4c18-8462-3a556234d81d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.416 187212 DEBUG nova.network.neutron [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Updating instance_info_cache with network_info: [{"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.455 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "4e67c74a736d89d49bae230086f8944c0448c13d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.456 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "4e67c74a736d89d49bae230086f8944c0448c13d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:16 compute-0 nova_compute[187208]: 2025-12-05 12:09:16.459 187212 DEBUG oslo_concurrency.lockutils [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:17 compute-0 podman[230505]: 2025-12-05 12:09:17.221354331 +0000 UTC m=+0.065086434 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:09:17 compute-0 nova_compute[187208]: 2025-12-05 12:09:17.985 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.341 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.402 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.403 187212 DEBUG nova.virt.images [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] 4d0314d0-2208-4446-8d20-5c2197f0bd9d was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.405 187212 DEBUG nova.privsep.utils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.405 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.part /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:18 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 12:09:18 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.616 187212 DEBUG nova.network.neutron [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.741 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.part /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.converted" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.751 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.812 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.converted --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.814 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "4e67c74a736d89d49bae230086f8944c0448c13d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.833 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.840 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.907 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.908 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "4e67c74a736d89d49bae230086f8944c0448c13d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.909 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "4e67c74a736d89d49bae230086f8944c0448c13d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.925 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.964 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-d35fce09-856e-4ebf-b944-0c0953a9492b" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 13.983s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.993 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:18 compute-0 nova_compute[187208]: 2025-12-05 12:09:18.993 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d,backing_fmt=raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.056 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d,backing_fmt=raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk 1073741824" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.057 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "4e67c74a736d89d49bae230086f8944c0448c13d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.058 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.116 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.118 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.143 187212 INFO nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Rebasing disk image.
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.144 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.201 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.202 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.431 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.431 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.432 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.432 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.432 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.434 187212 INFO nova.compute.manager [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Terminating instance
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.436 187212 DEBUG nova.compute.manager [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:09:19 compute-0 kernel: tapf7a6775e-6d (unregistering): left promiscuous mode
Dec 05 12:09:19 compute-0 NetworkManager[55691]: <info>  [1764936559.4793] device (tapf7a6775e-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 kernel: tapaf04237a-1f (unregistering): left promiscuous mode
Dec 05 12:09:19 compute-0 ovn_controller[95610]: 2025-12-05T12:09:19Z|00671|binding|INFO|Releasing lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 from this chassis (sb_readonly=0)
Dec 05 12:09:19 compute-0 ovn_controller[95610]: 2025-12-05T12:09:19Z|00672|binding|INFO|Setting lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 down in Southbound
Dec 05 12:09:19 compute-0 ovn_controller[95610]: 2025-12-05T12:09:19Z|00673|binding|INFO|Removing iface tapf7a6775e-6d ovn-installed in OVS
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.502 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-unplugged-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.503 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.503 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:19 compute-0 NetworkManager[55691]: <info>  [1764936559.5041] device (tapaf04237a-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.504 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.504 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-unplugged-d35fce09-856e-4ebf-b944-0c0953a9492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.504 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-unplugged-d35fce09-856e-4ebf-b944-0c0953a9492b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.504 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 WARNING nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b for instance with vm_state active and task_state deleting.
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.506 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-deleted-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.506 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.506 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing instance network info cache due to event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.506 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.507 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.507 187212 DEBUG nova.network.neutron [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.508 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.514 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:99:b0 10.100.0.7'], port_security=['fa:16:3e:01:99:b0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '83c79c65-073e-4860-a990-92e9abafc0bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f7a6775e-6d9c-48e1-91d7-829a6f5f3742) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.515 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f7a6775e-6d9c-48e1-91d7-829a6f5f3742 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.518 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:09:19 compute-0 ovn_controller[95610]: 2025-12-05T12:09:19Z|00674|binding|INFO|Releasing lport af04237a-1f79-4f68-a18e-1ceb4911605b from this chassis (sb_readonly=0)
Dec 05 12:09:19 compute-0 ovn_controller[95610]: 2025-12-05T12:09:19Z|00675|binding|INFO|Setting lport af04237a-1f79-4f68-a18e-1ceb4911605b down in Southbound
Dec 05 12:09:19 compute-0 ovn_controller[95610]: 2025-12-05T12:09:19Z|00676|binding|INFO|Removing iface tapaf04237a-1f ovn-installed in OVS
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.538 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:f6:34 10.100.0.10'], port_security=['fa:16:3e:54:f6:34 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=af04237a-1f79-4f68-a18e-1ceb4911605b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.537 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee47265-f580-4cd2-abf7-d97d67d870c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.530 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.544 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 kernel: tap08b15784-53 (unregistering): left promiscuous mode
Dec 05 12:09:19 compute-0 NetworkManager[55691]: <info>  [1764936559.5517] device (tap08b15784-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.573 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.575 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[30fde81e-7760-47e5-8e97-93a9970aa448]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.578 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[10ae17e4-043e-4b0e-8110-c5556dcd5ba2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:19 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000043.scope: Deactivated successfully.
Dec 05 12:09:19 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000043.scope: Consumed 16.764s CPU time.
Dec 05 12:09:19 compute-0 systemd-machined[153543]: Machine qemu-73-instance-00000043 terminated.
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.604 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b8bfbf65-b3ae-42eb-8a2e-91392e665dc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.621 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8a25b2ee-f1a1-4014-864b-8c8b22f6c55e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 16, 'rx_bytes': 868, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 16, 'rx_bytes': 868, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230586, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.636 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1c37f4-c312-4340-bc03-419089ce5079]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230587, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230587, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.637 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.639 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.647 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.648 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.648 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.649 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.649 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.650 104471 INFO neutron.agent.ovn.metadata.agent [-] Port af04237a-1f79-4f68-a18e-1ceb4911605b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.652 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.653 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d801f682-2a78-4d4c-8b7e-ca2378352244]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.654 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace which is not needed anymore
Dec 05 12:09:19 compute-0 NetworkManager[55691]: <info>  [1764936559.6698] manager: (tapaf04237a-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Dec 05 12:09:19 compute-0 NetworkManager[55691]: <info>  [1764936559.6828] manager: (tap08b15784-53): new Tun device (/org/freedesktop/NetworkManager/Devices/265)
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.722 187212 INFO nova.virt.libvirt.driver [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance destroyed successfully.
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.723 187212 DEBUG nova.objects.instance [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'resources' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.739 187212 DEBUG nova.virt.libvirt.vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.740 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.741 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.741 187212 DEBUG os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.744 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a6775e-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.749 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.750 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.755 187212 INFO os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d')
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.756 187212 DEBUG nova.virt.libvirt.vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.757 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.759 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.759 187212 DEBUG os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.761 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.761 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf04237a-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.763 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.766 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.772 187212 INFO os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f')
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.773 187212 DEBUG nova.virt.libvirt.vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.773 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.774 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.775 187212 DEBUG os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.776 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.776 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b15784-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.778 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.783 187212 INFO os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53')
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.784 187212 INFO nova.virt.libvirt.driver [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Deleting instance files /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f_del
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.785 187212 INFO nova.virt.libvirt.driver [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Deletion of /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f_del complete
Dec 05 12:09:19 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [NOTICE]   (228701) : haproxy version is 2.8.14-c23fe91
Dec 05 12:09:19 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [NOTICE]   (228701) : path to executable is /usr/sbin/haproxy
Dec 05 12:09:19 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [WARNING]  (228701) : Exiting Master process...
Dec 05 12:09:19 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [WARNING]  (228701) : Exiting Master process...
Dec 05 12:09:19 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [ALERT]    (228701) : Current worker (228703) exited with code 143 (Terminated)
Dec 05 12:09:19 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [WARNING]  (228701) : All workers exited. Exiting... (0)
Dec 05 12:09:19 compute-0 systemd[1]: libpod-9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b.scope: Deactivated successfully.
Dec 05 12:09:19 compute-0 podman[230652]: 2025-12-05 12:09:19.85414553 +0000 UTC m=+0.088244118 container died 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.865 187212 INFO nova.compute.manager [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Took 0.43 seconds to destroy the instance on the hypervisor.
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.865 187212 DEBUG oslo.service.loopingcall [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.866 187212 DEBUG nova.compute.manager [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:09:19 compute-0 nova_compute[187208]: 2025-12-05 12:09:19.866 187212 DEBUG nova.network.neutron [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:09:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b-userdata-shm.mount: Deactivated successfully.
Dec 05 12:09:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-186f8522ea6206e72ff71431d61cc132ff2e048571a25807429a54cf15a146be-merged.mount: Deactivated successfully.
Dec 05 12:09:20 compute-0 podman[230652]: 2025-12-05 12:09:20.072838711 +0000 UTC m=+0.306937299 container cleanup 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:09:20 compute-0 podman[230682]: 2025-12-05 12:09:20.168088078 +0000 UTC m=+0.073211837 container remove 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 12:09:20 compute-0 systemd[1]: libpod-conmon-9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b.scope: Deactivated successfully.
Dec 05 12:09:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.174 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f95b74c5-264d-4149-853e-4ad3c049cbf6]: (4, ('Fri Dec  5 12:09:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b)\n9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b\nFri Dec  5 12:09:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b)\n9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.177 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[928b2675-7241-498c-a730-042ebe658f7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.179 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:20 compute-0 kernel: tapfbfed6fc-30: left promiscuous mode
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.184 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.188 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe1bc8d-9823-4d45-ae67-2cb1a1168ea8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.207 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[15c3cc5c-2db5-4a95-8347-30133536f04d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.209 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b868d60a-e322-492e-96a6-ed5f352100ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.224 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e42669-7f70-44c0-9e7d-1634a937b8d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383474, 'reachable_time': 22757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230695, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:20 compute-0 systemd[1]: run-netns-ovnmeta\x2dfbfed6fc\x2d3701\x2d4311\x2da4c2\x2d8c49c5b7584c.mount: Deactivated successfully.
Dec 05 12:09:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.226 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:09:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.227 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[fea878ea-961a-4510-aade-50d07aea306b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.515 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk" returned: 0 in 1.312s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.515 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.515 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Ensure instance console log exists: /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.516 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.516 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.516 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.518 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start _get_guest_xml network_info=[{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='a6bda2862c2c7b673534b551216afe30',container_format='bare',created_at=2025-12-05T12:08:38Z,direct_url=<?>,disk_format='qcow2',id=4d0314d0-2208-4446-8d20-5c2197f0bd9d,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1629320086-shelved',owner='58cbd93e463049988ccd6d013893e7d6',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-12-05T12:08:51Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.522 187212 WARNING nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.530 187212 DEBUG nova.virt.libvirt.host [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.531 187212 DEBUG nova.virt.libvirt.host [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.535 187212 DEBUG nova.virt.libvirt.host [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.535 187212 DEBUG nova.virt.libvirt.host [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.536 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.536 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='a6bda2862c2c7b673534b551216afe30',container_format='bare',created_at=2025-12-05T12:08:38Z,direct_url=<?>,disk_format='qcow2',id=4d0314d0-2208-4446-8d20-5c2197f0bd9d,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1629320086-shelved',owner='58cbd93e463049988ccd6d013893e7d6',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-12-05T12:08:51Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.536 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.536 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.538 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.538 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.538 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.569 187212 DEBUG nova.virt.libvirt.vif [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='4d0314d0-2208-4446-8d20-5c2197f0bd9d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-776546213',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',ima
ge_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member',shelved_at='2025-12-05T12:08:51.833545',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='4d0314d0-2208-4446-8d20-5c2197f0bd9d'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.569 187212 DEBUG nova.network.os_vif_util [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.570 187212 DEBUG nova.network.os_vif_util [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.571 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.590 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <uuid>24358eea-14fb-4863-a6c4-aadcdb495f54</uuid>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <name>instance-00000036</name>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestOtherB-server-1629320086</nova:name>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:09:20</nova:creationTime>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:09:20 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:09:20 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:09:20 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:09:20 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:09:20 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:09:20 compute-0 nova_compute[187208]:         <nova:user uuid="4ad1281afc874c0ca55d908d3a6e05a8">tempest-ServerActionsTestOtherB-1759520420-project-member</nova:user>
Dec 05 12:09:20 compute-0 nova_compute[187208]:         <nova:project uuid="58cbd93e463049988ccd6d013893e7d6">tempest-ServerActionsTestOtherB-1759520420</nova:project>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="4d0314d0-2208-4446-8d20-5c2197f0bd9d"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:09:20 compute-0 nova_compute[187208]:         <nova:port uuid="2e9efd6c-740c-405b-b9f0-bd46434070a7">
Dec 05 12:09:20 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <system>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <entry name="serial">24358eea-14fb-4863-a6c4-aadcdb495f54</entry>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <entry name="uuid">24358eea-14fb-4863-a6c4-aadcdb495f54</entry>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     </system>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <os>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   </os>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <features>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   </features>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:ab:5e:ef"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <target dev="tap2e9efd6c-74"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/console.log" append="off"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <video>
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     </video>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:09:20 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:09:20 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:09:20 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:09:20 compute-0 nova_compute[187208]: </domain>
Dec 05 12:09:20 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.590 187212 DEBUG nova.compute.manager [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Preparing to wait for external event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.591 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.591 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.591 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.592 187212 DEBUG nova.virt.libvirt.vif [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='4d0314d0-2208-4446-8d20-5c2197f0bd9d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-776546213',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member',shelved_at='2025-12-05T12:08:51.833545',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='4d0314d0-2208-4446-8d20-5c2197f0bd9d'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.592 187212 DEBUG nova.network.os_vif_util [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.592 187212 DEBUG nova.network.os_vif_util [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.593 187212 DEBUG os_vif [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.593 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.594 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.596 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.596 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e9efd6c-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.597 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2e9efd6c-74, col_values=(('external_ids', {'iface-id': '2e9efd6c-740c-405b-b9f0-bd46434070a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:5e:ef', 'vm-uuid': '24358eea-14fb-4863-a6c4-aadcdb495f54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.598 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:20 compute-0 NetworkManager[55691]: <info>  [1764936560.5998] manager: (tap2e9efd6c-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.604 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.604 187212 INFO os_vif [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74')
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.677 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.677 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.678 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No VIF found with MAC fa:16:3e:ab:5e:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.678 187212 INFO nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Using config drive
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.703 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:20 compute-0 nova_compute[187208]: 2025-12-05 12:09:20.754 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'keypairs' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.059 187212 DEBUG neutronclient.v2_0.client [-] Error message: {"NeutronError": {"type": "PortNotFound", "message": "Port 08b15784-5374-4fb3-9f63-82412f709db4 could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.060 187212 DEBUG nova.network.neutron [-] Unable to show port 08b15784-5374-4fb3-9f63-82412f709db4 as it no longer exists. _unbind_ports /usr/lib/python3.9/site-packages/nova/network/neutron.py:666
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.389 187212 DEBUG nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.390 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.390 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.390 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.390 187212 DEBUG nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Processing event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] No waiting events found dispatching network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.392 187212 WARNING nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received unexpected event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d for instance with vm_state building and task_state spawning.
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.392 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.396 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936562.3965611, dbbad270-1e3c-41e1-9173-c1b9df0ab2dd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.397 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] VM Resumed (Lifecycle Event)
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.398 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.402 187212 INFO nova.virt.libvirt.driver [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance spawned successfully.
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.403 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.417 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.423 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.426 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.426 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.427 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.427 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.427 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.428 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.465 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.504 187212 INFO nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Took 17.83 seconds to spawn the instance on the hypervisor.
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.504 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.546 187212 INFO nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating config drive at /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.552 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjoed27ec execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.595 187212 INFO nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Took 18.43 seconds to build instance.
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.681 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjoed27ec" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:22 compute-0 kernel: tap2e9efd6c-74: entered promiscuous mode
Dec 05 12:09:22 compute-0 NetworkManager[55691]: <info>  [1764936562.7449] manager: (tap2e9efd6c-74): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Dec 05 12:09:22 compute-0 systemd-udevd[230712]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:09:22 compute-0 NetworkManager[55691]: <info>  [1764936562.7902] device (tap2e9efd6c-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:09:22 compute-0 ovn_controller[95610]: 2025-12-05T12:09:22Z|00677|binding|INFO|Claiming lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 for this chassis.
Dec 05 12:09:22 compute-0 ovn_controller[95610]: 2025-12-05T12:09:22Z|00678|binding|INFO|2e9efd6c-740c-405b-b9f0-bd46434070a7: Claiming fa:16:3e:ab:5e:ef 10.100.0.5
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.791 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:22 compute-0 NetworkManager[55691]: <info>  [1764936562.7945] device (tap2e9efd6c-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:09:22 compute-0 ovn_controller[95610]: 2025-12-05T12:09:22Z|00679|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 ovn-installed in OVS
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.805 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:22 compute-0 systemd-machined[153543]: New machine qemu-81-instance-00000036.
Dec 05 12:09:22 compute-0 systemd[1]: Started Virtual Machine qemu-81-instance-00000036.
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.868 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.870 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis
Dec 05 12:09:22 compute-0 ovn_controller[95610]: 2025-12-05T12:09:22Z|00680|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 up in Southbound
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.868 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.872 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.890 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[89fe3388-30c0-41e6-a392-60fbee328dbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.921 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb12523-4759-41c6-a751-797f7204e07e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.925 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc3a648-890d-4c88-aaef-fe593f7574c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.957 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a46c9a96-4472-4f0c-b0e2-27542b86d63a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.977 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fef62758-73fd-4999-9baa-c40d0f9f3857]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 17, 'rx_bytes': 700, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 17, 'rx_bytes': 700, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230730, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:22 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.987 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.996 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dc72f154-13ea-478b-81fc-bf076a1b5ef7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230731, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230731, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:22 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.997 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:23.001 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:23.001 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:23.002 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:23.002 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:22.999 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:23.000 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:23.314 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:23.432 187212 DEBUG nova.network.neutron [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated VIF entry in instance network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:23.432 187212 DEBUG nova.network.neutron [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:23.467 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:23.467 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-deleted-08b15784-5374-4fb3-9f63-82412f709db4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:23.468 187212 INFO nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Neutron deleted interface 08b15784-5374-4fb3-9f63-82412f709db4; detaching it from the instance and deleting it from the info cache
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:23.468 187212 DEBUG nova.network.neutron [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:23 compute-0 nova_compute[187208]: 2025-12-05 12:09:23.928 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Detach interface failed, port_id=08b15784-5374-4fb3-9f63-82412f709db4, reason: Instance f1e72d05-87e7-495d-9dbb-1a10b112c69f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Dec 05 12:09:24 compute-0 podman[230733]: 2025-12-05 12:09:24.209176279 +0000 UTC m=+0.064763875 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.292 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936564.2920427, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.293 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Started (Lifecycle Event)
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.354 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.359 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936564.2927058, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.359 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Paused (Lifecycle Event)
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.387 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.391 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.486 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.691 187212 DEBUG nova.compute.manager [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.691 187212 DEBUG nova.compute.manager [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.691 187212 DEBUG oslo_concurrency.lockutils [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.691 187212 DEBUG oslo_concurrency.lockutils [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:24 compute-0 nova_compute[187208]: 2025-12-05 12:09:24.692 187212 DEBUG nova.network.neutron [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:09:25 compute-0 nova_compute[187208]: 2025-12-05 12:09:25.599 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.020 187212 DEBUG nova.network.neutron [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.046 187212 INFO nova.compute.manager [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Took 7.18 seconds to deallocate network for instance.
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.096 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.096 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.278 187212 DEBUG nova.compute.provider_tree [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.296 187212 DEBUG nova.scheduler.client.report [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.322 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.357 187212 INFO nova.scheduler.client.report [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Deleted allocations for instance f1e72d05-87e7-495d-9dbb-1a10b112c69f
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.420 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.988s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.882 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-unplugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.883 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.883 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.884 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.884 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-unplugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.884 187212 WARNING nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-unplugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 for instance with vm_state deleted and task_state None.
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.884 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.885 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.885 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.885 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.885 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.886 187212 WARNING nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 for instance with vm_state deleted and task_state None.
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.886 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-unplugged-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.886 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.887 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.887 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.887 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-unplugged-af04237a-1f79-4f68-a18e-1ceb4911605b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.887 187212 WARNING nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-unplugged-af04237a-1f79-4f68-a18e-1ceb4911605b for instance with vm_state deleted and task_state None.
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.888 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.888 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.888 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.888 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.889 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.889 187212 WARNING nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b for instance with vm_state deleted and task_state None.
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.950 187212 DEBUG nova.network.neutron [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.951 187212 DEBUG nova.network.neutron [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.971 187212 DEBUG oslo_concurrency.lockutils [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:27 compute-0 nova_compute[187208]: 2025-12-05 12:09:27.990 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.907 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.908 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.908 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.908 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.909 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Processing event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.909 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-deleted-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.909 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.909 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.910 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.910 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.910 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.910 187212 WARNING nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved_offloaded and task_state spawning.
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.911 187212 DEBUG nova.compute.manager [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.915 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936569.9148142, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.916 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Resumed (Lifecycle Event)
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.921 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.924 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance spawned successfully.
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.971 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:29 compute-0 nova_compute[187208]: 2025-12-05 12:09:29.975 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:09:30 compute-0 nova_compute[187208]: 2025-12-05 12:09:30.432 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:09:30 compute-0 nova_compute[187208]: 2025-12-05 12:09:30.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:32 compute-0 podman[230766]: 2025-12-05 12:09:32.221909611 +0000 UTC m=+0.063529799 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 12:09:32 compute-0 podman[230765]: 2025-12-05 12:09:32.273628512 +0000 UTC m=+0.114389526 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 12:09:32 compute-0 nova_compute[187208]: 2025-12-05 12:09:32.312 187212 DEBUG nova.compute.manager [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:32 compute-0 nova_compute[187208]: 2025-12-05 12:09:32.387 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 23.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:32 compute-0 nova_compute[187208]: 2025-12-05 12:09:32.993 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:33 compute-0 nova_compute[187208]: 2025-12-05 12:09:33.010 187212 DEBUG nova.compute.manager [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-deleted-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:33 compute-0 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG nova.compute.manager [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:33 compute-0 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG nova.compute.manager [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:09:33 compute-0 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG oslo_concurrency.lockutils [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:33 compute-0 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG oslo_concurrency.lockutils [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:33 compute-0 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG nova.network.neutron [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:09:34 compute-0 ovn_controller[95610]: 2025-12-05T12:09:34Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:60:68:ad 10.100.0.4
Dec 05 12:09:34 compute-0 ovn_controller[95610]: 2025-12-05T12:09:34Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:68:ad 10.100.0.4
Dec 05 12:09:34 compute-0 nova_compute[187208]: 2025-12-05 12:09:34.722 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936559.720632, f1e72d05-87e7-495d-9dbb-1a10b112c69f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:34 compute-0 nova_compute[187208]: 2025-12-05 12:09:34.722 187212 INFO nova.compute.manager [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] VM Stopped (Lifecycle Event)
Dec 05 12:09:34 compute-0 nova_compute[187208]: 2025-12-05 12:09:34.742 187212 DEBUG nova.compute.manager [None req-ef119c1b-8bb9-4025-b750-736eb17d7e66 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:35 compute-0 nova_compute[187208]: 2025-12-05 12:09:35.430 187212 DEBUG nova.network.neutron [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:09:35 compute-0 nova_compute[187208]: 2025-12-05 12:09:35.431 187212 DEBUG nova.network.neutron [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:35 compute-0 nova_compute[187208]: 2025-12-05 12:09:35.449 187212 DEBUG oslo_concurrency.lockutils [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:35 compute-0 nova_compute[187208]: 2025-12-05 12:09:35.525 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:35 compute-0 nova_compute[187208]: 2025-12-05 12:09:35.602 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.252 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.252 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.253 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.253 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.253 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.254 187212 INFO nova.compute.manager [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Terminating instance
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.256 187212 DEBUG nova.compute.manager [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:09:36 compute-0 kernel: tapdf4eecd2-b2 (unregistering): left promiscuous mode
Dec 05 12:09:36 compute-0 NetworkManager[55691]: <info>  [1764936576.2822] device (tapdf4eecd2-b2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:09:36 compute-0 ovn_controller[95610]: 2025-12-05T12:09:36Z|00681|binding|INFO|Releasing lport df4eecd2-b2e2-445a-acac-232f66123555 from this chassis (sb_readonly=0)
Dec 05 12:09:36 compute-0 ovn_controller[95610]: 2025-12-05T12:09:36Z|00682|binding|INFO|Setting lport df4eecd2-b2e2-445a-acac-232f66123555 down in Southbound
Dec 05 12:09:36 compute-0 ovn_controller[95610]: 2025-12-05T12:09:36Z|00683|binding|INFO|Removing iface tapdf4eecd2-b2 ovn-installed in OVS
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.303 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.305 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:36 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000047.scope: Deactivated successfully.
Dec 05 12:09:36 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000047.scope: Consumed 15.391s CPU time.
Dec 05 12:09:36 compute-0 systemd-machined[153543]: Machine qemu-78-instance-00000047 terminated.
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.351 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:3b:49 10.100.0.11'], port_security=['fa:16:3e:40:3b:49 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf30ed1956544c7eae67c989042126e4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c4ee2104-41f1-480e-ab3a-db882b9c2d98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bb90128-3616-41a6-a999-156ce64fbcf7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=df4eecd2-b2e2-445a-acac-232f66123555) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.352 104471 INFO neutron.agent.ovn.metadata.agent [-] Port df4eecd2-b2e2-445a-acac-232f66123555 in datapath 02d8cc87-efdf-4db2-b7ab-393e2480966a unbound from our chassis
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.355 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02d8cc87-efdf-4db2-b7ab-393e2480966a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.356 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb6cb17-1751-4271-a9bc-67c56a4bbc0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:36 compute-0 podman[230816]: 2025-12-05 12:09:36.357494037 +0000 UTC m=+0.052751711 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.357 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a namespace which is not needed anymore
Dec 05 12:09:36 compute-0 podman[230819]: 2025-12-05 12:09:36.40859399 +0000 UTC m=+0.104048540 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:09:36 compute-0 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [NOTICE]   (229739) : haproxy version is 2.8.14-c23fe91
Dec 05 12:09:36 compute-0 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [NOTICE]   (229739) : path to executable is /usr/sbin/haproxy
Dec 05 12:09:36 compute-0 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [WARNING]  (229739) : Exiting Master process...
Dec 05 12:09:36 compute-0 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [WARNING]  (229739) : Exiting Master process...
Dec 05 12:09:36 compute-0 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [ALERT]    (229739) : Current worker (229741) exited with code 143 (Terminated)
Dec 05 12:09:36 compute-0 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [WARNING]  (229739) : All workers exited. Exiting... (0)
Dec 05 12:09:36 compute-0 systemd[1]: libpod-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1.scope: Deactivated successfully.
Dec 05 12:09:36 compute-0 conmon[229735]: conmon 44e956a7e8f863954c3e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1.scope/container/memory.events
Dec 05 12:09:36 compute-0 podman[230887]: 2025-12-05 12:09:36.490105724 +0000 UTC m=+0.047218283 container died 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.521 187212 INFO nova.virt.libvirt.driver [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance destroyed successfully.
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.523 187212 DEBUG nova.objects.instance [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lazy-loading 'resources' on Instance uuid b235a96f-7a12-4bd2-8627-33b128346aa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1-userdata-shm.mount: Deactivated successfully.
Dec 05 12:09:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-3936c809ee6a8001892b1f5e8b230731bdba206d1aa032e18836cd92f8d64675-merged.mount: Deactivated successfully.
Dec 05 12:09:36 compute-0 podman[230887]: 2025-12-05 12:09:36.541375632 +0000 UTC m=+0.098488201 container cleanup 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:09:36 compute-0 systemd[1]: libpod-conmon-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1.scope: Deactivated successfully.
Dec 05 12:09:36 compute-0 podman[230933]: 2025-12-05 12:09:36.608799722 +0000 UTC m=+0.045474682 container remove 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.612 187212 DEBUG nova.virt.libvirt.vif [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:08:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-959694714',display_name='tempest-ServerMetadataNegativeTestJSON-server-959694714',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-959694714',id=71,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bf30ed1956544c7eae67c989042126e4',ramdisk_id='',reservation_id='r-h9tu7hr7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-91345283',owner_user_name='tempest-ServerMetadataNegativeTestJSON-91345283-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:29Z,user_data=None,user_id='132d581de02e49b9a4c99b9b831dd5b5',uuid=b235a96f-7a12-4bd2-8627-33b128346aa4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.613 187212 DEBUG nova.network.os_vif_util [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converting VIF {"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.615 187212 DEBUG nova.network.os_vif_util [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.614 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[579a75fd-39ea-433a-b4e0-9726482adab0]: (4, ('Fri Dec  5 12:09:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a (44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1)\n44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1\nFri Dec  5 12:09:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a (44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1)\n44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.615 187212 DEBUG os_vif [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.616 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cc020c01-ca06-4e21-b86e-1e3927360c2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.618 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02d8cc87-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.618 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf4eecd2-b2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.623 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:09:36 compute-0 kernel: tap02d8cc87-e0: left promiscuous mode
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.628 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.631 187212 INFO os_vif [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2')
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.632 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed47019f-749e-4ea8-b5c0-e85bb45a3b77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.632 187212 INFO nova.virt.libvirt.driver [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Deleting instance files /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4_del
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.633 187212 INFO nova.virt.libvirt.driver [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Deletion of /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4_del complete
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.650 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.651 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5a738836-adbb-4c42-85f4-148fbd612952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.655 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d81d538-20ba-4060-a9c7-1b9fe7427fa8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.673 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3de205b6-bee7-493e-9ac9-21a7905c0588]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388441, 'reachable_time': 36104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230947, 'error': None, 'target': 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:36 compute-0 systemd[1]: run-netns-ovnmeta\x2d02d8cc87\x2defdf\x2d4db2\x2db7ab\x2d393e2480966a.mount: Deactivated successfully.
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.677 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:09:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.678 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb4c171-481c-4b5f-ba64-983632495f10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.728 187212 INFO nova.compute.manager [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Took 0.47 seconds to destroy the instance on the hypervisor.
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.729 187212 DEBUG oslo.service.loopingcall [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.729 187212 DEBUG nova.compute.manager [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:09:36 compute-0 nova_compute[187208]: 2025-12-05 12:09:36.729 187212 DEBUG nova.network.neutron [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:09:37 compute-0 nova_compute[187208]: 2025-12-05 12:09:37.877 187212 DEBUG nova.network.neutron [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:37 compute-0 nova_compute[187208]: 2025-12-05 12:09:37.902 187212 INFO nova.compute.manager [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Took 1.17 seconds to deallocate network for instance.
Dec 05 12:09:37 compute-0 nova_compute[187208]: 2025-12-05 12:09:37.943 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:37 compute-0 nova_compute[187208]: 2025-12-05 12:09:37.943 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:38 compute-0 nova_compute[187208]: 2025-12-05 12:09:38.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:38 compute-0 nova_compute[187208]: 2025-12-05 12:09:38.094 187212 DEBUG nova.compute.provider_tree [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:38 compute-0 nova_compute[187208]: 2025-12-05 12:09:38.112 187212 DEBUG nova.scheduler.client.report [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:38 compute-0 nova_compute[187208]: 2025-12-05 12:09:38.131 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:38 compute-0 nova_compute[187208]: 2025-12-05 12:09:38.165 187212 INFO nova.scheduler.client.report [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Deleted allocations for instance b235a96f-7a12-4bd2-8627-33b128346aa4
Dec 05 12:09:38 compute-0 nova_compute[187208]: 2025-12-05 12:09:38.243 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:40 compute-0 podman[230948]: 2025-12-05 12:09:40.219607443 +0000 UTC m=+0.070639813 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.509 187212 DEBUG nova.compute.manager [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-unplugged-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.510 187212 DEBUG oslo_concurrency.lockutils [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.511 187212 DEBUG oslo_concurrency.lockutils [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.511 187212 DEBUG oslo_concurrency.lockutils [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.511 187212 DEBUG nova.compute.manager [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] No waiting events found dispatching network-vif-unplugged-df4eecd2-b2e2-445a-acac-232f66123555 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.511 187212 WARNING nova.compute.manager [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received unexpected event network-vif-unplugged-df4eecd2-b2e2-445a-acac-232f66123555 for instance with vm_state deleted and task_state None.
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.715 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.716 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.716 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.716 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.717 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.718 187212 INFO nova.compute.manager [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Terminating instance
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.719 187212 DEBUG nova.compute.manager [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:09:40 compute-0 kernel: tap11c7fa90-6a (unregistering): left promiscuous mode
Dec 05 12:09:40 compute-0 NetworkManager[55691]: <info>  [1764936580.7484] device (tap11c7fa90-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:09:40 compute-0 ovn_controller[95610]: 2025-12-05T12:09:40Z|00684|binding|INFO|Releasing lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 from this chassis (sb_readonly=0)
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.757 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:40 compute-0 ovn_controller[95610]: 2025-12-05T12:09:40Z|00685|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 down in Southbound
Dec 05 12:09:40 compute-0 ovn_controller[95610]: 2025-12-05T12:09:40Z|00686|binding|INFO|Removing iface tap11c7fa90-6a ovn-installed in OVS
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:40.766 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:ee:e4 10.100.0.2'], port_security=['fa:16:3e:e4:ee:e4 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-034629ef-6cd1-463c-b963-3d0d9c530038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e846fccb774e44f585d8847897bc4229', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a77f7593-d6d1-44fb-8125-66cdfc38709c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4c9d894-0fc3-4aad-a4d5-6bee101a530c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=11c7fa90-6a48-487a-a375-5adf7f41cb90) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:40.767 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 11c7fa90-6a48-487a-a375-5adf7f41cb90 in datapath 034629ef-6cd1-463c-b963-3d0d9c530038 unbound from our chassis
Dec 05 12:09:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:40.769 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 034629ef-6cd1-463c-b963-3d0d9c530038 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:40.770 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b8897bcf-300c-414f-bb59-baacb5cc9fdb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:40 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d00000046.scope: Deactivated successfully.
Dec 05 12:09:40 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d00000046.scope: Consumed 14.338s CPU time.
Dec 05 12:09:40 compute-0 systemd-machined[153543]: Machine qemu-79-instance-00000046 terminated.
Dec 05 12:09:40 compute-0 nova_compute[187208]: 2025-12-05 12:09:40.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.008 187212 INFO nova.virt.libvirt.driver [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance destroyed successfully.
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.009 187212 DEBUG nova.objects.instance [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'resources' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.030 187212 DEBUG nova.virt.libvirt.vif [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1436335913',display_name='tempest-ServerRescueTestJSONUnderV235-server-1436335913',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1436335913',id=70,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e846fccb774e44f585d8847897bc4229',ramdisk_id='',reservation_id='r-230fx5t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1035500959',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1035500959-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:47Z,user_data=None,user_id='6a2cefdbcaae4db3b3ece95c8227d77e',uuid=2e537618-f998-4c4d-8e1e-e9cc79219330,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.031 187212 DEBUG nova.network.os_vif_util [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converting VIF {"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.032 187212 DEBUG nova.network.os_vif_util [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.032 187212 DEBUG os_vif [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.035 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11c7fa90-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.038 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.040 187212 INFO os_vif [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a')
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.041 187212 INFO nova.virt.libvirt.driver [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Deleting instance files /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330_del
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.042 187212 INFO nova.virt.libvirt.driver [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Deletion of /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330_del complete
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.049 187212 DEBUG nova.compute.manager [req-b15de738-2f3c-4d1c-bddf-7553d253060b req-be1941e0-b23c-45c2-a687-b66bda90f152 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-deleted-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.075 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.076 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.101 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.111 187212 INFO nova.compute.manager [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Took 0.39 seconds to destroy the instance on the hypervisor.
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.111 187212 DEBUG oslo.service.loopingcall [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.112 187212 DEBUG nova.compute.manager [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.112 187212 DEBUG nova.network.neutron [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.335 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.336 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.336 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.336 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.513 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.513 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.536 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.678 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.678 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.724 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.725 187212 INFO nova.compute.claims [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.921 187212 DEBUG nova.compute.provider_tree [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.941 187212 DEBUG nova.scheduler.client.report [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.985 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:41 compute-0 nova_compute[187208]: 2025-12-05 12:09:41.986 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.058 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.058 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.082 187212 INFO nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.099 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:09:42 compute-0 ovn_controller[95610]: 2025-12-05T12:09:42Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:5e:ef 10.100.0.5
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.201 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.202 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.203 187212 INFO nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating image(s)
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.204 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.204 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.205 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.231 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.292 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.293 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.294 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.306 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.363 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.364 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.619 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk 1073741824" returned: 0 in 0.255s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.621 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.621 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.680 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.681 187212 DEBUG nova.virt.disk.api [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Checking if we can resize image /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.681 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.703 187212 DEBUG nova.policy [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef254bb2df0442c6bcadfb3a6861c0e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e836357870d746e49bc783da7cd3accd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.739 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.740 187212 DEBUG nova.virt.disk.api [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Cannot resize image /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.741 187212 DEBUG nova.objects.instance [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'migration_context' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.758 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.759 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Ensure instance console log exists: /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.759 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.760 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:42 compute-0 nova_compute[187208]: 2025-12-05 12:09:42.760 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.036 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.363 187212 DEBUG nova.network.neutron [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.384 187212 INFO nova.compute.manager [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Took 2.27 seconds to deallocate network for instance.
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.427 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.428 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.566 187212 DEBUG nova.compute.provider_tree [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.579 187212 DEBUG nova.scheduler.client.report [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.601 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.633 187212 INFO nova.scheduler.client.report [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Deleted allocations for instance 2e537618-f998-4c4d-8e1e-e9cc79219330
Dec 05 12:09:43 compute-0 nova_compute[187208]: 2025-12-05 12:09:43.736 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.570 187212 DEBUG nova.compute.manager [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.570 187212 DEBUG oslo_concurrency.lockutils [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.570 187212 DEBUG oslo_concurrency.lockutils [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.571 187212 DEBUG oslo_concurrency.lockutils [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.571 187212 DEBUG nova.compute.manager [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] No waiting events found dispatching network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.571 187212 WARNING nova.compute.manager [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received unexpected event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 for instance with vm_state deleted and task_state None.
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.633 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.675 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.675 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.676 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.677 187212 DEBUG nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.678 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.678 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.678 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.678 187212 DEBUG nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 WARNING nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state deleted and task_state None.
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 DEBUG nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.680 187212 DEBUG nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.680 187212 WARNING nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state deleted and task_state None.
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.680 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.681 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.681 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.681 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.766 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Successfully created port: 7370bdd5-ddf8-40de-9f35-975b8ceab3ef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.874 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.875 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.875 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.875 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.875 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.876 187212 INFO nova.compute.manager [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Terminating instance
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.877 187212 DEBUG nova.compute.manager [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:09:44 compute-0 kernel: tap1b4ab157-dd (unregistering): left promiscuous mode
Dec 05 12:09:44 compute-0 NetworkManager[55691]: <info>  [1764936584.9106] device (tap1b4ab157-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:09:44 compute-0 ovn_controller[95610]: 2025-12-05T12:09:44Z|00687|binding|INFO|Releasing lport 1b4ab157-ddea-449c-ab91-983a53dd2045 from this chassis (sb_readonly=0)
Dec 05 12:09:44 compute-0 ovn_controller[95610]: 2025-12-05T12:09:44Z|00688|binding|INFO|Setting lport 1b4ab157-ddea-449c-ab91-983a53dd2045 down in Southbound
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:44 compute-0 ovn_controller[95610]: 2025-12-05T12:09:44Z|00689|binding|INFO|Removing iface tap1b4ab157-dd ovn-installed in OVS
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.926 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:e5:0a 10.100.0.13'], port_security=['fa:16:3e:03:e5:0a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'df1c03c3-b3c9-47b6-a712-a13948dd510e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=1b4ab157-ddea-449c-ab91-983a53dd2045) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.927 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 1b4ab157-ddea-449c-ab91-983a53dd2045 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis
Dec 05 12:09:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.930 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec 05 12:09:44 compute-0 nova_compute[187208]: 2025-12-05 12:09:44.931 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.952 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3530f19a-efbc-42e1-847b-b383f880f06b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:44 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000041.scope: Deactivated successfully.
Dec 05 12:09:44 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000041.scope: Consumed 18.742s CPU time.
Dec 05 12:09:44 compute-0 systemd-machined[153543]: Machine qemu-70-instance-00000041 terminated.
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.999 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[192eb20c-d4f2-4cee-aa17-457f3e71fe9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.002 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[60a4dc31-398e-4ff7-8fd2-e335c14bbfb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.033 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[89d62454-02bd-45d1-952c-c5ecb5474b6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.049 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd8ba68-6d28-46bd-b39c-e60befb88627]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 19, 'rx_bytes': 700, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 19, 'rx_bytes': 700, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231030, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.064 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8bd43ca-7033-4fe0-9582-a62abfa03dc0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231031, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231031, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.066 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.067 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.070 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.071 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.071 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.071 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.072 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.079 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.079 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.080 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.080 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.097 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.147 187212 INFO nova.virt.libvirt.driver [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance destroyed successfully.
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.148 187212 DEBUG nova.objects.instance [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'resources' on Instance uuid 854e3893-3908-4b4a-b29c-7fb4384e4f0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.159 187212 DEBUG nova.virt.libvirt.vif [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-63085993',display_name='tempest-ServerActionsTestOtherB-server-63085993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-63085993',id=65,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-ruwsmmgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:20Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=854e3893-3908-4b4a-b29c-7fb4384e4f0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.160 187212 DEBUG nova.network.os_vif_util [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.161 187212 DEBUG nova.network.os_vif_util [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.161 187212 DEBUG os_vif [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.162 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.163 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b4ab157-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.169 187212 INFO os_vif [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd')
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.170 187212 INFO nova.virt.libvirt.driver [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Deleting instance files /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c_del
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.170 187212 INFO nova.virt.libvirt.driver [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Deletion of /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c_del complete
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.404 187212 INFO nova.compute.manager [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Took 0.53 seconds to destroy the instance on the hypervisor.
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.404 187212 DEBUG oslo.service.loopingcall [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.405 187212 DEBUG nova.compute.manager [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.405 187212 DEBUG nova.network.neutron [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.419 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.486 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.487 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.543 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.547 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Error from libvirt while getting description of instance-00000041: [Error Code 42] Domain not found: no domain with matching uuid '854e3893-3908-4b4a-b29c-7fb4384e4f0c' (instance-00000041): libvirt.libvirtError: Domain not found: no domain with matching uuid '854e3893-3908-4b4a-b29c-7fb4384e4f0c' (instance-00000041)
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.553 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.622 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.623 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.690 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.880 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.881 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5297MB free_disk=73.02593994140625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.882 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:45 compute-0 nova_compute[187208]: 2025-12-05 12:09:45.882 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.068 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.069 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.195 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 854e3893-3908-4b4a-b29c-7fb4384e4f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.196 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance dbbad270-1e3c-41e1-9173-c1b9df0ab2dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.196 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.196 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 159b5354-c124-484f-a8ec-da1abf719114 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.479 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.480 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.480 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.561 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.600 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:46 compute-0 nova_compute[187208]: 2025-12-05 12:09:46.914 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:47 compute-0 nova_compute[187208]: 2025-12-05 12:09:47.086 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:09:47 compute-0 nova_compute[187208]: 2025-12-05 12:09:47.086 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:47 compute-0 nova_compute[187208]: 2025-12-05 12:09:47.108 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:47 compute-0 nova_compute[187208]: 2025-12-05 12:09:47.109 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:47 compute-0 nova_compute[187208]: 2025-12-05 12:09:47.119 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:09:47 compute-0 nova_compute[187208]: 2025-12-05 12:09:47.119 187212 INFO nova.compute.claims [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:09:47 compute-0 nova_compute[187208]: 2025-12-05 12:09:47.646 187212 DEBUG nova.compute.provider_tree [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.003 187212 DEBUG nova.scheduler.client.report [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.027 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.028 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.037 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.082 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.082 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.157 187212 DEBUG nova.compute.manager [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-deleted-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.157 187212 DEBUG nova.compute.manager [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-unplugged-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG oslo_concurrency.lockutils [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG oslo_concurrency.lockutils [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG oslo_concurrency.lockutils [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG nova.compute.manager [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] No waiting events found dispatching network-vif-unplugged-1b4ab157-ddea-449c-ab91-983a53dd2045 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG nova.compute.manager [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-unplugged-1b4ab157-ddea-449c-ab91-983a53dd2045 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.183 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.183 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.185 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.185 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:09:48 compute-0 podman[231062]: 2025-12-05 12:09:48.21015165 +0000 UTC m=+0.062522201 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.215 187212 INFO nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.254 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.527 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.529 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.529 187212 INFO nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Creating image(s)
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.530 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.530 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.531 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.548 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.613 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.614 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.615 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.626 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.688 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.689 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.728 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.729 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.729 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.789 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.791 187212 DEBUG nova.virt.disk.api [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Checking if we can resize image /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.791 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.855 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.856 187212 DEBUG nova.virt.disk.api [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Cannot resize image /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.857 187212 DEBUG nova.objects.instance [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'migration_context' on Instance uuid 54d9605a-998b-4492-afc8-f7a5b0dd4e84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.902 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.903 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Ensure instance console log exists: /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.903 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.903 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:48 compute-0 nova_compute[187208]: 2025-12-05 12:09:48.904 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:50.300 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.301 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:50.301 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:09:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:50.302 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.305 187212 DEBUG nova.network.neutron [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.325 187212 INFO nova.compute.manager [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Took 4.92 seconds to deallocate network for instance.
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.376 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.376 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.397 187212 DEBUG nova.policy [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.524 187212 DEBUG nova.compute.provider_tree [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.538 187212 DEBUG nova.scheduler.client.report [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.567 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.590 187212 INFO nova.scheduler.client.report [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Deleted allocations for instance 854e3893-3908-4b4a-b29c-7fb4384e4f0c
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.653 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.779s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.910 187212 DEBUG nova.compute.manager [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.910 187212 DEBUG oslo_concurrency.lockutils [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.910 187212 DEBUG oslo_concurrency.lockutils [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.910 187212 DEBUG oslo_concurrency.lockutils [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.911 187212 DEBUG nova.compute.manager [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] No waiting events found dispatching network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:50 compute-0 nova_compute[187208]: 2025-12-05 12:09:50.911 187212 WARNING nova.compute.manager [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received unexpected event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 for instance with vm_state deleted and task_state None.
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.450 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.451 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.451 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.451 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.452 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.453 187212 INFO nova.compute.manager [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Terminating instance
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.455 187212 DEBUG nova.compute.manager [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:09:51 compute-0 kernel: tap2e9efd6c-74 (unregistering): left promiscuous mode
Dec 05 12:09:51 compute-0 NetworkManager[55691]: <info>  [1764936591.4755] device (tap2e9efd6c-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:09:51 compute-0 ovn_controller[95610]: 2025-12-05T12:09:51Z|00690|binding|INFO|Releasing lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 from this chassis (sb_readonly=0)
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.481 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:51 compute-0 ovn_controller[95610]: 2025-12-05T12:09:51Z|00691|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 down in Southbound
Dec 05 12:09:51 compute-0 ovn_controller[95610]: 2025-12-05T12:09:51Z|00692|binding|INFO|Removing iface tap2e9efd6c-74 ovn-installed in OVS
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.485 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.499 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.520 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936576.5187266, b235a96f-7a12-4bd2-8627-33b128346aa4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.520 187212 INFO nova.compute.manager [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] VM Stopped (Lifecycle Event)
Dec 05 12:09:51 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000036.scope: Deactivated successfully.
Dec 05 12:09:51 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000036.scope: Consumed 13.234s CPU time.
Dec 05 12:09:51 compute-0 systemd-machined[153543]: Machine qemu-81-instance-00000036 terminated.
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.543 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Successfully updated port: 7370bdd5-ddf8-40de-9f35-975b8ceab3ef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.664 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.231', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.665 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.667 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5c17e5c-2b6c-48d3-9992-ac34070e3363, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.669 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f927fa9b-e9b5-45be-b894-cac3295b6d50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.669 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 namespace which is not needed anymore
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.680 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.680 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquired lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.680 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.683 187212 DEBUG nova.compute.manager [None req-2e64d549-3601-47d6-b551-e33f37549524 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.718 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance destroyed successfully.
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.719 187212 DEBUG nova.objects.instance [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'resources' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.740 187212 DEBUG nova.virt.libvirt.vif [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCLIKVsEL0lmma4upWYe8NiCB7ZJxacCmu4vu1RJu3M5/Fu5S7w/HUSIKvvOTrl/9nUJ4pE5tXIAyPQxQDsptmV5i8IinhFeAgIm0GlEBvfbCuuhpWud8F+u8GsIwgaqpQ==',key_name='tempest-keypair-776546213',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:09:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.741 187212 DEBUG nova.network.os_vif_util [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.742 187212 DEBUG nova.network.os_vif_util [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.742 187212 DEBUG os_vif [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.745 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e9efd6c-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.749 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.752 187212 INFO os_vif [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74')
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.752 187212 INFO nova.virt.libvirt.driver [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deleting instance files /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54_del
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.757 187212 INFO nova.virt.libvirt.driver [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deletion of /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54_del complete
Dec 05 12:09:51 compute-0 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [NOTICE]   (225423) : haproxy version is 2.8.14-c23fe91
Dec 05 12:09:51 compute-0 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [NOTICE]   (225423) : path to executable is /usr/sbin/haproxy
Dec 05 12:09:51 compute-0 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [WARNING]  (225423) : Exiting Master process...
Dec 05 12:09:51 compute-0 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [ALERT]    (225423) : Current worker (225425) exited with code 143 (Terminated)
Dec 05 12:09:51 compute-0 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [WARNING]  (225423) : All workers exited. Exiting... (0)
Dec 05 12:09:51 compute-0 systemd[1]: libpod-65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5.scope: Deactivated successfully.
Dec 05 12:09:51 compute-0 podman[231145]: 2025-12-05 12:09:51.796695637 +0000 UTC m=+0.040533592 container died 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.806 187212 INFO nova.compute.manager [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.806 187212 DEBUG oslo.service.loopingcall [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.807 187212 DEBUG nova.compute.manager [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.807 187212 DEBUG nova.network.neutron [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:09:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5-userdata-shm.mount: Deactivated successfully.
Dec 05 12:09:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-93f32fdd3ea0552c523abfc1a627c1ddf05c35a6f969e26671c37410720e74dc-merged.mount: Deactivated successfully.
Dec 05 12:09:51 compute-0 podman[231145]: 2025-12-05 12:09:51.835872399 +0000 UTC m=+0.079710354 container cleanup 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.837 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:09:51 compute-0 systemd[1]: libpod-conmon-65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5.scope: Deactivated successfully.
Dec 05 12:09:51 compute-0 podman[231176]: 2025-12-05 12:09:51.898826971 +0000 UTC m=+0.041034386 container remove 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.904 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[42f7b2b9-c2fe-4a52-9c6e-b56ad3e11bdd]: (4, ('Fri Dec  5 12:09:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 (65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5)\n65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5\nFri Dec  5 12:09:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 (65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5)\n65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.906 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0278b9-0aa1-4197-8e71-c74b08279188]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.907 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:51 compute-0 kernel: tapb5c17e5c-20: left promiscuous mode
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.910 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.914 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[06367650-203f-4e3d-be4c-7f621b3afd42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:51 compute-0 nova_compute[187208]: 2025-12-05 12:09:51.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.942 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c68ae03-bc83-4718-a6b1-9d6b062fc701]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.943 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7f52005e-f6fb-448a-8b7b-da46d96afbcf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.957 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62cde3f9-1902-47f9-8397-058446386876]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371894, 'reachable_time': 25036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231189, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.959 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:09:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.959 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[5e10bdae-c100-46b4-b674-c5aa56dcb910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:51 compute-0 systemd[1]: run-netns-ovnmeta\x2db5c17e5c\x2d2b6c\x2d48d3\x2d9992\x2dac34070e3363.mount: Deactivated successfully.
Dec 05 12:09:52 compute-0 nova_compute[187208]: 2025-12-05 12:09:52.743 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Updating instance_info_cache with network_info: [{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.033 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Releasing lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.034 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance network_info: |[{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.037 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start _get_guest_xml network_info=[{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.039 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.246 187212 WARNING nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.253 187212 DEBUG nova.virt.libvirt.host [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.254 187212 DEBUG nova.virt.libvirt.host [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.259 187212 DEBUG nova.virt.libvirt.host [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.260 187212 DEBUG nova.virt.libvirt.host [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.260 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.261 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.261 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.262 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.262 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.262 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.262 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.263 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.263 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.263 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.263 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.264 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.269 187212 DEBUG nova.virt.libvirt.vif [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:42Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.269 187212 DEBUG nova.network.os_vif_util [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.270 187212 DEBUG nova.network.os_vif_util [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.271 187212 DEBUG nova.objects.instance [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_devices' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.340 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <uuid>159b5354-c124-484f-a8ec-da1abf719114</uuid>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <name>instance-00000049</name>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-2012489303</nova:name>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:09:53</nova:creationTime>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:09:53 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:09:53 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:09:53 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:09:53 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:09:53 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:09:53 compute-0 nova_compute[187208]:         <nova:user uuid="ef254bb2df0442c6bcadfb3a6861c0e9">tempest-ServerDiskConfigTestJSON-1245488084-project-member</nova:user>
Dec 05 12:09:53 compute-0 nova_compute[187208]:         <nova:project uuid="e836357870d746e49bc783da7cd3accd">tempest-ServerDiskConfigTestJSON-1245488084</nova:project>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:09:53 compute-0 nova_compute[187208]:         <nova:port uuid="7370bdd5-ddf8-40de-9f35-975b8ceab3ef">
Dec 05 12:09:53 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <system>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <entry name="serial">159b5354-c124-484f-a8ec-da1abf719114</entry>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <entry name="uuid">159b5354-c124-484f-a8ec-da1abf719114</entry>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     </system>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <os>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   </os>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <features>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   </features>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:ee:f0:e8"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <target dev="tap7370bdd5-dd"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/console.log" append="off"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <video>
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     </video>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:09:53 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:09:53 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:09:53 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:09:53 compute-0 nova_compute[187208]: </domain>
Dec 05 12:09:53 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.342 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Preparing to wait for external event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.342 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.343 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.343 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.344 187212 DEBUG nova.virt.libvirt.vif [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:42Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.344 187212 DEBUG nova.network.os_vif_util [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.344 187212 DEBUG nova.network.os_vif_util [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.345 187212 DEBUG os_vif [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.346 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.346 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.347 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Successfully created port: ef99bad5-d092-46f6-9b3a-8225cc233d1e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.352 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.353 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7370bdd5-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.354 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7370bdd5-dd, col_values=(('external_ids', {'iface-id': '7370bdd5-ddf8-40de-9f35-975b8ceab3ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:f0:e8', 'vm-uuid': '159b5354-c124-484f-a8ec-da1abf719114'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.355 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.357 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:09:53 compute-0 NetworkManager[55691]: <info>  [1764936593.3574] manager: (tap7370bdd5-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.363 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.365 187212 INFO os_vif [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd')
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.494 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.495 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.495 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No VIF found with MAC fa:16:3e:ee:f0:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.495 187212 INFO nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Using config drive
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.770 187212 INFO nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating config drive at /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.777 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwyb6xu_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.905 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwyb6xu_" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:09:53 compute-0 kernel: tap7370bdd5-dd: entered promiscuous mode
Dec 05 12:09:53 compute-0 systemd-udevd[231106]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:09:53 compute-0 ovn_controller[95610]: 2025-12-05T12:09:53Z|00693|binding|INFO|Claiming lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef for this chassis.
Dec 05 12:09:53 compute-0 ovn_controller[95610]: 2025-12-05T12:09:53Z|00694|binding|INFO|7370bdd5-ddf8-40de-9f35-975b8ceab3ef: Claiming fa:16:3e:ee:f0:e8 10.100.0.14
Dec 05 12:09:53 compute-0 NetworkManager[55691]: <info>  [1764936593.9762] manager: (tap7370bdd5-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/269)
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.982 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f0:e8 10.100.0.14'], port_security=['fa:16:3e:ee:f0:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '159b5354-c124-484f-a8ec-da1abf719114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7370bdd5-ddf8-40de-9f35-975b8ceab3ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.983 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c bound to our chassis
Dec 05 12:09:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.985 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:09:53 compute-0 NetworkManager[55691]: <info>  [1764936593.9869] device (tap7370bdd5-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:09:53 compute-0 NetworkManager[55691]: <info>  [1764936593.9875] device (tap7370bdd5-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:09:53 compute-0 ovn_controller[95610]: 2025-12-05T12:09:53Z|00695|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef ovn-installed in OVS
Dec 05 12:09:53 compute-0 ovn_controller[95610]: 2025-12-05T12:09:53Z|00696|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef up in Southbound
Dec 05 12:09:53 compute-0 nova_compute[187208]: 2025-12-05 12:09:53.992 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.996 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8116b836-e86e-4cfa-b3d4-556d3f8fa9c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.997 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7be4540a-01 in ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.999 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7be4540a-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.999 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[959ca3cf-c740-44cc-9501-cfa2b2239ba6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.001 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e525708-5174-4bdf-ad88-1cd982ae1edb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.013 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c45ffd77-ae24-4ac7-b22c-b3ad89a505a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 systemd-machined[153543]: New machine qemu-82-instance-00000049.
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.025 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3ba3be-f90a-4c5a-a976-1b5fdcdbec45]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 systemd[1]: Started Virtual Machine qemu-82-instance-00000049.
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.051 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b19a87-ca2f-4e14-888f-93e92bec132a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.058 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac234cd7-1ae5-40cb-b634-a8fa9e4fc52b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 NetworkManager[55691]: <info>  [1764936594.0603] manager: (tap7be4540a-00): new Veth device (/org/freedesktop/NetworkManager/Devices/270)
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.089 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[508362f2-b86c-4a3e-826b-241ee5d99630]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.092 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8983ea-116d-480d-b9ce-ced77d856e51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.093 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-deleted-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.093 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-changed-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.094 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Refreshing instance network info cache due to event network-changed-7370bdd5-ddf8-40de-9f35-975b8ceab3ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.094 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.095 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.095 187212 DEBUG nova.network.neutron [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Refreshing network info cache for port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:09:54 compute-0 NetworkManager[55691]: <info>  [1764936594.1166] device (tap7be4540a-00): carrier: link connected
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.121 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[86b3f2b7-664f-41bd-a858-994fdb71806e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.137 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2c91e24c-8d58-4a7f-a228-90d1a8d6acc4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397468, 'reachable_time': 33202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231240, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.152 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0dd987-04ae-4056-bfce-df4b8e818e16]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:4893'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397468, 'tstamp': 397468}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231241, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.173 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[50326e5e-13b1-4290-b50a-be90c2dcb8c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397468, 'reachable_time': 33202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231242, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.206 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cfda746a-de5d-44a6-b684-099e29a0dfa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.266 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfe1d23-6864-4cb8-b7ea-e3d94111dad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.268 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.268 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.269 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7be4540a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:54 compute-0 kernel: tap7be4540a-00: entered promiscuous mode
Dec 05 12:09:54 compute-0 NetworkManager[55691]: <info>  [1764936594.2718] manager: (tap7be4540a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.271 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.274 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7be4540a-00, col_values=(('external_ids', {'iface-id': '4dcf8e96-bf04-4914-959a-aad071dfa454'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:54 compute-0 ovn_controller[95610]: 2025-12-05T12:09:54Z|00697|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.277 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.277 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c50be710-d029-491b-b265-a902d8b48b92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.278 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:09:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.281 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'env', 'PROCESS_TAG=haproxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.286 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.513 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936594.5128646, 159b5354-c124-484f-a8ec-da1abf719114 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.514 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Started (Lifecycle Event)
Dec 05 12:09:54 compute-0 podman[231281]: 2025-12-05 12:09:54.654213111 +0000 UTC m=+0.053938176 container create 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:09:54 compute-0 systemd[1]: Started libpod-conmon-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e.scope.
Dec 05 12:09:54 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:09:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db0958bae067600c1586deb306b205beef4d4a15d45a054b88ed994a15bf001d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:09:54 compute-0 podman[231281]: 2025-12-05 12:09:54.623987265 +0000 UTC m=+0.023712360 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:09:54 compute-0 podman[231281]: 2025-12-05 12:09:54.738124923 +0000 UTC m=+0.137850018 container init 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:09:54 compute-0 podman[231281]: 2025-12-05 12:09:54.744218357 +0000 UTC m=+0.143943422 container start 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:09:54 compute-0 podman[231294]: 2025-12-05 12:09:54.758746203 +0000 UTC m=+0.064610040 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:09:54 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [NOTICE]   (231317) : New worker (231320) forked
Dec 05 12:09:54 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [NOTICE]   (231317) : Loading success.
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.808 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.815 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936594.513273, 159b5354-c124-484f-a8ec-da1abf719114 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.815 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Paused (Lifecycle Event)
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.838 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.841 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:09:54 compute-0 nova_compute[187208]: 2025-12-05 12:09:54.869 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.204 187212 DEBUG nova.compute.manager [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.205 187212 DEBUG oslo_concurrency.lockutils [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.205 187212 DEBUG oslo_concurrency.lockutils [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.205 187212 DEBUG oslo_concurrency.lockutils [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.206 187212 DEBUG nova.compute.manager [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Processing event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.206 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.210 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936595.209823, 159b5354-c124-484f-a8ec-da1abf719114 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.210 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Resumed (Lifecycle Event)
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.212 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.216 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance spawned successfully.
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.216 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.248 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.256 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.260 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.261 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.261 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.262 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.262 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.263 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.295 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.336 187212 INFO nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Took 13.13 seconds to spawn the instance on the hypervisor.
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.337 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.413 187212 INFO nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Took 13.84 seconds to build instance.
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.437 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:55 compute-0 ovn_controller[95610]: 2025-12-05T12:09:55Z|00698|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:09:55 compute-0 ovn_controller[95610]: 2025-12-05T12:09:55Z|00699|binding|INFO|Releasing lport da9adcd8-f2a5-4ff7-962a-717d700ad7b5 from this chassis (sb_readonly=0)
Dec 05 12:09:55 compute-0 nova_compute[187208]: 2025-12-05 12:09:55.835 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.006 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936581.005481, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.007 187212 INFO nova.compute.manager [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Stopped (Lifecycle Event)
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.030 187212 DEBUG nova.network.neutron [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.032 187212 DEBUG nova.compute.manager [None req-d8000d96-c9e5-4bb5-a562-40943c9b246c - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.060 187212 INFO nova.compute.manager [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 4.25 seconds to deallocate network for instance.
Dec 05 12:09:56 compute-0 ovn_controller[95610]: 2025-12-05T12:09:56Z|00700|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:09:56 compute-0 ovn_controller[95610]: 2025-12-05T12:09:56Z|00701|binding|INFO|Releasing lport da9adcd8-f2a5-4ff7-962a-717d700ad7b5 from this chassis (sb_readonly=0)
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.086 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.149 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.150 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.241 187212 DEBUG nova.compute.provider_tree [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.262 187212 DEBUG nova.scheduler.client.report [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.322 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.354 187212 INFO nova.scheduler.client.report [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Deleted allocations for instance 24358eea-14fb-4863-a6c4-aadcdb495f54
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.527 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.794 187212 DEBUG nova.compute.manager [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.795 187212 DEBUG oslo_concurrency.lockutils [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.795 187212 DEBUG oslo_concurrency.lockutils [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.796 187212 DEBUG oslo_concurrency.lockutils [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.796 187212 DEBUG nova.compute.manager [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.796 187212 WARNING nova.compute.manager [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state deleted and task_state None.
Dec 05 12:09:56 compute-0 nova_compute[187208]: 2025-12-05 12:09:56.797 187212 DEBUG nova.compute.manager [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-deleted-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:57 compute-0 nova_compute[187208]: 2025-12-05 12:09:57.885 187212 DEBUG nova.compute.manager [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:57 compute-0 nova_compute[187208]: 2025-12-05 12:09:57.886 187212 DEBUG oslo_concurrency.lockutils [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:57 compute-0 nova_compute[187208]: 2025-12-05 12:09:57.887 187212 DEBUG oslo_concurrency.lockutils [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:57 compute-0 nova_compute[187208]: 2025-12-05 12:09:57.887 187212 DEBUG oslo_concurrency.lockutils [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:57 compute-0 nova_compute[187208]: 2025-12-05 12:09:57.887 187212 DEBUG nova.compute.manager [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:57 compute-0 nova_compute[187208]: 2025-12-05 12:09:57.888 187212 WARNING nova.compute.manager [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state None.
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.023 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Successfully updated port: ef99bad5-d092-46f6-9b3a-8225cc233d1e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.033 187212 DEBUG nova.network.neutron [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Updated VIF entry in instance network info cache for port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.034 187212 DEBUG nova.network.neutron [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Updating instance_info_cache with network_info: [{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.092 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.093 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.093 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.094 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.094 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.094 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.095 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.095 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.095 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.095 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.096 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.096 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.096 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.097 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.097 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.098 187212 INFO nova.compute.manager [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Terminating instance
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.099 187212 DEBUG nova.compute.manager [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.100 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 kernel: tapcf99cdda-70 (unregistering): left promiscuous mode
Dec 05 12:09:58 compute-0 NetworkManager[55691]: <info>  [1764936598.1333] device (tapcf99cdda-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:09:58 compute-0 ovn_controller[95610]: 2025-12-05T12:09:58Z|00702|binding|INFO|Releasing lport cf99cdda-7071-4c18-8462-3a556234d81d from this chassis (sb_readonly=0)
Dec 05 12:09:58 compute-0 ovn_controller[95610]: 2025-12-05T12:09:58Z|00703|binding|INFO|Setting lport cf99cdda-7071-4c18-8462-3a556234d81d down in Southbound
Dec 05 12:09:58 compute-0 ovn_controller[95610]: 2025-12-05T12:09:58Z|00704|binding|INFO|Removing iface tapcf99cdda-70 ovn-installed in OVS
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.142 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.145 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.152 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:68:ad 10.100.0.4'], port_security=['fa:16:3e:60:68:ad 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dbbad270-1e3c-41e1-9173-c1b9df0ab2dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bdbd9c8684c4b9b97e00725e41037eb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d91f504-323f-40f6-96ee-8e841aa785bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd596033-693a-40ca-949c-841d866181bd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=cf99cdda-7071-4c18-8462-3a556234d81d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.154 104471 INFO neutron.agent.ovn.metadata.agent [-] Port cf99cdda-7071-4c18-8462-3a556234d81d in datapath d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 unbound from our chassis
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.156 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.157 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[74a3c9bf-d58a-4905-988d-a2d9c80e178f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.158 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 namespace which is not needed anymore
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000048.scope: Deactivated successfully.
Dec 05 12:09:58 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000048.scope: Consumed 13.691s CPU time.
Dec 05 12:09:58 compute-0 systemd-machined[153543]: Machine qemu-80-instance-00000048 terminated.
Dec 05 12:09:58 compute-0 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [NOTICE]   (230487) : haproxy version is 2.8.14-c23fe91
Dec 05 12:09:58 compute-0 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [NOTICE]   (230487) : path to executable is /usr/sbin/haproxy
Dec 05 12:09:58 compute-0 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [WARNING]  (230487) : Exiting Master process...
Dec 05 12:09:58 compute-0 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [ALERT]    (230487) : Current worker (230489) exited with code 143 (Terminated)
Dec 05 12:09:58 compute-0 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [WARNING]  (230487) : All workers exited. Exiting... (0)
Dec 05 12:09:58 compute-0 systemd[1]: libpod-34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85.scope: Deactivated successfully.
Dec 05 12:09:58 compute-0 podman[231354]: 2025-12-05 12:09:58.30059132 +0000 UTC m=+0.047091409 container died 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:09:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85-userdata-shm.mount: Deactivated successfully.
Dec 05 12:09:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-58192d43ddea201cbf99dd4a079d9f14871a3e9699282f5a030bf56f666b8ee1-merged.mount: Deactivated successfully.
Dec 05 12:09:58 compute-0 podman[231354]: 2025-12-05 12:09:58.348033148 +0000 UTC m=+0.094533247 container cleanup 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 systemd[1]: libpod-conmon-34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85.scope: Deactivated successfully.
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.376 187212 INFO nova.virt.libvirt.driver [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance destroyed successfully.
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.376 187212 DEBUG nova.objects.instance [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lazy-loading 'resources' on Instance uuid dbbad270-1e3c-41e1-9173-c1b9df0ab2dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.382 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.396 187212 DEBUG nova.virt.libvirt.vif [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:09:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1390207148',display_name='tempest-ServerMetadataTestJSON-server-1390207148',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1390207148',id=72,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1bdbd9c8684c4b9b97e00725e41037eb',ramdisk_id='',reservation_id='r-8eo91pmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-355236921',owner_user_name='tempest-ServerMetadataTestJSON-355236921-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:09:56Z,user_data=None,user_id='4f805540d6084f53aa7bd5a66912be58',uuid=dbbad270-1e3c-41e1-9173-c1b9df0ab2dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.397 187212 DEBUG nova.network.os_vif_util [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converting VIF {"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.397 187212 DEBUG nova.network.os_vif_util [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.398 187212 DEBUG os_vif [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.399 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.400 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf99cdda-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.403 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.406 187212 INFO os_vif [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70')
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.407 187212 INFO nova.virt.libvirt.driver [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Deleting instance files /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd_del
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.407 187212 INFO nova.virt.libvirt.driver [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Deletion of /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd_del complete
Dec 05 12:09:58 compute-0 podman[231397]: 2025-12-05 12:09:58.429413618 +0000 UTC m=+0.055497640 container remove 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.437 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[55a9a081-d4de-49c1-a9fb-bda1696a5bcd]: (4, ('Fri Dec  5 12:09:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 (34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85)\n34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85\nFri Dec  5 12:09:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 (34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85)\n34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.440 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8080d015-ebf8-48cc-9b03-6202daec3de9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.442 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5794fbb-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.444 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 kernel: tapd5794fbb-c0: left promiscuous mode
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.450 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[721e1f85-e0ea-47ad-95af-f51ebd18c8a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.465 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.472 187212 INFO nova.compute.manager [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Took 0.37 seconds to destroy the instance on the hypervisor.
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.473 187212 DEBUG oslo.service.loopingcall [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.473 187212 DEBUG nova.compute.manager [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.473 187212 DEBUG nova.network.neutron [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.480 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6e7fe62f-7027-4722-8d48-4b72fcaae141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.482 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5d20a757-6e25-43a0-9527-14f2d6c38fa7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.499 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[38b653d2-fcd8-4b19-8c3d-f99028b09280]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393233, 'reachable_time': 18313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231411, 'error': None, 'target': 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.502 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:09:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.503 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[2e6f8025-f346-4518-ad35-d170cb5a4e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:09:58 compute-0 systemd[1]: run-netns-ovnmeta\x2dd5794fbb\x2dc1d5\x2d48a4\x2d95d7\x2da9b4ae1fcb22.mount: Deactivated successfully.
Dec 05 12:09:58 compute-0 nova_compute[187208]: 2025-12-05 12:09:58.899 187212 INFO nova.compute.manager [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Rebuilding instance
Dec 05 12:09:59 compute-0 nova_compute[187208]: 2025-12-05 12:09:59.167 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'trusted_certs' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:59 compute-0 nova_compute[187208]: 2025-12-05 12:09:59.434 187212 DEBUG nova.compute.manager [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:09:59 compute-0 nova_compute[187208]: 2025-12-05 12:09:59.508 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_requests' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:59 compute-0 nova_compute[187208]: 2025-12-05 12:09:59.522 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_devices' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:59 compute-0 nova_compute[187208]: 2025-12-05 12:09:59.539 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'resources' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:59 compute-0 nova_compute[187208]: 2025-12-05 12:09:59.551 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'migration_context' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:09:59 compute-0 nova_compute[187208]: 2025-12-05 12:09:59.566 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:09:59 compute-0 nova_compute[187208]: 2025-12-05 12:09:59.570 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.144 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936585.1436546, 854e3893-3908-4b4a-b29c-7fb4384e4f0c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.146 187212 INFO nova.compute.manager [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] VM Stopped (Lifecycle Event)
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.178 187212 DEBUG nova.compute.manager [None req-019c1553-3047-4daf-b809-c66895b903e1 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.193 187212 DEBUG nova.compute.manager [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.194 187212 DEBUG nova.compute.manager [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing instance network info cache due to event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.194 187212 DEBUG oslo_concurrency.lockutils [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.657 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.683 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.683 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Instance network_info: |[{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.684 187212 DEBUG oslo_concurrency.lockutils [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.684 187212 DEBUG nova.network.neutron [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.688 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Start _get_guest_xml network_info=[{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.694 187212 WARNING nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.705 187212 DEBUG nova.virt.libvirt.host [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.706 187212 DEBUG nova.virt.libvirt.host [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.715 187212 DEBUG nova.virt.libvirt.host [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.717 187212 DEBUG nova.virt.libvirt.host [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.717 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.717 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.718 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.718 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.719 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.719 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.719 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.720 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.720 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.721 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.721 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.721 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.726 187212 DEBUG nova.virt.libvirt.vif [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-569275018',display_name='tempest-tempest.common.compute-instance-569275018',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-569275018',id=74,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-h1nkz7rb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=54d9605a-998b-4492-afc8-f7a5b0dd4e84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.726 187212 DEBUG nova.network.os_vif_util [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.727 187212 DEBUG nova.network.os_vif_util [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.729 187212 DEBUG nova.objects.instance [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_devices' on Instance uuid 54d9605a-998b-4492-afc8-f7a5b0dd4e84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.746 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <uuid>54d9605a-998b-4492-afc8-f7a5b0dd4e84</uuid>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <name>instance-0000004a</name>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <nova:name>tempest-tempest.common.compute-instance-569275018</nova:name>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:10:00</nova:creationTime>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:10:00 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:10:00 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:10:00 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:10:00 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:10:00 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:10:00 compute-0 nova_compute[187208]:         <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:10:00 compute-0 nova_compute[187208]:         <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:10:00 compute-0 nova_compute[187208]:         <nova:port uuid="ef99bad5-d092-46f6-9b3a-8225cc233d1e">
Dec 05 12:10:00 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <system>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <entry name="serial">54d9605a-998b-4492-afc8-f7a5b0dd4e84</entry>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <entry name="uuid">54d9605a-998b-4492-afc8-f7a5b0dd4e84</entry>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     </system>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <os>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   </os>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <features>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   </features>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:bd:e5:94"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <target dev="tapef99bad5-d0"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/console.log" append="off"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <video>
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     </video>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:10:00 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:10:00 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:10:00 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:10:00 compute-0 nova_compute[187208]: </domain>
Dec 05 12:10:00 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.752 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Preparing to wait for external event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.753 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.753 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.753 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.754 187212 DEBUG nova.virt.libvirt.vif [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-569275018',display_name='tempest-tempest.common.compute-instance-569275018',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-569275018',id=74,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-h1nkz7rb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=54d9605a-998b-4492-afc8-f7a5b0dd4e84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.755 187212 DEBUG nova.network.os_vif_util [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.755 187212 DEBUG nova.network.os_vif_util [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.756 187212 DEBUG os_vif [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.757 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.757 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.762 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.763 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef99bad5-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapef99bad5-d0, col_values=(('external_ids', {'iface-id': 'ef99bad5-d092-46f6-9b3a-8225cc233d1e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:e5:94', 'vm-uuid': '54d9605a-998b-4492-afc8-f7a5b0dd4e84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.765 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:10:00 compute-0 NetworkManager[55691]: <info>  [1764936600.7671] manager: (tapef99bad5-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.772 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.773 187212 INFO os_vif [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0')
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.777 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-unplugged-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.777 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.777 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.777 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.778 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] No waiting events found dispatching network-vif-unplugged-cf99cdda-7071-4c18-8462-3a556234d81d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.778 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-unplugged-cf99cdda-7071-4c18-8462-3a556234d81d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.778 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.779 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.779 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.779 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.779 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] No waiting events found dispatching network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.780 187212 WARNING nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received unexpected event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d for instance with vm_state active and task_state deleting.
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.781 187212 DEBUG nova.network.neutron [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.800 187212 INFO nova.compute.manager [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Took 2.33 seconds to deallocate network for instance.
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.852 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.853 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.853 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:bd:e5:94, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.853 187212 INFO nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Using config drive
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.856 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.856 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.948 187212 DEBUG nova.compute.provider_tree [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:00 compute-0 nova_compute[187208]: 2025-12-05 12:10:00.963 187212 DEBUG nova.scheduler.client.report [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:01 compute-0 nova_compute[187208]: 2025-12-05 12:10:01.024 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:01 compute-0 nova_compute[187208]: 2025-12-05 12:10:01.057 187212 INFO nova.scheduler.client.report [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Deleted allocations for instance dbbad270-1e3c-41e1-9173-c1b9df0ab2dd
Dec 05 12:10:01 compute-0 nova_compute[187208]: 2025-12-05 12:10:01.147 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.047 187212 INFO nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Creating config drive at /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.053 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpolobh4qj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.183 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpolobh4qj" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:02 compute-0 kernel: tapef99bad5-d0: entered promiscuous mode
Dec 05 12:10:02 compute-0 NetworkManager[55691]: <info>  [1764936602.2676] manager: (tapef99bad5-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/273)
Dec 05 12:10:02 compute-0 ovn_controller[95610]: 2025-12-05T12:10:02Z|00705|binding|INFO|Claiming lport ef99bad5-d092-46f6-9b3a-8225cc233d1e for this chassis.
Dec 05 12:10:02 compute-0 ovn_controller[95610]: 2025-12-05T12:10:02Z|00706|binding|INFO|ef99bad5-d092-46f6-9b3a-8225cc233d1e: Claiming fa:16:3e:bd:e5:94 10.100.0.6
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.269 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.328 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:02 compute-0 ovn_controller[95610]: 2025-12-05T12:10:02Z|00707|binding|INFO|Setting lport ef99bad5-d092-46f6-9b3a-8225cc233d1e ovn-installed in OVS
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.334 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:02 compute-0 systemd-udevd[231456]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:10:02 compute-0 systemd-machined[153543]: New machine qemu-83-instance-0000004a.
Dec 05 12:10:02 compute-0 podman[231425]: 2025-12-05 12:10:02.348091084 +0000 UTC m=+0.084528191 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:10:02 compute-0 systemd[1]: Started Virtual Machine qemu-83-instance-0000004a.
Dec 05 12:10:02 compute-0 NetworkManager[55691]: <info>  [1764936602.3530] device (tapef99bad5-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:10:02 compute-0 NetworkManager[55691]: <info>  [1764936602.3540] device (tapef99bad5-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:10:02 compute-0 ovn_controller[95610]: 2025-12-05T12:10:02Z|00708|binding|INFO|Setting lport ef99bad5-d092-46f6-9b3a-8225cc233d1e up in Southbound
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.397 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:e5:94 10.100.0.6'], port_security=['fa:16:3e:bd:e5:94 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cbdd2780-9e2b-4e10-8d0a-98de936cf6ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ef99bad5-d092-46f6-9b3a-8225cc233d1e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.399 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ef99bad5-d092-46f6-9b3a-8225cc233d1e in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:10:02 compute-0 podman[231439]: 2025-12-05 12:10:02.401140163 +0000 UTC m=+0.086243030 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.401 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.415 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cab6b481-f77e-4cc5-b7c9-9dec533e66aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.416 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbfed6fc-31 in ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.419 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbfed6fc-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.419 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[277dc790-3142-4369-9fbf-f0fe5fd40b2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.421 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[04d8195f-6519-4ef6-afd8-51e052390b22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.441 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[f995f981-a665-42e5-9f00-4e51572ab1a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.460 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6259a408-5f98-44bd-b7d5-e8fded20e679]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.498 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5482da6c-3a4c-4be2-8550-f57d66f6318e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.504 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7fca83-1a73-4c88-b8fe-67388e8a3d3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 NetworkManager[55691]: <info>  [1764936602.5056] manager: (tapfbfed6fc-30): new Veth device (/org/freedesktop/NetworkManager/Devices/274)
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.539 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed7bde6-3b8e-4a7d-8611-cb148ab6e2a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.544 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[29bf5df2-b8ee-49b6-870c-0c6129ae2b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 NetworkManager[55691]: <info>  [1764936602.5695] device (tapfbfed6fc-30): carrier: link connected
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.579 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[274bb502-10ee-4f3c-aa22-741f8176763c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.600 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9e54ac04-4f9b-4502-88d3-816c49f09155]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231502, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.616 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72505309-d959-4fbc-9cc4-e57ed9d560d6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe26:8872'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398313, 'tstamp': 398313}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231503, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.620 187212 DEBUG nova.compute.manager [req-f721178f-5b07-44d5-8279-66ccdf46626b req-d59bc667-927d-4f54-ab01-7a9457e89e46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-deleted-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.634 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b774b591-fba1-45d0-b8f5-90d6f51dcd14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231504, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.673 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0b202cb9-bfe3-4dd3-a95b-d65bf5dcd84c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.744 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8858b129-f3ec-4d0f-8e41-788a5c95659d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.748 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.749 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.750 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:02 compute-0 NetworkManager[55691]: <info>  [1764936602.7536] manager: (tapfbfed6fc-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Dec 05 12:10:02 compute-0 kernel: tapfbfed6fc-30: entered promiscuous mode
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.757 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:02 compute-0 ovn_controller[95610]: 2025-12-05T12:10:02Z|00709|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.776 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.777 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.779 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a2131f08-9763-49f1-b735-76d4f9f22b6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.780 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:10:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.781 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'env', 'PROCESS_TAG=haproxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.823 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936602.8233302, 54d9605a-998b-4492-afc8-f7a5b0dd4e84 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.824 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] VM Started (Lifecycle Event)
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.850 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.861 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936602.8235443, 54d9605a-998b-4492-afc8-f7a5b0dd4e84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.862 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] VM Paused (Lifecycle Event)
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.883 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.888 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:02 compute-0 nova_compute[187208]: 2025-12-05 12:10:02.914 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:03.016 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:03.017 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:03.018 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:03 compute-0 nova_compute[187208]: 2025-12-05 12:10:03.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:03 compute-0 podman[231543]: 2025-12-05 12:10:03.184249085 +0000 UTC m=+0.049928621 container create 2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 12:10:03 compute-0 systemd[1]: Started libpod-conmon-2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0.scope.
Dec 05 12:10:03 compute-0 podman[231543]: 2025-12-05 12:10:03.159572268 +0000 UTC m=+0.025251824 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:10:03 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:10:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0abe6a70244aa25519de7b565800e0d75c41ac2b2899ca45664096e7b4998d70/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:10:03 compute-0 podman[231543]: 2025-12-05 12:10:03.278350939 +0000 UTC m=+0.144030495 container init 2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:10:03 compute-0 podman[231543]: 2025-12-05 12:10:03.284905356 +0000 UTC m=+0.150584892 container start 2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 12:10:03 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[231559]: [NOTICE]   (231563) : New worker (231565) forked
Dec 05 12:10:03 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[231559]: [NOTICE]   (231563) : Loading success.
Dec 05 12:10:03 compute-0 nova_compute[187208]: 2025-12-05 12:10:03.412 187212 DEBUG nova.network.neutron [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updated VIF entry in instance network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:10:03 compute-0 nova_compute[187208]: 2025-12-05 12:10:03.413 187212 DEBUG nova.network.neutron [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:03 compute-0 nova_compute[187208]: 2025-12-05 12:10:03.453 187212 DEBUG oslo_concurrency.lockutils [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.913 187212 DEBUG nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.913 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Processing event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.915 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.915 187212 DEBUG nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] No waiting events found dispatching network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.915 187212 WARNING nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received unexpected event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e for instance with vm_state building and task_state spawning.
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.915 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.919 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936604.9192653, 54d9605a-998b-4492-afc8-f7a5b0dd4e84 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.919 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] VM Resumed (Lifecycle Event)
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.922 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.927 187212 INFO nova.virt.libvirt.driver [-] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Instance spawned successfully.
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.928 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.964 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.971 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.972 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.972 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.973 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.973 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.974 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:04 compute-0 nova_compute[187208]: 2025-12-05 12:10:04.982 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:05 compute-0 nova_compute[187208]: 2025-12-05 12:10:05.262 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:05 compute-0 nova_compute[187208]: 2025-12-05 12:10:05.283 187212 INFO nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Took 16.76 seconds to spawn the instance on the hypervisor.
Dec 05 12:10:05 compute-0 nova_compute[187208]: 2025-12-05 12:10:05.284 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:05 compute-0 nova_compute[187208]: 2025-12-05 12:10:05.366 187212 INFO nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Took 18.28 seconds to build instance.
Dec 05 12:10:05 compute-0 nova_compute[187208]: 2025-12-05 12:10:05.387 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:05 compute-0 nova_compute[187208]: 2025-12-05 12:10:05.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:06 compute-0 nova_compute[187208]: 2025-12-05 12:10:06.716 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936591.7157092, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:06 compute-0 nova_compute[187208]: 2025-12-05 12:10:06.717 187212 INFO nova.compute.manager [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Stopped (Lifecycle Event)
Dec 05 12:10:07 compute-0 nova_compute[187208]: 2025-12-05 12:10:07.013 187212 DEBUG nova.compute.manager [None req-b6d856f4-2460-466b-bf95-245f12e859c0 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:07 compute-0 podman[231589]: 2025-12-05 12:10:07.213221198 +0000 UTC m=+0.058029553 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:10:07 compute-0 podman[231590]: 2025-12-05 12:10:07.245819541 +0000 UTC m=+0.090688987 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 05 12:10:08 compute-0 nova_compute[187208]: 2025-12-05 12:10:08.094 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:08 compute-0 ovn_controller[95610]: 2025-12-05T12:10:08Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:f0:e8 10.100.0.14
Dec 05 12:10:08 compute-0 ovn_controller[95610]: 2025-12-05T12:10:08Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:f0:e8 10.100.0.14
Dec 05 12:10:09 compute-0 nova_compute[187208]: 2025-12-05 12:10:09.617 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:10:10 compute-0 ovn_controller[95610]: 2025-12-05T12:10:10Z|00710|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:10:10 compute-0 ovn_controller[95610]: 2025-12-05T12:10:10Z|00711|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:10:10 compute-0 nova_compute[187208]: 2025-12-05 12:10:10.388 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:10 compute-0 ovn_controller[95610]: 2025-12-05T12:10:10Z|00712|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:10:10 compute-0 ovn_controller[95610]: 2025-12-05T12:10:10Z|00713|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:10:10 compute-0 nova_compute[187208]: 2025-12-05 12:10:10.765 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:10 compute-0 NetworkManager[55691]: <info>  [1764936610.7661] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Dec 05 12:10:10 compute-0 NetworkManager[55691]: <info>  [1764936610.7676] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Dec 05 12:10:10 compute-0 nova_compute[187208]: 2025-12-05 12:10:10.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:10 compute-0 ovn_controller[95610]: 2025-12-05T12:10:10Z|00714|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:10:10 compute-0 nova_compute[187208]: 2025-12-05 12:10:10.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:10 compute-0 ovn_controller[95610]: 2025-12-05T12:10:10Z|00715|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:10:10 compute-0 nova_compute[187208]: 2025-12-05 12:10:10.809 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:11 compute-0 podman[231637]: 2025-12-05 12:10:11.230952739 +0000 UTC m=+0.073954888 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 12:10:11 compute-0 nova_compute[187208]: 2025-12-05 12:10:11.537 187212 DEBUG nova.compute.manager [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:11 compute-0 nova_compute[187208]: 2025-12-05 12:10:11.537 187212 DEBUG nova.compute.manager [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing instance network info cache due to event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:10:11 compute-0 nova_compute[187208]: 2025-12-05 12:10:11.538 187212 DEBUG oslo_concurrency.lockutils [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:11 compute-0 nova_compute[187208]: 2025-12-05 12:10:11.538 187212 DEBUG oslo_concurrency.lockutils [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:11 compute-0 nova_compute[187208]: 2025-12-05 12:10:11.539 187212 DEBUG nova.network.neutron [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:10:11 compute-0 kernel: tap7370bdd5-dd (unregistering): left promiscuous mode
Dec 05 12:10:11 compute-0 NetworkManager[55691]: <info>  [1764936611.7987] device (tap7370bdd5-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:10:11 compute-0 ovn_controller[95610]: 2025-12-05T12:10:11Z|00716|binding|INFO|Releasing lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef from this chassis (sb_readonly=0)
Dec 05 12:10:11 compute-0 ovn_controller[95610]: 2025-12-05T12:10:11Z|00717|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef down in Southbound
Dec 05 12:10:11 compute-0 nova_compute[187208]: 2025-12-05 12:10:11.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:11 compute-0 ovn_controller[95610]: 2025-12-05T12:10:11Z|00718|binding|INFO|Removing iface tap7370bdd5-dd ovn-installed in OVS
Dec 05 12:10:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.817 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f0:e8 10.100.0.14'], port_security=['fa:16:3e:ee:f0:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '159b5354-c124-484f-a8ec-da1abf719114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7370bdd5-ddf8-40de-9f35-975b8ceab3ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.818 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c unbound from our chassis
Dec 05 12:10:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.819 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:10:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.820 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d871d2d-9bb8-4988-a853-19aa53a16383]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.821 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace which is not needed anymore
Dec 05 12:10:11 compute-0 nova_compute[187208]: 2025-12-05 12:10:11.834 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:11 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d00000049.scope: Deactivated successfully.
Dec 05 12:10:11 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d00000049.scope: Consumed 12.620s CPU time.
Dec 05 12:10:11 compute-0 systemd-machined[153543]: Machine qemu-82-instance-00000049 terminated.
Dec 05 12:10:11 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [NOTICE]   (231317) : haproxy version is 2.8.14-c23fe91
Dec 05 12:10:11 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [NOTICE]   (231317) : path to executable is /usr/sbin/haproxy
Dec 05 12:10:11 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [WARNING]  (231317) : Exiting Master process...
Dec 05 12:10:11 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [ALERT]    (231317) : Current worker (231320) exited with code 143 (Terminated)
Dec 05 12:10:11 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [WARNING]  (231317) : All workers exited. Exiting... (0)
Dec 05 12:10:11 compute-0 systemd[1]: libpod-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e.scope: Deactivated successfully.
Dec 05 12:10:11 compute-0 conmon[231297]: conmon 73d6e318f6f61d863eb4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e.scope/container/memory.events
Dec 05 12:10:11 compute-0 podman[231681]: 2025-12-05 12:10:11.953503706 +0000 UTC m=+0.045385931 container died 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 12:10:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e-userdata-shm.mount: Deactivated successfully.
Dec 05 12:10:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-db0958bae067600c1586deb306b205beef4d4a15d45a054b88ed994a15bf001d-merged.mount: Deactivated successfully.
Dec 05 12:10:12 compute-0 podman[231681]: 2025-12-05 12:10:12.00462799 +0000 UTC m=+0.096510185 container cleanup 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:10:12 compute-0 systemd[1]: libpod-conmon-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e.scope: Deactivated successfully.
Dec 05 12:10:12 compute-0 podman[231710]: 2025-12-05 12:10:12.085466084 +0000 UTC m=+0.058989260 container remove 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 05 12:10:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.098 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf92faa-2035-4ee9-ae13-79398681ee19]: (4, ('Fri Dec  5 12:10:11 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e)\n73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e\nFri Dec  5 12:10:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e)\n73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.100 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[073e7487-2007-4378-abde-401f1ec0740c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.100 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:12 compute-0 kernel: tap7be4540a-00: left promiscuous mode
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.120 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1040e75b-41ce-4f8c-bb86-8fedfffafb9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.136 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c20400fd-7934-4e8b-baf9-0419fa691322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.137 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[839a3dae-5c0f-49e4-87fa-016011b855a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.153 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8805a3f7-3dc2-436f-8d93-19d305799144]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397461, 'reachable_time': 16180, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231742, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.155 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:10:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.155 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[40daec45-7d32-430a-97fc-dd5463050dc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:12 compute-0 systemd[1]: run-netns-ovnmeta\x2d7be4540a\x2d0e0d\x2d45dd\x2d9ed3\x2d2c2701ae3e2c.mount: Deactivated successfully.
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.586 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.633 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance shutdown successfully after 13 seconds.
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.639 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance destroyed successfully.
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.644 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance destroyed successfully.
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.644 187212 DEBUG nova.virt.libvirt.vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-me
mber'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:58Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.645 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.646 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.646 187212 DEBUG os_vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.648 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.648 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7370bdd5-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.695 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.698 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.700 187212 INFO os_vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd')
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.701 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deleting instance files /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114_del
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.702 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deletion of /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114_del complete
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.873 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.874 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating image(s)
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.874 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.875 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.875 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.888 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.950 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.952 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.952 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:12 compute-0 nova_compute[187208]: 2025-12-05 12:10:12.963 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.018 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.020 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.096 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.110 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk 1073741824" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.112 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.113 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.193 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.197 187212 DEBUG nova.virt.disk.api [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Checking if we can resize image /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.198 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.262 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.264 187212 DEBUG nova.virt.disk.api [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Cannot resize image /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.265 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.265 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Ensure instance console log exists: /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.266 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.266 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.267 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.269 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start _get_guest_xml network_info=[{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.275 187212 WARNING nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.282 187212 DEBUG nova.virt.libvirt.host [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.283 187212 DEBUG nova.virt.libvirt.host [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.287 187212 DEBUG nova.virt.libvirt.host [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.287 187212 DEBUG nova.virt.libvirt.host [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.288 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.288 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.289 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.289 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.289 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.289 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.290 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.290 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.290 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.291 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.291 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.291 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.292 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'vcpu_model' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.319 187212 DEBUG nova.virt.libvirt.vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-
ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:12Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.320 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.321 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.322 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <uuid>159b5354-c124-484f-a8ec-da1abf719114</uuid>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <name>instance-00000049</name>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-2012489303</nova:name>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:10:13</nova:creationTime>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:10:13 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:10:13 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:10:13 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:10:13 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:10:13 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:10:13 compute-0 nova_compute[187208]:         <nova:user uuid="ef254bb2df0442c6bcadfb3a6861c0e9">tempest-ServerDiskConfigTestJSON-1245488084-project-member</nova:user>
Dec 05 12:10:13 compute-0 nova_compute[187208]:         <nova:project uuid="e836357870d746e49bc783da7cd3accd">tempest-ServerDiskConfigTestJSON-1245488084</nova:project>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:10:13 compute-0 nova_compute[187208]:         <nova:port uuid="7370bdd5-ddf8-40de-9f35-975b8ceab3ef">
Dec 05 12:10:13 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <system>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <entry name="serial">159b5354-c124-484f-a8ec-da1abf719114</entry>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <entry name="uuid">159b5354-c124-484f-a8ec-da1abf719114</entry>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     </system>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <os>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   </os>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <features>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   </features>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:ee:f0:e8"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <target dev="tap7370bdd5-dd"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/console.log" append="off"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <video>
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     </video>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:10:13 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:10:13 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:10:13 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:10:13 compute-0 nova_compute[187208]: </domain>
Dec 05 12:10:13 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.328 187212 DEBUG nova.virt.libvirt.vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-
ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:12Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.329 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.329 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.330 187212 DEBUG os_vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.331 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.332 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.335 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.335 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7370bdd5-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.336 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7370bdd5-dd, col_values=(('external_ids', {'iface-id': '7370bdd5-ddf8-40de-9f35-975b8ceab3ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:f0:e8', 'vm-uuid': '159b5354-c124-484f-a8ec-da1abf719114'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.337 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:13 compute-0 NetworkManager[55691]: <info>  [1764936613.3385] manager: (tap7370bdd5-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.340 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.345 187212 INFO os_vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd')
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.375 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936598.3729482, dbbad270-1e3c-41e1-9173-c1b9df0ab2dd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.376 187212 INFO nova.compute.manager [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] VM Stopped (Lifecycle Event)
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.399 187212 DEBUG nova.compute.manager [None req-b82bb1ab-4def-4890-b17e-e3bf1ca3be4e - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.581 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.596 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.597 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No VIF found with MAC fa:16:3e:ee:f0:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.597 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Using config drive
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.627 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'ec2_ids' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.675 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'keypairs' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.929 187212 DEBUG nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-unplugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.930 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.931 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.931 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.932 187212 DEBUG nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-unplugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.932 187212 WARNING nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-unplugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state rebuild_spawning.
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.932 187212 DEBUG nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.933 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.933 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.933 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.934 187212 DEBUG nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:13 compute-0 nova_compute[187208]: 2025-12-05 12:10:13.934 187212 WARNING nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state rebuild_spawning.
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.132 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating config drive at /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.140 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyqgyouyp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.274 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyqgyouyp" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:14 compute-0 kernel: tap7370bdd5-dd: entered promiscuous mode
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.360 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:14 compute-0 ovn_controller[95610]: 2025-12-05T12:10:14Z|00719|binding|INFO|Claiming lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef for this chassis.
Dec 05 12:10:14 compute-0 ovn_controller[95610]: 2025-12-05T12:10:14Z|00720|binding|INFO|7370bdd5-ddf8-40de-9f35-975b8ceab3ef: Claiming fa:16:3e:ee:f0:e8 10.100.0.14
Dec 05 12:10:14 compute-0 systemd-udevd[231663]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:10:14 compute-0 NetworkManager[55691]: <info>  [1764936614.3686] manager: (tap7370bdd5-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/279)
Dec 05 12:10:14 compute-0 ovn_controller[95610]: 2025-12-05T12:10:14Z|00721|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef up in Southbound
Dec 05 12:10:14 compute-0 ovn_controller[95610]: 2025-12-05T12:10:14Z|00722|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef ovn-installed in OVS
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.374 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f0:e8 10.100.0.14'], port_security=['fa:16:3e:ee:f0:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '159b5354-c124-484f-a8ec-da1abf719114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7370bdd5-ddf8-40de-9f35-975b8ceab3ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.375 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c bound to our chassis
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.377 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.378 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:14 compute-0 NetworkManager[55691]: <info>  [1764936614.3814] device (tap7370bdd5-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:10:14 compute-0 NetworkManager[55691]: <info>  [1764936614.3828] device (tap7370bdd5-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.386 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.389 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ce0fc5-e216-4c46-87bb-105853d42c46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.390 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7be4540a-01 in ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.392 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7be4540a-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.392 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7abb17c1-76f1-4795-9b8a-2bab28fcddd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.394 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.393 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0a3705-652e-42ea-b22a-f24c94de52ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.409 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb8984d-9981-4223-bf5e-d381f5ba0ceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 systemd-machined[153543]: New machine qemu-84-instance-00000049.
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.435 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[787a614b-6c4f-4509-bec3-bd94ea35d675]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 systemd[1]: Started Virtual Machine qemu-84-instance-00000049.
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.472 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[066ce48d-6c11-4fff-92f7-6cab2e11e650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 NetworkManager[55691]: <info>  [1764936614.4804] manager: (tap7be4540a-00): new Veth device (/org/freedesktop/NetworkManager/Devices/280)
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.481 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b1af3abc-d7fd-4ece-a18f-a55bf604fdad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.516 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[aa634e7f-921e-4125-8227-5e0f8343b25c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.520 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6bed1c8b-189d-448b-a7e6-e562019bf80a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 NetworkManager[55691]: <info>  [1764936614.5457] device (tap7be4540a-00): carrier: link connected
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.550 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f2494267-3f0c-4594-b025-93e193f94e7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.569 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1922ccf5-43ed-4e4a-a1a6-5a5a9de53681]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399510, 'reachable_time': 39809, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231810, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.588 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e16e0830-274c-4e4d-b287-00869066bb71]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:4893'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399510, 'tstamp': 399510}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231811, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.605 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[75c55457-84f9-4c05-841b-71dfa13c6eb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399510, 'reachable_time': 39809, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231812, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.641 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[68198b47-0448-4dbc-9286-63f85cfc7527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.698 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce09e67-bf41-420f-b698-b51a7bafafef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.699 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.700 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.700 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7be4540a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.702 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:14 compute-0 kernel: tap7be4540a-00: entered promiscuous mode
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:14 compute-0 NetworkManager[55691]: <info>  [1764936614.7047] manager: (tap7be4540a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.704 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7be4540a-00, col_values=(('external_ids', {'iface-id': '4dcf8e96-bf04-4914-959a-aad071dfa454'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.705 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:14 compute-0 ovn_controller[95610]: 2025-12-05T12:10:14Z|00723|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.720 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.721 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.721 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4255ba7b-a0e4-4a7e-a88a-c3a760715448]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.722 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:10:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.723 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'env', 'PROCESS_TAG=haproxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.898 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 159b5354-c124-484f-a8ec-da1abf719114 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.899 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936614.8985865, 159b5354-c124-484f-a8ec-da1abf719114 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.899 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Resumed (Lifecycle Event)
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.902 187212 DEBUG nova.compute.manager [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.902 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.905 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance spawned successfully.
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.906 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.955 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.959 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.959 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.959 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.960 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.960 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.960 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:14 compute-0 nova_compute[187208]: 2025-12-05 12:10:14.965 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.059 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.061 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936614.9008446, 159b5354-c124-484f-a8ec-da1abf719114 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.062 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Started (Lifecycle Event)
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.103 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.108 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.122 187212 DEBUG nova.compute.manager [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.135 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:10:15 compute-0 podman[231852]: 2025-12-05 12:10:15.107812546 +0000 UTC m=+0.029845945 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.249 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.250 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.250 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:10:15 compute-0 podman[231852]: 2025-12-05 12:10:15.254455605 +0000 UTC m=+0.176488994 container create 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 12:10:15 compute-0 systemd[1]: Started libpod-conmon-13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9.scope.
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.314 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:15 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:10:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf1f79bda5e68e5137db659fc06127f1818d3fbe95292e9ac02d6cba31e7d86e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:10:15 compute-0 podman[231852]: 2025-12-05 12:10:15.427241792 +0000 UTC m=+0.349275201 container init 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:10:15 compute-0 podman[231852]: 2025-12-05 12:10:15.433354137 +0000 UTC m=+0.355387516 container start 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:10:15 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [NOTICE]   (231872) : New worker (231879) forked
Dec 05 12:10:15 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [NOTICE]   (231872) : Loading success.
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.524 187212 DEBUG nova.network.neutron [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updated VIF entry in instance network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.525 187212 DEBUG nova.network.neutron [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:15 compute-0 nova_compute[187208]: 2025-12-05 12:10:15.637 187212 DEBUG oslo_concurrency.lockutils [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:16 compute-0 ovn_controller[95610]: 2025-12-05T12:10:16Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bd:e5:94 10.100.0.6
Dec 05 12:10:16 compute-0 ovn_controller[95610]: 2025-12-05T12:10:16Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bd:e5:94 10.100.0.6
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.842 187212 DEBUG nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.842 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 DEBUG nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 WARNING nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state None.
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 DEBUG nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 DEBUG nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:16 compute-0 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 WARNING nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state None.
Dec 05 12:10:17 compute-0 nova_compute[187208]: 2025-12-05 12:10:17.247 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:17 compute-0 nova_compute[187208]: 2025-12-05 12:10:17.742 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:17 compute-0 nova_compute[187208]: 2025-12-05 12:10:17.742 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:17 compute-0 nova_compute[187208]: 2025-12-05 12:10:17.878 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:10:17 compute-0 nova_compute[187208]: 2025-12-05 12:10:17.998 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:17.999 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.006 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.006 187212 INFO nova.compute.claims [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.099 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.130 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "4053596b-9c68-4044-bb28-5f57016c8e62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.131 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.156 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.254 187212 DEBUG nova.compute.provider_tree [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.280 187212 DEBUG nova.scheduler.client.report [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.286 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.331 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.332 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.335 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.338 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.342 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.342 187212 INFO nova.compute.claims [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.409 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.410 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.440 187212 INFO nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.553 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.660 187212 DEBUG nova.compute.provider_tree [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.670 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.671 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.672 187212 INFO nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Creating image(s)
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.672 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.672 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.673 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.689 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.710 187212 DEBUG nova.scheduler.client.report [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.748 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.750 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.751 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.763 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.808 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.809 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.817 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.818 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.874 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.875 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.875 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.902 187212 DEBUG nova.policy [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.930 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.931 187212 DEBUG nova.virt.disk.api [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Checking if we can resize image /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.931 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.992 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.993 187212 DEBUG nova.virt.disk.api [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Cannot resize image /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:10:18 compute-0 nova_compute[187208]: 2025-12-05 12:10:18.993 187212 DEBUG nova.objects.instance [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'migration_context' on Instance uuid ecc25cb4-5b3a-43f7-949d-ca9a1a19056a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.027 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.028 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Ensure instance console log exists: /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.028 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.029 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.029 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:19 compute-0 podman[231908]: 2025-12-05 12:10:19.210781418 +0000 UTC m=+0.056527129 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.217 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.235 187212 INFO nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.263 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.447 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.449 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.449 187212 INFO nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Creating image(s)
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.450 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.450 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.451 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.463 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.517 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.518 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.519 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.531 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.587 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.589 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.625 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.627 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.627 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.691 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.692 187212 DEBUG nova.virt.disk.api [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Checking if we can resize image /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.692 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.760 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.761 187212 DEBUG nova.virt.disk.api [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Cannot resize image /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.762 187212 DEBUG nova.objects.instance [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lazy-loading 'migration_context' on Instance uuid 4053596b-9c68-4044-bb28-5f57016c8e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.832 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.833 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Ensure instance console log exists: /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.834 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.835 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.835 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.838 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.845 187212 WARNING nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.852 187212 DEBUG nova.virt.libvirt.host [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.853 187212 DEBUG nova.virt.libvirt.host [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.857 187212 DEBUG nova.virt.libvirt.host [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.857 187212 DEBUG nova.virt.libvirt.host [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.858 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.858 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.864 187212 DEBUG nova.objects.instance [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4053596b-9c68-4044-bb28-5f57016c8e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.890 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.891 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.891 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.893 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.893 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.895 187212 INFO nova.compute.manager [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Terminating instance
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.896 187212 DEBUG nova.compute.manager [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:10:19 compute-0 kernel: tap7370bdd5-dd (unregistering): left promiscuous mode
Dec 05 12:10:19 compute-0 NetworkManager[55691]: <info>  [1764936619.9214] device (tap7370bdd5-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.922 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <uuid>4053596b-9c68-4044-bb28-5f57016c8e62</uuid>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <name>instance-0000004c</name>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersAaction247Test-server-1533885512</nova:name>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:10:19</nova:creationTime>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:10:19 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:10:19 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:10:19 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:10:19 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:10:19 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:10:19 compute-0 nova_compute[187208]:         <nova:user uuid="b63e7b2645a24842a40a218743fdda6f">tempest-ServersAaction247Test-924836898-project-member</nova:user>
Dec 05 12:10:19 compute-0 nova_compute[187208]:         <nova:project uuid="b6854395cda4464cb303b7eb51b4e4f1">tempest-ServersAaction247Test-924836898</nova:project>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <system>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <entry name="serial">4053596b-9c68-4044-bb28-5f57016c8e62</entry>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <entry name="uuid">4053596b-9c68-4044-bb28-5f57016c8e62</entry>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     </system>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <os>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   </os>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <features>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   </features>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.config"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/console.log" append="off"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <video>
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     </video>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:10:19 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:10:19 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:10:19 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:10:19 compute-0 nova_compute[187208]: </domain>
Dec 05 12:10:19 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:10:19 compute-0 ovn_controller[95610]: 2025-12-05T12:10:19Z|00724|binding|INFO|Releasing lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef from this chassis (sb_readonly=0)
Dec 05 12:10:19 compute-0 ovn_controller[95610]: 2025-12-05T12:10:19Z|00725|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef down in Southbound
Dec 05 12:10:19 compute-0 ovn_controller[95610]: 2025-12-05T12:10:19Z|00726|binding|INFO|Removing iface tap7370bdd5-dd ovn-installed in OVS
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.927 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.936 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f0:e8 10.100.0.14'], port_security=['fa:16:3e:ee:f0:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '159b5354-c124-484f-a8ec-da1abf719114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7370bdd5-ddf8-40de-9f35-975b8ceab3ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.938 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c unbound from our chassis
Dec 05 12:10:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.940 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:10:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.941 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e681993d-4b5b-4812-bc3f-9e275a9d91bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.941 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace which is not needed anymore
Dec 05 12:10:19 compute-0 nova_compute[187208]: 2025-12-05 12:10:19.942 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:19 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d00000049.scope: Deactivated successfully.
Dec 05 12:10:19 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d00000049.scope: Consumed 5.521s CPU time.
Dec 05 12:10:19 compute-0 systemd-machined[153543]: Machine qemu-84-instance-00000049 terminated.
Dec 05 12:10:20 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [NOTICE]   (231872) : haproxy version is 2.8.14-c23fe91
Dec 05 12:10:20 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [NOTICE]   (231872) : path to executable is /usr/sbin/haproxy
Dec 05 12:10:20 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [WARNING]  (231872) : Exiting Master process...
Dec 05 12:10:20 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [ALERT]    (231872) : Current worker (231879) exited with code 143 (Terminated)
Dec 05 12:10:20 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [WARNING]  (231872) : All workers exited. Exiting... (0)
Dec 05 12:10:20 compute-0 systemd[1]: libpod-13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9.scope: Deactivated successfully.
Dec 05 12:10:20 compute-0 podman[231973]: 2025-12-05 12:10:20.071057869 +0000 UTC m=+0.047599564 container died 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 05 12:10:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9-userdata-shm.mount: Deactivated successfully.
Dec 05 12:10:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-cf1f79bda5e68e5137db659fc06127f1818d3fbe95292e9ac02d6cba31e7d86e-merged.mount: Deactivated successfully.
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.152 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance destroyed successfully.
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.153 187212 DEBUG nova.objects.instance [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'resources' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:20 compute-0 podman[231973]: 2025-12-05 12:10:20.238579005 +0000 UTC m=+0.215120700 container cleanup 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:10:20 compute-0 systemd[1]: libpod-conmon-13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9.scope: Deactivated successfully.
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.275 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.277 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.278 187212 INFO nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Using config drive
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.280 187212 DEBUG nova.virt.libvirt.vif [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:15Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.281 187212 DEBUG nova.network.os_vif_util [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.282 187212 DEBUG nova.network.os_vif_util [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.282 187212 DEBUG os_vif [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.283 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.284 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7370bdd5-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.286 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.288 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.290 187212 INFO os_vif [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd')
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.291 187212 INFO nova.virt.libvirt.driver [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deleting instance files /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114_del
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.291 187212 INFO nova.virt.libvirt.driver [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deletion of /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114_del complete
Dec 05 12:10:20 compute-0 podman[232020]: 2025-12-05 12:10:20.320776649 +0000 UTC m=+0.058107195 container remove 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:10:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.327 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[41378f74-c5a7-4b66-87ef-1da9a931676a]: (4, ('Fri Dec  5 12:10:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9)\n13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9\nFri Dec  5 12:10:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9)\n13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.328 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd59179-d43e-45f4-ac17-91617f2d5be1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.329 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:20 compute-0 kernel: tap7be4540a-00: left promiscuous mode
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.344 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.347 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a2c17c87-8e9a-432a-a038-4c0341f3ed9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.371 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d3eebf74-5cdc-4df1-ac4c-ebf55a2e1ebe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.373 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7d675916-6596-4142-9ddc-95862886c549]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.391 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8129ba88-6313-4765-b2c7-5600d9f6cb37]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399503, 'reachable_time': 29416, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232033, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.394 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:10:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.394 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[6d160eae-f368-4ee2-a6f6-b129be714556]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:20 compute-0 systemd[1]: run-netns-ovnmeta\x2d7be4540a\x2d0e0d\x2d45dd\x2d9ed3\x2d2c2701ae3e2c.mount: Deactivated successfully.
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.472 187212 INFO nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Creating config drive at /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.config
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.478 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3wxhtg3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.514 187212 INFO nova.compute.manager [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Took 0.62 seconds to destroy the instance on the hypervisor.
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.515 187212 DEBUG oslo.service.loopingcall [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.516 187212 DEBUG nova.compute.manager [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.516 187212 DEBUG nova.network.neutron [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:10:20 compute-0 nova_compute[187208]: 2025-12-05 12:10:20.609 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3wxhtg3" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:20 compute-0 systemd-machined[153543]: New machine qemu-85-instance-0000004c.
Dec 05 12:10:20 compute-0 systemd[1]: Started Virtual Machine qemu-85-instance-0000004c.
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.115 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936621.1148732, 4053596b-9c68-4044-bb28-5f57016c8e62 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.115 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] VM Resumed (Lifecycle Event)
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.117 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.118 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.122 187212 INFO nova.virt.libvirt.driver [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance spawned successfully.
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.122 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.135 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.141 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.144 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.144 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.145 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.145 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.146 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.146 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.188 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.188 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936621.1160429, 4053596b-9c68-4044-bb28-5f57016c8e62 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.188 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] VM Started (Lifecycle Event)
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.223 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.226 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.231 187212 INFO nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Took 1.78 seconds to spawn the instance on the hypervisor.
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.231 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.254 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.280 187212 INFO nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Took 3.01 seconds to build instance.
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.299 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.411 187212 DEBUG nova.network.neutron [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.432 187212 INFO nova.compute.manager [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Took 0.92 seconds to deallocate network for instance.
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.484 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.484 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.598 187212 DEBUG nova.compute.provider_tree [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.621 187212 DEBUG nova.scheduler.client.report [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.649 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.683 187212 INFO nova.scheduler.client.report [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Deleted allocations for instance 159b5354-c124-484f-a8ec-da1abf719114
Dec 05 12:10:21 compute-0 nova_compute[187208]: 2025-12-05 12:10:21.749 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.858s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:22 compute-0 nova_compute[187208]: 2025-12-05 12:10:22.743 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Successfully created port: 88e41011-3ebc-4215-ad20-58a49d31a6d4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:10:22 compute-0 nova_compute[187208]: 2025-12-05 12:10:22.834 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:23 compute-0 nova_compute[187208]: 2025-12-05 12:10:23.104 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:23 compute-0 nova_compute[187208]: 2025-12-05 12:10:23.525 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:23 compute-0 nova_compute[187208]: 2025-12-05 12:10:23.612 187212 DEBUG nova.compute.manager [req-fdfa2c01-e50d-4f1d-830d-2946122a78dd req-3d143c0c-c39f-4999-8f91-3139f6e0c2ee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-deleted-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.111 187212 DEBUG nova.compute.manager [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.150 187212 INFO nova.compute.manager [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] instance snapshotting
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.151 187212 DEBUG nova.objects.instance [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lazy-loading 'flavor' on Instance uuid 4053596b-9c68-4044-bb28-5f57016c8e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.512 187212 INFO nova.virt.libvirt.driver [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Beginning live snapshot process
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.609 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.609 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.624 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.663 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "4053596b-9c68-4044-bb28-5f57016c8e62" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.664 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.664 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "4053596b-9c68-4044-bb28-5f57016c8e62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.664 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.664 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.665 187212 INFO nova.compute.manager [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Terminating instance
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.666 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "refresh_cache-4053596b-9c68-4044-bb28-5f57016c8e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.666 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquired lock "refresh_cache-4053596b-9c68-4044-bb28-5f57016c8e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.666 187212 DEBUG nova.network.neutron [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.687 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.688 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.696 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.696 187212 INFO nova.compute.claims [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.855 187212 DEBUG nova.compute.provider_tree [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.873 187212 DEBUG nova.scheduler.client.report [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.899 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.900 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.953 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.953 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.974 187212 INFO nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:10:24 compute-0 nova_compute[187208]: 2025-12-05 12:10:24.990 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.094 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.095 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.095 187212 INFO nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Creating image(s)
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.096 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.096 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.097 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.108 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.198 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.200 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.200 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.215 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:25 compute-0 podman[232062]: 2025-12-05 12:10:25.236148231 +0000 UTC m=+0.080603959 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.236 187212 DEBUG nova.network.neutron [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.281 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.282 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.300 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.335 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.337 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.337 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.392 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.394 187212 DEBUG nova.virt.disk.api [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Checking if we can resize image /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.394 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.415 187212 DEBUG nova.policy [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef254bb2df0442c6bcadfb3a6861c0e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e836357870d746e49bc783da7cd3accd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.457 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.458 187212 DEBUG nova.virt.disk.api [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Cannot resize image /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.459 187212 DEBUG nova.objects.instance [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'migration_context' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.477 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.478 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Ensure instance console log exists: /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.478 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.478 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.479 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.638 187212 DEBUG nova.compute.manager [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.694 187212 DEBUG nova.network.neutron [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.724 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Releasing lock "refresh_cache-4053596b-9c68-4044-bb28-5f57016c8e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.725 187212 DEBUG nova.compute.manager [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:10:25 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Dec 05 12:10:25 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d0000004c.scope: Consumed 5.116s CPU time.
Dec 05 12:10:25 compute-0 systemd-machined[153543]: Machine qemu-85-instance-0000004c terminated.
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.987 187212 INFO nova.virt.libvirt.driver [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance destroyed successfully.
Dec 05 12:10:25 compute-0 nova_compute[187208]: 2025-12-05 12:10:25.988 187212 DEBUG nova.objects.instance [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lazy-loading 'resources' on Instance uuid 4053596b-9c68-4044-bb28-5f57016c8e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.017 187212 INFO nova.virt.libvirt.driver [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Deleting instance files /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62_del
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.018 187212 INFO nova.virt.libvirt.driver [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Deletion of /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62_del complete
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.086 187212 INFO nova.compute.manager [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.086 187212 DEBUG oslo.service.loopingcall [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.087 187212 DEBUG nova.compute.manager [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.087 187212 DEBUG nova.network.neutron [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.186 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "e689e2f0-16e9-402a-986e-a769d72fa0bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.187 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.205 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.275 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.275 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.276 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Successfully updated port: 88e41011-3ebc-4215-ad20-58a49d31a6d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.285 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.285 187212 INFO nova.compute.claims [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.291 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.292 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.292 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.321 187212 DEBUG nova.compute.manager [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.355 187212 DEBUG nova.network.neutron [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.375 187212 DEBUG nova.network.neutron [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.389 187212 INFO nova.compute.manager [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Took 0.30 seconds to deallocate network for instance.
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.437 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.456 187212 DEBUG nova.compute.manager [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.457 187212 DEBUG nova.compute.manager [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing instance network info cache due to event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.457 187212 DEBUG oslo_concurrency.lockutils [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.518 187212 DEBUG nova.compute.provider_tree [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.534 187212 DEBUG nova.scheduler.client.report [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.555 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.556 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.559 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.612 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.612 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.634 187212 INFO nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.666 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.696 187212 DEBUG nova.compute.provider_tree [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.740 187212 DEBUG nova.scheduler.client.report [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.774 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.925 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.927 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.928 187212 INFO nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Creating image(s)
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.929 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "/var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.929 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "/var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.930 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "/var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:26 compute-0 nova_compute[187208]: 2025-12-05 12:10:26.956 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.053 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.055 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.056 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.081 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.168 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.170 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.441 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk 1073741824" returned: 0 in 0.271s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.442 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.443 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.505 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.507 187212 DEBUG nova.virt.disk.api [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Checking if we can resize image /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.508 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.556 187212 INFO nova.scheduler.client.report [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Deleted allocations for instance 4053596b-9c68-4044-bb28-5f57016c8e62
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.572 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.573 187212 DEBUG nova.virt.disk.api [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Cannot resize image /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.573 187212 DEBUG nova.objects.instance [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lazy-loading 'migration_context' on Instance uuid e689e2f0-16e9-402a-986e-a769d72fa0bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.580 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.620 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.621 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Ensure instance console log exists: /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.621 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.622 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.622 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.758 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:27 compute-0 nova_compute[187208]: 2025-12-05 12:10:27.987 187212 DEBUG nova.policy [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b8b32a7fde5424795b54914a14028b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e3f3e747de24befad6008f67eb551ae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:10:28 compute-0 nova_compute[187208]: 2025-12-05 12:10:28.104 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:28 compute-0 nova_compute[187208]: 2025-12-05 12:10:28.758 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:28 compute-0 nova_compute[187208]: 2025-12-05 12:10:28.776 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Successfully created port: 8c343187-712d-4aee-9c47-18497ec1042e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.074 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.097 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.097 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Instance network_info: |[{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.097 187212 DEBUG oslo_concurrency.lockutils [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.098 187212 DEBUG nova.network.neutron [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.101 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Start _get_guest_xml network_info=[{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.106 187212 WARNING nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.110 187212 DEBUG nova.virt.libvirt.host [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.110 187212 DEBUG nova.virt.libvirt.host [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.117 187212 DEBUG nova.virt.libvirt.host [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.117 187212 DEBUG nova.virt.libvirt.host [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.118 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.118 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.118 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.119 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.119 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.119 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.119 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.120 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.120 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.120 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.120 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.121 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.124 187212 DEBUG nova.virt.libvirt.vif [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.125 187212 DEBUG nova.network.os_vif_util [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.126 187212 DEBUG nova.network.os_vif_util [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.127 187212 DEBUG nova.objects.instance [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_devices' on Instance uuid ecc25cb4-5b3a-43f7-949d-ca9a1a19056a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.192 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <uuid>ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</uuid>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <name>instance-0000004b</name>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <nova:name>tempest-tempest.common.compute-instance-914539058</nova:name>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:10:30</nova:creationTime>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:10:30 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:10:30 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:10:30 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:10:30 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:10:30 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:10:30 compute-0 nova_compute[187208]:         <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:10:30 compute-0 nova_compute[187208]:         <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:10:30 compute-0 nova_compute[187208]:         <nova:port uuid="88e41011-3ebc-4215-ad20-58a49d31a6d4">
Dec 05 12:10:30 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <system>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <entry name="serial">ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</entry>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <entry name="uuid">ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</entry>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     </system>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <os>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   </os>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <features>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   </features>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:a2:40:d1"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <target dev="tap88e41011-3e"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/console.log" append="off"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <video>
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     </video>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:10:30 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:10:30 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:10:30 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:10:30 compute-0 nova_compute[187208]: </domain>
Dec 05 12:10:30 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.194 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Preparing to wait for external event network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.195 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.195 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.195 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.196 187212 DEBUG nova.virt.libvirt.vif [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.196 187212 DEBUG nova.network.os_vif_util [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.197 187212 DEBUG nova.network.os_vif_util [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.197 187212 DEBUG os_vif [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.198 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.198 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.199 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.203 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88e41011-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.204 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap88e41011-3e, col_values=(('external_ids', {'iface-id': '88e41011-3ebc-4215-ad20-58a49d31a6d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:40:d1', 'vm-uuid': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.205 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:30 compute-0 NetworkManager[55691]: <info>  [1764936630.2064] manager: (tap88e41011-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.208 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.216 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.217 187212 INFO os_vif [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e')
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.279 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.280 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.280 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:a2:40:d1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.281 187212 INFO nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Using config drive
Dec 05 12:10:30 compute-0 nova_compute[187208]: 2025-12-05 12:10:30.964 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Successfully created port: 10dc6775-d9c9-40ca-bd05-41c56cffc744 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:10:31 compute-0 nova_compute[187208]: 2025-12-05 12:10:31.663 187212 INFO nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Creating config drive at /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config
Dec 05 12:10:31 compute-0 nova_compute[187208]: 2025-12-05 12:10:31.671 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl7xy1vkq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:31 compute-0 nova_compute[187208]: 2025-12-05 12:10:31.805 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl7xy1vkq" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:31 compute-0 nova_compute[187208]: 2025-12-05 12:10:31.806 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:31 compute-0 nova_compute[187208]: 2025-12-05 12:10:31.807 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:31 compute-0 kernel: tap88e41011-3e: entered promiscuous mode
Dec 05 12:10:31 compute-0 NetworkManager[55691]: <info>  [1764936631.8747] manager: (tap88e41011-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Dec 05 12:10:31 compute-0 nova_compute[187208]: 2025-12-05 12:10:31.874 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:31 compute-0 ovn_controller[95610]: 2025-12-05T12:10:31Z|00727|binding|INFO|Claiming lport 88e41011-3ebc-4215-ad20-58a49d31a6d4 for this chassis.
Dec 05 12:10:31 compute-0 ovn_controller[95610]: 2025-12-05T12:10:31Z|00728|binding|INFO|88e41011-3ebc-4215-ad20-58a49d31a6d4: Claiming fa:16:3e:a2:40:d1 10.100.0.8
Dec 05 12:10:31 compute-0 nova_compute[187208]: 2025-12-05 12:10:31.888 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:31 compute-0 ovn_controller[95610]: 2025-12-05T12:10:31Z|00729|binding|INFO|Setting lport 88e41011-3ebc-4215-ad20-58a49d31a6d4 ovn-installed in OVS
Dec 05 12:10:31 compute-0 nova_compute[187208]: 2025-12-05 12:10:31.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:31 compute-0 systemd-udevd[232140]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:10:31 compute-0 systemd-machined[153543]: New machine qemu-86-instance-0000004b.
Dec 05 12:10:31 compute-0 NetworkManager[55691]: <info>  [1764936631.9234] device (tap88e41011-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:10:31 compute-0 NetworkManager[55691]: <info>  [1764936631.9239] device (tap88e41011-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:10:31 compute-0 nova_compute[187208]: 2025-12-05 12:10:31.925 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:10:31 compute-0 systemd[1]: Started Virtual Machine qemu-86-instance-0000004b.
Dec 05 12:10:31 compute-0 ovn_controller[95610]: 2025-12-05T12:10:31Z|00730|binding|INFO|Setting lport 88e41011-3ebc-4215-ad20-58a49d31a6d4 up in Southbound
Dec 05 12:10:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:31.972 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:40:d1 10.100.0.8'], port_security=['fa:16:3e:a2:40:d1 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cbdd2780-9e2b-4e10-8d0a-98de936cf6ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=88e41011-3ebc-4215-ad20-58a49d31a6d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:31.973 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 88e41011-3ebc-4215-ad20-58a49d31a6d4 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:10:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:31.976 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:10:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:31.996 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aa6ac8ca-bed6-4e05-a210-486c26fa2f34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.045 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c11f060c-ae5e-464c-a0ce-ec6c36694023]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.049 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8482a5-9ca7-4c29-acdb-0ee87c314caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.090 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5b0d6b-6549-48b9-94d8-d95c64120559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.111 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84afbee7-499f-47dd-9020-7d22fbb8bbb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232155, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.131 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb50bd2-16a7-4e91-9ff7-967d06bebc1b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398326, 'tstamp': 398326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232156, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398330, 'tstamp': 398330}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232156, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.134 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.137 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.137 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.137 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.137 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.138 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.206 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.207 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.221 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.221 187212 INFO nova.compute.claims [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.397 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936632.3973331, ecc25cb4-5b3a-43f7-949d-ca9a1a19056a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.398 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] VM Started (Lifecycle Event)
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.421 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.426 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936632.3975806, ecc25cb4-5b3a-43f7-949d-ca9a1a19056a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.426 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] VM Paused (Lifecycle Event)
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.672 187212 DEBUG nova.compute.provider_tree [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.770 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.771 187212 DEBUG nova.scheduler.client.report [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.779 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.869 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.958 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:32 compute-0 nova_compute[187208]: 2025-12-05 12:10:32.959 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.094 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.095 187212 DEBUG nova.network.neutron [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.107 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.118 187212 INFO nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.141 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:10:33 compute-0 podman[232165]: 2025-12-05 12:10:33.218853843 +0000 UTC m=+0.063469379 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.228 187212 DEBUG nova.network.neutron [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updated VIF entry in instance network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.228 187212 DEBUG nova.network.neutron [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:33 compute-0 podman[232164]: 2025-12-05 12:10:33.254180724 +0000 UTC m=+0.100643772 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.506 187212 DEBUG oslo_concurrency.lockutils [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.627 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.629 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.630 187212 INFO nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Creating image(s)
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.630 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "/var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.630 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.631 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.646 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.669 187212 DEBUG nova.policy [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.708 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.709 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.710 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.722 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.803 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.805 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.946 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk 1073741824" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.947 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:33 compute-0 nova_compute[187208]: 2025-12-05 12:10:33.948 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.007 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.008 187212 DEBUG nova.virt.disk.api [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Checking if we can resize image /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.009 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.064 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.065 187212 DEBUG nova.virt.disk.api [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Cannot resize image /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.066 187212 DEBUG nova.objects.instance [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'migration_context' on Instance uuid 30cb83d4-3a34-4420-bc83-099b266da48c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.085 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.085 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Ensure instance console log exists: /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.086 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.086 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.086 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.328 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Successfully updated port: 8c343187-712d-4aee-9c47-18497ec1042e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.619 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.619 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquired lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.620 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.624 187212 DEBUG nova.compute.manager [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-changed-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.624 187212 DEBUG nova.compute.manager [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Refreshing instance network info cache due to event network-changed-8c343187-712d-4aee-9c47-18497ec1042e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:10:34 compute-0 nova_compute[187208]: 2025-12-05 12:10:34.625 187212 DEBUG oslo_concurrency.lockutils [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.072 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.151 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936620.150944, 159b5354-c124-484f-a8ec-da1abf719114 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.152 187212 INFO nova.compute.manager [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Stopped (Lifecycle Event)
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.173 187212 DEBUG nova.compute.manager [None req-7daee835-e878-462a-bd9a-eaed73c7a230 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.206 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.363 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'name': 'tempest-tempest.common.compute-instance-569275018', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98681240c47b41cba28d91e1c11fd71f', 'user_id': '242b773b0af24caf814e2a84178332d5', 'hostId': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.366 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'name': 'tempest-tempest.common.compute-instance-914539058', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '98681240c47b41cba28d91e1c11fd71f', 'user_id': '242b773b0af24caf814e2a84178332d5', 'hostId': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.367 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.372 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 54d9605a-998b-4492-afc8-f7a5b0dd4e84 / tapef99bad5-d0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.372 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.375 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ecc25cb4-5b3a-43f7-949d-ca9a1a19056a / tap88e41011-3e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.375 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ee6244f-292c-4898-87a3-40503dceef6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.367882', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '67879fa2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'e3ffc561d974dcff7de17e61f5f0c206ed5d5d5b21464365a7e9b4cbe23860c1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.367882', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '678816a8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'b166f8b3ed46eb1d65447ebc6c58fa0a6075dd35772a1c23e254a6062a42d5b9'}]}, 'timestamp': '2025-12-05 12:10:35.376371', '_unique_id': '740dfc1403364ebd81137d408fc591be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.379 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.379 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.379 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>]
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.379 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.392 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.393 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.403 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.404 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0ceb262-3873-4349-9293-eb3f104a75bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.379804', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '678aa5a8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': 'a0d49b49f94a1568fcd16d3188a028ec2f0eb7be686e72c562499bb4203a58cc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.379804', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '678ab6a6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': 'c5a642a2c3ec58f75730ae40ee2df8dd7e4281c27729ecefb598d5e5fe604756'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.379804', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '678c509c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'a561e77b8cd9565b28ec86370f9e2fc168d38a2f92b5e73e03333eea663f03a5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.379804', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '678c5fa6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'ef3d5574e120c41fa5d21593b949debec714a2cae2ccb0d64d35d57ccdf6f09d'}]}, 'timestamp': '2025-12-05 12:10:35.404436', '_unique_id': '64957bc634fc4c23af8c423065ec5a49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.438 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.bytes volume: 30243328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.438 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.460 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.460 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89cb980c-7a6a-42f1-a761-af3ae5620973', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30243328, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.406620', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6791a4ca-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '1548c642611bf3a110a517f1c4620ed536faf79b0037d1cef3280d66cc7b7540'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.406620', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6791b014-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '6c3a2584f372d998296bc51de8e3c3d697d8270b2b5a161d6d46afb78cd9c75b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.406620', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6794edc4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '85c2134e0fbc8f4190314b2c10c1963a4d26856e5188d7b7b7f031821e40633b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.406620', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6794f918-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '72346a86546f45a4511e91edbebc5c5ab1304795f234947cd763b360f6a37715'}]}, 'timestamp': '2025-12-05 12:10:35.460809', '_unique_id': '46e9b1a28a894a94b278da1b115fcb42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.462 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.462 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.462 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.462 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19385ae9-28cc-4eb2-a759-dbb67da93c3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.462500', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679546b6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': 'be9e32dbf7949037d8dc30c76b38d7cb6d5fc77985e038b51c7b39a476f888af'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.462500', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67954e2c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': '61ecafe6d24183d5f8ed9117a2fb913cdd082295a0e82555176081ad400f77eb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.462500', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679555de-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'fd7af1c35f018dfbf1f7eb37edb2f67dd040a763a0fcd50e746af9beb0c42587'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.462500', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67955eee-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'f8b684081f26e701eda1bd49df3d4cb38e80e6ab8924b5483625ab60c7c0d877'}]}, 'timestamp': '2025-12-05 12:10:35.463335', '_unique_id': '5572a8dc9b944e7ab2451e5317d6fd98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.464 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.464 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.464 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b733c48-214c-4cf4-9b64-5a3f1a34f76d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.464567', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679598d2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'b47f5c010f94161ff023667c0d917074483eaa95a12212e3b485b009aa382b95'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.464567', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '6795a200-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': '06bf847c07670f4b1ce8230474f7ca084a4f143fceec78728831335a07b8adb2'}]}, 'timestamp': '2025-12-05 12:10:35.465123', '_unique_id': '54945c73d6ae4c32afce2c943377129f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.466 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.466 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1bd2055-1ac1-4184-a690-84b44dc434fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.466530', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '6795e684-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'a4b9a752734e9f5fca0ff6978a17587e6cc3f66a11248f41678a4447d55367e4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.466530', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '6795f174-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'b1cbc7083e0fe37d2c1dc0714dbddecd658d8c1ea91a7e73f6939ca044bda5ed'}]}, 'timestamp': '2025-12-05 12:10:35.467154', '_unique_id': '365223e7c35a475d97cfd9bfcdcb777e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.468 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.468 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.468 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70735f2b-d94e-4bcc-920e-dfd3f605e54d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.468446', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '67962ff4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': '655fad305fdc328893e77ba38e26605a914f67d61236c5df9085e510cdf5b5c0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.468446', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '67963ba2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': '29e3b4b9e13b8f6cfe10ba4af693a98134c4db006a7dad73d4676b10420abc16'}]}, 'timestamp': '2025-12-05 12:10:35.469048', '_unique_id': '0489860b3dac46fa92f16fd886142e4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.470 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.470 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.470 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '065b6943-08e3-4236-90f7-196059909052', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.470190', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679672d4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': '98a569d4056e8509b28a9fff09c6b4fdb7ffc7d4d5d553743b7a5bd3e0399205'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.470190', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '67967aea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'e3864d80c548b559e93ec593a94b7119e50f8c46b3c2cf086f0e770fed6d1246'}]}, 'timestamp': '2025-12-05 12:10:35.470611', '_unique_id': '35a0f617174d42e99813eab47b3ddbba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.latency volume: 224419055 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.latency volume: 21240656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.472 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.472 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e41f37ec-9d9b-46e6-a365-e8f9257b6049', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 224419055, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.471688', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6796ad3a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '561a5afdf998a9d58da1e16e7549b69b61c9b2791165efba118404c64ce4c796'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21240656, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.471688', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6796b6f4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '87a135ce881ef8e97ec1f837386a93521244b01cfed4b6c4540ccc937337ab4a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.471688', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6796c05e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '0e00fe7aa7e64e9b10bf3d6be194b14553e560ca13d5467f66cc3a1c44ab612f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.471688', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6796ca86-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'b6c7a8a36666584ee721f44943a1e40dadf820cc9176b798c05680741a09c386'}]}, 'timestamp': '2025-12-05 12:10:35.472695', '_unique_id': 'eb5f27ca792a467cb2f116416e6d4a1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4af9242-7dcb-43a4-bacc-a2b3db599161', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.473890', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '67970352-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'd9b0e44316c5e364c03a88ff3ec612766c39a9d82c4867f618268ce28dc10abf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.473890', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '67970c44-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': '4f4bc838ad0a474169876973fe805ab0863898d46914bd59f136eead41eb320f'}]}, 'timestamp': '2025-12-05 12:10:35.474331', '_unique_id': 'e2e953b41e3c4a7799c12cc766041bb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>]
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.bytes volume: 72925184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.476 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.476 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.476 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2efa5b8c-04b1-49af-afe7-624ff3e2509a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72925184, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.475915', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67975366-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': 'f7d4997888e3d61792f37c1c9043f348130d02e84482b8cdfbbc2f72c8bc05e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.475915', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67975b54-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '029b474f604d4f1a4e0aa5d140f52caa7fae878370a8f87ddc494cbf68217f4e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.475915', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67976464-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'f1535f0beb9291c2f018b3de001797c47ed22e33e3aed6d3b1c656f1f7b1b39a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.475915', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67976f0e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'cb7f6f1dad6058a80e48f2c678e2154c3a6fe46676f1b90a60f985379c73495f'}]}, 'timestamp': '2025-12-05 12:10:35.476891', '_unique_id': '974eacf9ed744d8596737b8aa4dcdd7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.478 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.498 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/cpu volume: 10980000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.508 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.514 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccbb3d4f-1114-41e7-b69f-1360ff3f219a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10980000000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'timestamp': '2025-12-05T12:10:35.478470', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '679abc72-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.118920524, 'message_signature': 'fd3ae8049a713f24cd697cbe15e523ecafa0164ab5c37b1f193b5ff3449d7f21'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'timestamp': '2025-12-05T12:10:35.478470', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '679d3088-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.135104737, 'message_signature': 'd8226b0875f21f979e25381e405f9fc6e3faa3a56e000c91173235ad6389fc97'}]}, 'timestamp': '2025-12-05 12:10:35.514716', '_unique_id': '84f847cc3e8c445592e69aa1c7bf6e2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.516 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.516 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.requests volume: 1092 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.517 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.517 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.517 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cf501e4-ee6b-4a98-ba54-25f835567591', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1092, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.516563', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679da072-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': 'ea289b470b837ddbe402e829ce15f310cd0b73479117aeae2fe5fa398874d205'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.516563', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679da978-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '9486a8555f71c8273a705372ebde2c4c9c37cb0f9b9e4e829c2cbfffa2223be4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.516563', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679db198-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '8f1767f54c8b6efa7060896ed737877200cc4571129bf927d59b45617b1139f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.516563', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679dbbe8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'f959a2b136ee341152d1dd4ebbecd44daefe9226b272d39a73e6c0c93f837eaa'}]}, 'timestamp': '2025-12-05 12:10:35.518173', '_unique_id': 'b93250d030684d56be2c064f43baf9f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.519 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.519 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.519 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.520 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.520 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38cb8820-abfb-4fa0-8229-db07525f6ede', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.519678', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679e0170-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': 'f732ed6ac40479f13aa9c592985070202bd7f56ac7101d8da53873c556dfbe10'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.519678', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679e09d6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': '589087d8cdfbd8e1bf37f6bb33614e4d69c61d1ffb099b5df04dc1171864afb6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.519678', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679e1156-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'a1a4c21a8bb6bef2ebe713c860037dad1f4c2e8c80e3d19d995f2eb6a0436e65'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.519678', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679e1886-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'fa88e6517d755748128f5835dd658c56941b5262b9fe8bcbe0e3ec6cdc584ec9'}]}, 'timestamp': '2025-12-05 12:10:35.520511', '_unique_id': '90d2634c16754cbfbc4d1cbf530664d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.latency volume: 4439600897 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.522 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.522 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.522 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63a798d9-14c5-4234-8337-c0355057e7d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4439600897, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.521930', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679e5ae4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': 'cdbe72b0ad6cc04581435dfb0f110dc5c11481859326d8e3ef98503693509802'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.521930', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679e634a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': 'b4150a5fbcff8675c543ec9ebf2e58dc7e95fb51d99c8ee3671bd0993dba504a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.521930', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679e6ac0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '94aee0ec138063744306f1e465c5a09bead883cf06a8c1e5c3bd1626e3a3d932'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.521930', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679e7272-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '93c3ba61e894ce48f9b66f5e0eb24cbfb2636f0b113be071fd8c77c369c92499'}]}, 'timestamp': '2025-12-05 12:10:35.522841', '_unique_id': 'e9ea58007ec14183be4bd4969934756b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.524 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.524 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/memory.usage volume: 42.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.524 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.524 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a: ceilometer.compute.pollsters.NoVolumeException
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15275e60-bc2f-4d30-a644-5829eeb85daf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.671875, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'timestamp': '2025-12-05T12:10:35.524175', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '679eaff8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.118920524, 'message_signature': 'f89a58207c17fa035917f9ab389baeaf4cf402f50ef2dbf545fb24da9b7a6fa4'}]}, 'timestamp': '2025-12-05 12:10:35.524584', '_unique_id': 'ce09d925a396434bbee1497bd2605141'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.526 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.526 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd847d934-b825-4922-b3f7-934b45d7c1a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.525741', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679eeca2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '9660dc6cccee14018ceac31b4abe8b81af231c064a310d7eae99bc00c9fcc218'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.525741', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679ef508-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '350e8a9b918f2ff27e7d25b34ea0f4b1262c6837bf512b6ab1d688f2e3967358'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.525741', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679eff26-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'bbdec4bfc4e4a3b93d07bc7cd4a17423637c4d3aa1ee1c17c3b00c33828f21e2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.525741', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679f0674-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '16ba444e10e4b4d3879dc709f0664aed173e9f6c755d29aa005f06015b7583f6'}]}, 'timestamp': '2025-12-05 12:10:35.526603', '_unique_id': '96256ef8e52a41da9a3760a7d14d462d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47169a6a-1829-47b4-a3fc-f803c4423029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.527815', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679f3e78-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'ae4fc6c8dece74b094dc8e6d2e1722db6ce7edf82c971f5ca2fb73ef300dc6d9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.527815', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '679f4788-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': '85dd77d603730043dd4f0533b29194e0f6d0f4cd1573f7e077bf2355c623c604'}]}, 'timestamp': '2025-12-05 12:10:35.528278', '_unique_id': '84dad1814f174e31a613f22972e412d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>]
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da98dae8-590a-4356-ad8a-0867a7fe42e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1436, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.529723', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679f8810-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'eb0e184dfbe2a1e74f3eca3392297c20f59d9b6fcf4869ca56a079c46f92b210'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.529723', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '679f9364-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'acfac9a011a0d4ca63e4adaa023f051266077597ecdebaf3f9a0c80861c001f1'}]}, 'timestamp': '2025-12-05 12:10:35.530222', '_unique_id': 'b8089e3de2e945e4b3ed412fc80737e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>]
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.532 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bf67bc3-4117-410c-87b1-e374d8d8f2bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.531751', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679fd928-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': '7d839ac6bd8cdb68d7690dd4eab39bd111707ff5dad7e5c4eb6d460fb23f331b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.531751', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '679ff354-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'c07c087556f745338d9782bcd1493479594bf9950eb8e7fc7a3fe92e85487e7a'}]}, 'timestamp': '2025-12-05 12:10:35.532816', '_unique_id': '163511e283e644bc8905501a88597ed1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.534 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.534 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd200daa7-a6df-42f1-94df-abd50a13b6c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.534254', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '67a039cc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'a8e3f61231709bfb46528ad40286317b0b47d4ec4eb23d7bba65e8d5b88c4594'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.534254', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '67a041ce-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'c95507eddb6a441ce33a8534b202cf83d27be6895fda6f4050e9e6b684716950'}]}, 'timestamp': '2025-12-05 12:10:35.534724', '_unique_id': '61f4357a4af44414aa4473a28ce9d620'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:10:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.689 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Successfully updated port: 10dc6775-d9c9-40ca-bd05-41c56cffc744 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.704 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "refresh_cache-e689e2f0-16e9-402a-986e-a769d72fa0bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.704 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquired lock "refresh_cache-e689e2f0-16e9-402a-986e-a769d72fa0bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.704 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:10:35 compute-0 nova_compute[187208]: 2025-12-05 12:10:35.799 187212 DEBUG nova.network.neutron [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Successfully created port: 96dab709-f4e0-48a6-ab76-0b13fdf97017 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.025 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:10:36 compute-0 ovn_controller[95610]: 2025-12-05T12:10:36Z|00731|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.277 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.453 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Updating instance_info_cache with network_info: [{"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.816 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Releasing lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.817 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance network_info: |[{"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.817 187212 DEBUG oslo_concurrency.lockutils [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.817 187212 DEBUG nova.network.neutron [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Refreshing network info cache for port 8c343187-712d-4aee-9c47-18497ec1042e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.820 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start _get_guest_xml network_info=[{"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.826 187212 WARNING nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.832 187212 DEBUG nova.virt.libvirt.host [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.833 187212 DEBUG nova.virt.libvirt.host [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.837 187212 DEBUG nova.virt.libvirt.host [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.838 187212 DEBUG nova.virt.libvirt.host [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.838 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.838 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.839 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.839 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.839 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.839 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.840 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.840 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.840 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.840 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.841 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.841 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.845 187212 DEBUG nova.virt.libvirt.vif [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2029374639',display_name='tempest-ServerDiskConfigTestJSON-server-2029374639',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2029374639',id=77,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ep0a320q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDisk
ConfigTestJSON-1245488084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:25Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=8fe1c6df-f787-4c56-b3e7-899cf5e9f723,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.846 187212 DEBUG nova.network.os_vif_util [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.847 187212 DEBUG nova.network.os_vif_util [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.847 187212 DEBUG nova.objects.instance [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_devices' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.865 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <uuid>8fe1c6df-f787-4c56-b3e7-899cf5e9f723</uuid>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <name>instance-0000004d</name>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-2029374639</nova:name>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:10:36</nova:creationTime>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:10:36 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:10:36 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:10:36 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:10:36 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:10:36 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:10:36 compute-0 nova_compute[187208]:         <nova:user uuid="ef254bb2df0442c6bcadfb3a6861c0e9">tempest-ServerDiskConfigTestJSON-1245488084-project-member</nova:user>
Dec 05 12:10:36 compute-0 nova_compute[187208]:         <nova:project uuid="e836357870d746e49bc783da7cd3accd">tempest-ServerDiskConfigTestJSON-1245488084</nova:project>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:10:36 compute-0 nova_compute[187208]:         <nova:port uuid="8c343187-712d-4aee-9c47-18497ec1042e">
Dec 05 12:10:36 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <system>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <entry name="serial">8fe1c6df-f787-4c56-b3e7-899cf5e9f723</entry>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <entry name="uuid">8fe1c6df-f787-4c56-b3e7-899cf5e9f723</entry>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     </system>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <os>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   </os>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <features>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   </features>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.config"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:56:54:21"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <target dev="tap8c343187-71"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/console.log" append="off"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <video>
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     </video>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:10:36 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:10:36 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:10:36 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:10:36 compute-0 nova_compute[187208]: </domain>
Dec 05 12:10:36 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.867 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Preparing to wait for external event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.868 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.868 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.868 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.869 187212 DEBUG nova.virt.libvirt.vif [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2029374639',display_name='tempest-ServerDiskConfigTestJSON-server-2029374639',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2029374639',id=77,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ep0a320q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:25Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=8fe1c6df-f787-4c56-b3e7-899cf5e9f723,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.870 187212 DEBUG nova.network.os_vif_util [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.870 187212 DEBUG nova.network.os_vif_util [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.871 187212 DEBUG os_vif [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.872 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.873 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.873 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.877 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.877 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c343187-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.877 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8c343187-71, col_values=(('external_ids', {'iface-id': '8c343187-712d-4aee-9c47-18497ec1042e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:54:21', 'vm-uuid': '8fe1c6df-f787-4c56-b3e7-899cf5e9f723'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.879 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:36 compute-0 NetworkManager[55691]: <info>  [1764936636.8803] manager: (tap8c343187-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.882 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.885 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:36 compute-0 nova_compute[187208]: 2025-12-05 12:10:36.886 187212 INFO os_vif [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71')
Dec 05 12:10:37 compute-0 rsyslogd[1004]: imjournal from <np0005546909:nova_compute>: begin to drop messages due to rate-limiting
Dec 05 12:10:37 compute-0 nova_compute[187208]: 2025-12-05 12:10:37.210 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:37 compute-0 nova_compute[187208]: 2025-12-05 12:10:37.211 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:37 compute-0 nova_compute[187208]: 2025-12-05 12:10:37.211 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No VIF found with MAC fa:16:3e:56:54:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:10:37 compute-0 nova_compute[187208]: 2025-12-05 12:10:37.212 187212 INFO nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Using config drive
Dec 05 12:10:38 compute-0 nova_compute[187208]: 2025-12-05 12:10:38.157 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:38 compute-0 podman[232219]: 2025-12-05 12:10:38.250625197 +0000 UTC m=+0.061866912 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:10:38 compute-0 podman[232220]: 2025-12-05 12:10:38.279610427 +0000 UTC m=+0.087154697 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.136 187212 INFO nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Creating config drive at /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.config
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.142 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa4l6twtq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.277 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa4l6twtq" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:40 compute-0 kernel: tap8c343187-71: entered promiscuous mode
Dec 05 12:10:40 compute-0 NetworkManager[55691]: <info>  [1764936640.3549] manager: (tap8c343187-71): new Tun device (/org/freedesktop/NetworkManager/Devices/285)
Dec 05 12:10:40 compute-0 ovn_controller[95610]: 2025-12-05T12:10:40Z|00732|binding|INFO|Claiming lport 8c343187-712d-4aee-9c47-18497ec1042e for this chassis.
Dec 05 12:10:40 compute-0 ovn_controller[95610]: 2025-12-05T12:10:40Z|00733|binding|INFO|8c343187-712d-4aee-9c47-18497ec1042e: Claiming fa:16:3e:56:54:21 10.100.0.12
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.357 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 ovn_controller[95610]: 2025-12-05T12:10:40Z|00734|binding|INFO|Setting lport 8c343187-712d-4aee-9c47-18497ec1042e ovn-installed in OVS
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.369 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.373 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 systemd-udevd[232286]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:10:40 compute-0 systemd-machined[153543]: New machine qemu-87-instance-0000004d.
Dec 05 12:10:40 compute-0 NetworkManager[55691]: <info>  [1764936640.4138] device (tap8c343187-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:10:40 compute-0 NetworkManager[55691]: <info>  [1764936640.4163] device (tap8c343187-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:10:40 compute-0 systemd[1]: Started Virtual Machine qemu-87-instance-0000004d.
Dec 05 12:10:40 compute-0 ovn_controller[95610]: 2025-12-05T12:10:40Z|00735|binding|INFO|Setting lport 8c343187-712d-4aee-9c47-18497ec1042e up in Southbound
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.444 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:54:21 10.100.0.12'], port_security=['fa:16:3e:56:54:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '8fe1c6df-f787-4c56-b3e7-899cf5e9f723', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8c343187-712d-4aee-9c47-18497ec1042e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.445 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8c343187-712d-4aee-9c47-18497ec1042e in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c bound to our chassis
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.448 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.462 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[60b3f747-d9f9-49f6-b6db-ff152dba829d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.463 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7be4540a-01 in ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.465 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7be4540a-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.466 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e136d224-b5f2-4b37-b189-2f54425048d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.467 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[30ac0322-a794-4fef-bcb0-dad24794a0d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.478 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7d4866ca-43e0-4ec3-abfd-12a563ef05a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.494 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6a78ca35-4af6-45c0-959a-870c63c93acd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.525 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7969dfbc-f27f-43fb-aa76-27daf387c1bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.532 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2c53f76f-8210-48c0-9e1b-1f563e309022]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 NetworkManager[55691]: <info>  [1764936640.5338] manager: (tap7be4540a-00): new Veth device (/org/freedesktop/NetworkManager/Devices/286)
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.565 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b39a4e13-ebfc-4ebf-b009-e897f54f5098]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.567 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Updating instance_info_cache with network_info: [{"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.570 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9a08ef01-44e3-4857-9fa1-c7d2eaf93a9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 NetworkManager[55691]: <info>  [1764936640.5936] device (tap7be4540a-00): carrier: link connected
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.599 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[630cb6b2-bda9-4c6a-b17d-5f399756a333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.616 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd3af8a-70d3-4fdb-a525-48d6a9369d8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402115, 'reachable_time': 34535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232320, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.637 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[827fff3b-e492-46df-8f21-eedc16ded15a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:4893'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402115, 'tstamp': 402115}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232321, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.656 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f6e4b9-b1e8-4dc1-9277-ae0c74471b6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402115, 'reachable_time': 34535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232322, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.668 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Releasing lock "refresh_cache-e689e2f0-16e9-402a-986e-a769d72fa0bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.669 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Instance network_info: |[{"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.672 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Start _get_guest_xml network_info=[{"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.677 187212 WARNING nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.683 187212 DEBUG nova.virt.libvirt.host [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.684 187212 DEBUG nova.virt.libvirt.host [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.688 187212 DEBUG nova.virt.libvirt.host [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.689 187212 DEBUG nova.virt.libvirt.host [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.690 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.690 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.690 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.691 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.691 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.691 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.692 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.692 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.692 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.692 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.693 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.693 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.693 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[139893c2-e187-4110-86b1-b58030a9fce2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.697 187212 DEBUG nova.virt.libvirt.vif [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-2116425279',display_name='tempest-ServerPasswordTestJSON-server-2116425279',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-2116425279',id=78,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e3f3e747de24befad6008f67eb551ae',ramdisk_id='',reservation_id='r-iw9yq52b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-1312525266',owner_user_name='tempest-ServerPasswordTest
JSON-1312525266-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:26Z,user_data=None,user_id='8b8b32a7fde5424795b54914a14028b5',uuid=e689e2f0-16e9-402a-986e-a769d72fa0bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.698 187212 DEBUG nova.network.os_vif_util [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Converting VIF {"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.699 187212 DEBUG nova.network.os_vif_util [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:2c:5b,bridge_name='br-int',has_traffic_filtering=True,id=10dc6775-d9c9-40ca-bd05-41c56cffc744,network=Network(f97e8b9d-fb9c-4712-b30e-e03f0b0d85da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10dc6775-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.700 187212 DEBUG nova.objects.instance [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lazy-loading 'pci_devices' on Instance uuid e689e2f0-16e9-402a-986e-a769d72fa0bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.770 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[22c5f10c-ba1c-43cb-8802-320fe893a137]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.771 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.771 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.772 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7be4540a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.774 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 kernel: tap7be4540a-00: entered promiscuous mode
Dec 05 12:10:40 compute-0 NetworkManager[55691]: <info>  [1764936640.7752] manager: (tap7be4540a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.777 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7be4540a-00, col_values=(('external_ids', {'iface-id': '4dcf8e96-bf04-4914-959a-aad071dfa454'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.779 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 ovn_controller[95610]: 2025-12-05T12:10:40Z|00736|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.792 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.793 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[15a38ad7-fe4f-4d7b-9eb8-3c69fd05a827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.794 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:10:40 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:40.795 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'env', 'PROCESS_TAG=haproxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.840 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <uuid>e689e2f0-16e9-402a-986e-a769d72fa0bd</uuid>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <name>instance-0000004e</name>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerPasswordTestJSON-server-2116425279</nova:name>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:10:40</nova:creationTime>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:10:40 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:10:40 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:10:40 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:10:40 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:10:40 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:10:40 compute-0 nova_compute[187208]:         <nova:user uuid="8b8b32a7fde5424795b54914a14028b5">tempest-ServerPasswordTestJSON-1312525266-project-member</nova:user>
Dec 05 12:10:40 compute-0 nova_compute[187208]:         <nova:project uuid="7e3f3e747de24befad6008f67eb551ae">tempest-ServerPasswordTestJSON-1312525266</nova:project>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:10:40 compute-0 nova_compute[187208]:         <nova:port uuid="10dc6775-d9c9-40ca-bd05-41c56cffc744">
Dec 05 12:10:40 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <system>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <entry name="serial">e689e2f0-16e9-402a-986e-a769d72fa0bd</entry>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <entry name="uuid">e689e2f0-16e9-402a-986e-a769d72fa0bd</entry>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     </system>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <os>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   </os>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <features>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   </features>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.config"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:63:2c:5b"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <target dev="tap10dc6775-d9"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/console.log" append="off"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <video>
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     </video>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:10:40 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:10:40 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:10:40 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:10:40 compute-0 nova_compute[187208]: </domain>
Dec 05 12:10:40 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.841 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Preparing to wait for external event network-vif-plugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.842 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.842 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.842 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.843 187212 DEBUG nova.virt.libvirt.vif [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-2116425279',display_name='tempest-ServerPasswordTestJSON-server-2116425279',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-2116425279',id=78,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e3f3e747de24befad6008f67eb551ae',ramdisk_id='',reservation_id='r-iw9yq52b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-1312525266',owner_user_name='tempest-ServerPasswordTestJSON-1312525266-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:26Z,user_data=None,user_id='8b8b32a7fde5424795b54914a14028b5',uuid=e689e2f0-16e9-402a-986e-a769d72fa0bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.843 187212 DEBUG nova.network.os_vif_util [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Converting VIF {"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.844 187212 DEBUG nova.network.os_vif_util [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:2c:5b,bridge_name='br-int',has_traffic_filtering=True,id=10dc6775-d9c9-40ca-bd05-41c56cffc744,network=Network(f97e8b9d-fb9c-4712-b30e-e03f0b0d85da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10dc6775-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.845 187212 DEBUG os_vif [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:2c:5b,bridge_name='br-int',has_traffic_filtering=True,id=10dc6775-d9c9-40ca-bd05-41c56cffc744,network=Network(f97e8b9d-fb9c-4712-b30e-e03f0b0d85da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10dc6775-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.845 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.846 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.846 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.848 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.849 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap10dc6775-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.849 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap10dc6775-d9, col_values=(('external_ids', {'iface-id': '10dc6775-d9c9-40ca-bd05-41c56cffc744', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:2c:5b', 'vm-uuid': 'e689e2f0-16e9-402a-986e-a769d72fa0bd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 NetworkManager[55691]: <info>  [1764936640.8520] manager: (tap10dc6775-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.854 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.857 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.858 187212 INFO os_vif [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:2c:5b,bridge_name='br-int',has_traffic_filtering=True,id=10dc6775-d9c9-40ca-bd05-41c56cffc744,network=Network(f97e8b9d-fb9c-4712-b30e-e03f0b0d85da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10dc6775-d9')
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.870 187212 DEBUG nova.network.neutron [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Successfully updated port: 96dab709-f4e0-48a6-ab76-0b13fdf97017 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.985 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936625.9838457, 4053596b-9c68-4044-bb28-5f57016c8e62 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:40 compute-0 nova_compute[187208]: 2025-12-05 12:10:40.985 187212 INFO nova.compute.manager [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] VM Stopped (Lifecycle Event)
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.046 187212 DEBUG nova.compute.manager [req-f08aee9e-6a72-4819-bfcb-b1f4cde4bde1 req-938499b6-6f11-463d-9a95-e1683b9466c5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Received event network-changed-10dc6775-d9c9-40ca-bd05-41c56cffc744 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.049 187212 DEBUG nova.compute.manager [req-f08aee9e-6a72-4819-bfcb-b1f4cde4bde1 req-938499b6-6f11-463d-9a95-e1683b9466c5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Refreshing instance network info cache due to event network-changed-10dc6775-d9c9-40ca-bd05-41c56cffc744. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.049 187212 DEBUG oslo_concurrency.lockutils [req-f08aee9e-6a72-4819-bfcb-b1f4cde4bde1 req-938499b6-6f11-463d-9a95-e1683b9466c5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e689e2f0-16e9-402a-986e-a769d72fa0bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.050 187212 DEBUG oslo_concurrency.lockutils [req-f08aee9e-6a72-4819-bfcb-b1f4cde4bde1 req-938499b6-6f11-463d-9a95-e1683b9466c5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e689e2f0-16e9-402a-986e-a769d72fa0bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.050 187212 DEBUG nova.network.neutron [req-f08aee9e-6a72-4819-bfcb-b1f4cde4bde1 req-938499b6-6f11-463d-9a95-e1683b9466c5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Refreshing network info cache for port 10dc6775-d9c9-40ca-bd05-41c56cffc744 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.053 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "refresh_cache-30cb83d4-3a34-4420-bc83-099b266da48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.054 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquired lock "refresh_cache-30cb83d4-3a34-4420-bc83-099b266da48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.054 187212 DEBUG nova.network.neutron [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.063 187212 DEBUG nova.compute.manager [None req-664cf75d-27e5-43a4-b0cd-11ee6e5fa305 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.166 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.168 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.168 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] No VIF found with MAC fa:16:3e:63:2c:5b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.169 187212 INFO nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Using config drive
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.240 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:10:41 compute-0 podman[232353]: 2025-12-05 12:10:41.254097709 +0000 UTC m=+0.059384481 container create 8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:10:41 compute-0 systemd[1]: Started libpod-conmon-8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740.scope.
Dec 05 12:10:41 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:10:41 compute-0 podman[232353]: 2025-12-05 12:10:41.223566375 +0000 UTC m=+0.028853177 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:10:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f57f6f5ca920f6b052ac4a2d8205d8028c66e8b7337dd5e242b29ed149ec9c59/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:10:41 compute-0 podman[232353]: 2025-12-05 12:10:41.353902456 +0000 UTC m=+0.159189248 container init 8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 12:10:41 compute-0 podman[232366]: 2025-12-05 12:10:41.355843912 +0000 UTC m=+0.060308718 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 12:10:41 compute-0 podman[232353]: 2025-12-05 12:10:41.361667459 +0000 UTC m=+0.166954231 container start 8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 12:10:41 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[232369]: [NOTICE]   (232392) : New worker (232394) forked
Dec 05 12:10:41 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[232369]: [NOTICE]   (232392) : Loading success.
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.583 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936641.5832663, 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.584 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] VM Started (Lifecycle Event)
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.612 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.618 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936641.5845585, 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.618 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] VM Paused (Lifecycle Event)
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.641 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.644 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.663 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.808 187212 DEBUG nova.network.neutron [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.886 187212 DEBUG nova.network.neutron [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Updated VIF entry in instance network info cache for port 8c343187-712d-4aee-9c47-18497ec1042e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.887 187212 DEBUG nova.network.neutron [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Updating instance_info_cache with network_info: [{"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:41 compute-0 nova_compute[187208]: 2025-12-05 12:10:41.915 187212 DEBUG oslo_concurrency.lockutils [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.301 187212 INFO nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Creating config drive at /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.config
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.307 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0delmrba execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.436 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0delmrba" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:42 compute-0 kernel: tap10dc6775-d9: entered promiscuous mode
Dec 05 12:10:42 compute-0 systemd-udevd[232305]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:10:42 compute-0 NetworkManager[55691]: <info>  [1764936642.4966] manager: (tap10dc6775-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Dec 05 12:10:42 compute-0 ovn_controller[95610]: 2025-12-05T12:10:42Z|00737|binding|INFO|Claiming lport 10dc6775-d9c9-40ca-bd05-41c56cffc744 for this chassis.
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.497 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:42 compute-0 ovn_controller[95610]: 2025-12-05T12:10:42Z|00738|binding|INFO|10dc6775-d9c9-40ca-bd05-41c56cffc744: Claiming fa:16:3e:63:2c:5b 10.100.0.12
Dec 05 12:10:42 compute-0 NetworkManager[55691]: <info>  [1764936642.5083] device (tap10dc6775-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:10:42 compute-0 NetworkManager[55691]: <info>  [1764936642.5095] device (tap10dc6775-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.508 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:2c:5b 10.100.0.12'], port_security=['fa:16:3e:63:2c:5b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e689e2f0-16e9-402a-986e-a769d72fa0bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e3f3e747de24befad6008f67eb551ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3ef52474-3611-44ee-97e2-190329a43cf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ad412fa-dc2f-4f56-abb7-47e751784509, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=10dc6775-d9c9-40ca-bd05-41c56cffc744) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.510 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 10dc6775-d9c9-40ca-bd05-41c56cffc744 in datapath f97e8b9d-fb9c-4712-b30e-e03f0b0d85da bound to our chassis
Dec 05 12:10:42 compute-0 ovn_controller[95610]: 2025-12-05T12:10:42Z|00739|binding|INFO|Setting lport 10dc6775-d9c9-40ca-bd05-41c56cffc744 ovn-installed in OVS
Dec 05 12:10:42 compute-0 ovn_controller[95610]: 2025-12-05T12:10:42Z|00740|binding|INFO|Setting lport 10dc6775-d9c9-40ca-bd05-41c56cffc744 up in Southbound
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.513 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f97e8b9d-fb9c-4712-b30e-e03f0b0d85da
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.515 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.517 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.528 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eaf0ca91-1567-41e4-9c4a-137b16ded50f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.529 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf97e8b9d-f1 in ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.532 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf97e8b9d-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.532 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[afad1cbf-d89d-4c1e-8a3e-7adb02089e8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.532 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[76ffaa3f-df7c-439e-9441-1acb2d9d53cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 systemd-machined[153543]: New machine qemu-88-instance-0000004e.
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.544 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[86aed664-0393-4411-bbe9-960cd0181048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 systemd[1]: Started Virtual Machine qemu-88-instance-0000004e.
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.568 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6225cdfd-7f10-4248-9ff1-e5c0f0c4440d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.598 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[02bd2ee2-e297-433c-82ce-747a62547f70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.612 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f13e512c-d898-4301-8237-3085ac7bda89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 NetworkManager[55691]: <info>  [1764936642.6138] manager: (tapf97e8b9d-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.645 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d19b3d30-5476-497c-84a0-70ca7d510632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.648 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c46abec9-bcca-417a-80c6-7b9c7dd71296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.667 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.668 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:42 compute-0 NetworkManager[55691]: <info>  [1764936642.6708] device (tapf97e8b9d-f0): carrier: link connected
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.675 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3f41c184-154b-4697-9ed9-ce5dc3eeeef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.689 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[265c8a44-1aa1-4325-9c2c-955ea871627c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf97e8b9d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:0a:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402323, 'reachable_time': 29160, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232448, 'error': None, 'target': 'ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.691 187212 DEBUG nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.706 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9093ef-62fd-4d25-9913-21976fc99176]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5e:aef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402323, 'tstamp': 402323}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232449, 'error': None, 'target': 'ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.723 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb14eb8-5d1b-46cb-afaf-95d6e82f0e6a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf97e8b9d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:0a:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402323, 'reachable_time': 29160, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232450, 'error': None, 'target': 'ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.753 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d3aca5db-f2b9-4cc9-a81d-dc236fa2d04d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.791 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.792 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.801 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.801 187212 INFO nova.compute.claims [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.813 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0c9cd100-3ea0-4fad-ba94-7fa20e1d0a43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.815 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf97e8b9d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.815 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.815 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf97e8b9d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.817 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:42 compute-0 NetworkManager[55691]: <info>  [1764936642.8180] manager: (tapf97e8b9d-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Dec 05 12:10:42 compute-0 kernel: tapf97e8b9d-f0: entered promiscuous mode
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.823 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf97e8b9d-f0, col_values=(('external_ids', {'iface-id': '8caa07bc-c963-4249-9c51-e35f2533d29c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.825 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:42 compute-0 ovn_controller[95610]: 2025-12-05T12:10:42Z|00741|binding|INFO|Releasing lport 8caa07bc-c963-4249-9c51-e35f2533d29c from this chassis (sb_readonly=0)
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.827 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f97e8b9d-fb9c-4712-b30e-e03f0b0d85da.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f97e8b9d-fb9c-4712-b30e-e03f0b0d85da.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.828 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d299709-4948-4267-9da1-a84723279988]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.829 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/f97e8b9d-fb9c-4712-b30e-e03f0b0d85da.pid.haproxy
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID f97e8b9d-fb9c-4712-b30e-e03f0b0d85da
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:10:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:42.829 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da', 'env', 'PROCESS_TAG=haproxy-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f97e8b9d-fb9c-4712-b30e-e03f0b0d85da.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:10:42 compute-0 nova_compute[187208]: 2025-12-05 12:10:42.839 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.024 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936643.024501, e689e2f0-16e9-402a-986e-a769d72fa0bd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.025 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] VM Started (Lifecycle Event)
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.039 187212 DEBUG nova.compute.provider_tree [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.045 187212 DEBUG nova.compute.manager [req-c733ed57-02e7-474a-8a4e-da325dc45117 req-905a8a6f-b49a-4712-8c20-1e8b8d764078 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.046 187212 DEBUG oslo_concurrency.lockutils [req-c733ed57-02e7-474a-8a4e-da325dc45117 req-905a8a6f-b49a-4712-8c20-1e8b8d764078 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.046 187212 DEBUG oslo_concurrency.lockutils [req-c733ed57-02e7-474a-8a4e-da325dc45117 req-905a8a6f-b49a-4712-8c20-1e8b8d764078 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.047 187212 DEBUG oslo_concurrency.lockutils [req-c733ed57-02e7-474a-8a4e-da325dc45117 req-905a8a6f-b49a-4712-8c20-1e8b8d764078 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.047 187212 DEBUG nova.compute.manager [req-c733ed57-02e7-474a-8a4e-da325dc45117 req-905a8a6f-b49a-4712-8c20-1e8b8d764078 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Processing event network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.050 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.051 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.058 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.060 187212 DEBUG nova.scheduler.client.report [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.070 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.071 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.075 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936643.025522, e689e2f0-16e9-402a-986e-a769d72fa0bd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.076 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] VM Paused (Lifecycle Event)
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.079 187212 INFO nova.virt.libvirt.driver [-] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Instance spawned successfully.
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.080 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.110 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.111 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.112 187212 DEBUG nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.118 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.119 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.119 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.120 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.120 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.121 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.126 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.137 187212 DEBUG nova.network.neutron [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Updating instance_info_cache with network_info: [{"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.158 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.165 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.165 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936643.0569, ecc25cb4-5b3a-43f7-949d-ca9a1a19056a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.166 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] VM Resumed (Lifecycle Event)
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.167 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Releasing lock "refresh_cache-30cb83d4-3a34-4420-bc83-099b266da48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.168 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Instance network_info: |[{"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.169 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Start _get_guest_xml network_info=[{"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.173 187212 WARNING nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.179 187212 DEBUG nova.virt.libvirt.host [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.179 187212 DEBUG nova.virt.libvirt.host [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.182 187212 DEBUG nova.virt.libvirt.host [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.183 187212 DEBUG nova.virt.libvirt.host [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.183 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.183 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.184 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.184 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.184 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.184 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.185 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.185 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.185 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.185 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.185 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.186 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.190 187212 DEBUG nova.virt.libvirt.vif [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-1444488967',display_name='tempest-₡-1444488967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest--1444488967',id=79,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-o6d8hryc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_cert
s=None,updated_at=2025-12-05T12:10:33Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=30cb83d4-3a34-4420-bc83-099b266da48c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.190 187212 DEBUG nova.network.os_vif_util [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.191 187212 DEBUG nova.network.os_vif_util [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:a5:e5,bridge_name='br-int',has_traffic_filtering=True,id=96dab709-f4e0-48a6-ab76-0b13fdf97017,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96dab709-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.191 187212 DEBUG nova.objects.instance [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 30cb83d4-3a34-4420-bc83-099b266da48c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.207 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.210 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.216 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <uuid>30cb83d4-3a34-4420-bc83-099b266da48c</uuid>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <name>instance-0000004f</name>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <nova:name>tempest-₡-1444488967</nova:name>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:10:43</nova:creationTime>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:10:43 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:10:43 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:10:43 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:10:43 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:10:43 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:10:43 compute-0 nova_compute[187208]:         <nova:user uuid="62153b585ecc4e6fa2ad567851d49081">tempest-ServersTestJSON-1492365581-project-member</nova:user>
Dec 05 12:10:43 compute-0 nova_compute[187208]:         <nova:project uuid="0c982a61e3fc4c8da9248076bb0361ac">tempest-ServersTestJSON-1492365581</nova:project>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:10:43 compute-0 nova_compute[187208]:         <nova:port uuid="96dab709-f4e0-48a6-ab76-0b13fdf97017">
Dec 05 12:10:43 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <system>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <entry name="serial">30cb83d4-3a34-4420-bc83-099b266da48c</entry>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <entry name="uuid">30cb83d4-3a34-4420-bc83-099b266da48c</entry>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     </system>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <os>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   </os>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <features>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   </features>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.config"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:82:a5:e5"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <target dev="tap96dab709-f4"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/console.log" append="off"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <video>
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     </video>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:10:43 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:10:43 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:10:43 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:10:43 compute-0 nova_compute[187208]: </domain>
Dec 05 12:10:43 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.217 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Preparing to wait for external event network-vif-plugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.217 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.217 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.218 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.218 187212 DEBUG nova.virt.libvirt.vif [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-1444488967',display_name='tempest-₡-1444488967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest--1444488967',id=79,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-o6d8hryc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,tr
usted_certs=None,updated_at=2025-12-05T12:10:33Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=30cb83d4-3a34-4420-bc83-099b266da48c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.219 187212 DEBUG nova.network.os_vif_util [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.219 187212 DEBUG nova.network.os_vif_util [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:a5:e5,bridge_name='br-int',has_traffic_filtering=True,id=96dab709-f4e0-48a6-ab76-0b13fdf97017,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96dab709-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.219 187212 DEBUG os_vif [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:a5:e5,bridge_name='br-int',has_traffic_filtering=True,id=96dab709-f4e0-48a6-ab76-0b13fdf97017,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96dab709-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.220 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.220 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.221 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.221 187212 DEBUG nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.222 187212 DEBUG nova.network.neutron [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.226 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96dab709-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.226 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap96dab709-f4, col_values=(('external_ids', {'iface-id': '96dab709-f4e0-48a6-ab76-0b13fdf97017', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:a5:e5', 'vm-uuid': '30cb83d4-3a34-4420-bc83-099b266da48c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.227 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:43 compute-0 NetworkManager[55691]: <info>  [1764936643.2286] manager: (tap96dab709-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.229 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.232 187212 INFO nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Took 24.56 seconds to spawn the instance on the hypervisor.
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.232 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.234 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:43 compute-0 podman[232489]: 2025-12-05 12:10:43.235073087 +0000 UTC m=+0.050249270 container create 51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.235 187212 INFO os_vif [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:a5:e5,bridge_name='br-int',has_traffic_filtering=True,id=96dab709-f4e0-48a6-ab76-0b13fdf97017,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96dab709-f4')
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.241 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.247 187212 INFO nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:10:43 compute-0 systemd[1]: Started libpod-conmon-51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d.scope.
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.296 187212 DEBUG nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:10:43 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:10:43 compute-0 podman[232489]: 2025-12-05 12:10:43.204910833 +0000 UTC m=+0.020087046 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:10:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/831e2f239e994d4e8099b3d44bb3baf813987ee1ff02110006e4a66e930f8440/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:10:43 compute-0 podman[232489]: 2025-12-05 12:10:43.313149192 +0000 UTC m=+0.128325415 container init 51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:10:43 compute-0 podman[232489]: 2025-12-05 12:10:43.31797048 +0000 UTC m=+0.133146673 container start 51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.333 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.333 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.334 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No VIF found with MAC fa:16:3e:82:a5:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.334 187212 INFO nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Using config drive
Dec 05 12:10:43 compute-0 neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da[232506]: [NOTICE]   (232510) : New worker (232512) forked
Dec 05 12:10:43 compute-0 neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da[232506]: [NOTICE]   (232510) : Loading success.
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.374 187212 INFO nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Took 25.40 seconds to build instance.
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.407 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 25.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.435 187212 DEBUG nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.436 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.437 187212 INFO nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Creating image(s)
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.437 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.438 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.438 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.450 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.475 187212 DEBUG nova.policy [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.512 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.513 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.514 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.526 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.583 187212 DEBUG nova.network.neutron [req-f08aee9e-6a72-4819-bfcb-b1f4cde4bde1 req-938499b6-6f11-463d-9a95-e1683b9466c5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Updated VIF entry in instance network info cache for port 10dc6775-d9c9-40ca-bd05-41c56cffc744. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.584 187212 DEBUG nova.network.neutron [req-f08aee9e-6a72-4819-bfcb-b1f4cde4bde1 req-938499b6-6f11-463d-9a95-e1683b9466c5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Updating instance_info_cache with network_info: [{"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.596 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.597 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.638 187212 DEBUG oslo_concurrency.lockutils [req-f08aee9e-6a72-4819-bfcb-b1f4cde4bde1 req-938499b6-6f11-463d-9a95-e1683b9466c5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e689e2f0-16e9-402a-986e-a769d72fa0bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.642 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.643 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.643 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.706 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.707 187212 DEBUG nova.virt.disk.api [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Checking if we can resize image /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.708 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.766 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.767 187212 DEBUG nova.virt.disk.api [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Cannot resize image /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.768 187212 DEBUG nova.objects.instance [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'migration_context' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.783 187212 INFO nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Creating config drive at /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.config
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.788 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7_nem1gb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.806 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.806 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Ensure instance console log exists: /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.807 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.808 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.808 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.912 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7_nem1gb" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:43 compute-0 NetworkManager[55691]: <info>  [1764936643.9665] manager: (tap96dab709-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/293)
Dec 05 12:10:43 compute-0 kernel: tap96dab709-f4: entered promiscuous mode
Dec 05 12:10:43 compute-0 nova_compute[187208]: 2025-12-05 12:10:43.981 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:43 compute-0 ovn_controller[95610]: 2025-12-05T12:10:43Z|00742|binding|INFO|Claiming lport 96dab709-f4e0-48a6-ab76-0b13fdf97017 for this chassis.
Dec 05 12:10:43 compute-0 ovn_controller[95610]: 2025-12-05T12:10:43Z|00743|binding|INFO|96dab709-f4e0-48a6-ab76-0b13fdf97017: Claiming fa:16:3e:82:a5:e5 10.100.0.9
Dec 05 12:10:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:43.990 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:a5:e5 10.100.0.9'], port_security=['fa:16:3e:82:a5:e5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=96dab709-f4e0-48a6-ab76-0b13fdf97017) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:43.992 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 96dab709-f4e0-48a6-ab76-0b13fdf97017 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 bound to our chassis
Dec 05 12:10:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:43.994 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:10:43 compute-0 ovn_controller[95610]: 2025-12-05T12:10:43Z|00744|binding|INFO|Setting lport 96dab709-f4e0-48a6-ab76-0b13fdf97017 ovn-installed in OVS
Dec 05 12:10:43 compute-0 ovn_controller[95610]: 2025-12-05T12:10:43Z|00745|binding|INFO|Setting lport 96dab709-f4e0-48a6-ab76-0b13fdf97017 up in Southbound
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.001 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.006 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c440530a-c5d6-44ce-9e63-baddf2844f50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.007 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0c025e40-a1 in ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.008 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0c025e40-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.009 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0403e71e-ad5b-4320-9f42-f38bbeed1d62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.009 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8e47c65e-01a8-45c8-b5c4-8038b21329ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 systemd-machined[153543]: New machine qemu-89-instance-0000004f.
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.019 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a1233610-e480-4079-b6df-c4ccf2abb726]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 systemd[1]: Started Virtual Machine qemu-89-instance-0000004f.
Dec 05 12:10:44 compute-0 systemd-udevd[232557]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.044 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1b5df606-a4d4-43f8-af97-603487ee53a8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 NetworkManager[55691]: <info>  [1764936644.0579] device (tap96dab709-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:10:44 compute-0 NetworkManager[55691]: <info>  [1764936644.0588] device (tap96dab709-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.078 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[26e1508a-230c-400b-b5d9-048bf95337c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 NetworkManager[55691]: <info>  [1764936644.0849] manager: (tap0c025e40-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/294)
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.083 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a82fed25-c52a-4243-9919-0afacd4f0f10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.104 187212 DEBUG nova.compute.manager [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Received event network-changed-96dab709-f4e0-48a6-ab76-0b13fdf97017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.105 187212 DEBUG nova.compute.manager [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Refreshing instance network info cache due to event network-changed-96dab709-f4e0-48a6-ab76-0b13fdf97017. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.105 187212 DEBUG oslo_concurrency.lockutils [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-30cb83d4-3a34-4420-bc83-099b266da48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.105 187212 DEBUG oslo_concurrency.lockutils [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-30cb83d4-3a34-4420-bc83-099b266da48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.105 187212 DEBUG nova.network.neutron [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Refreshing network info cache for port 96dab709-f4e0-48a6-ab76-0b13fdf97017 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.117 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4c552e82-c6c4-40a7-a079-1a9017ddc487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.120 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ab1c3db4-62d3-4643-8609-d5b5aa73f274]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 NetworkManager[55691]: <info>  [1764936644.1448] device (tap0c025e40-a0): carrier: link connected
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.153 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a512bd4e-5229-4a3a-ab45-930b3e29be7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.177 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[462ce163-567e-4bc4-aaca-a1fbf9f53d39]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232587, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.201 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[61e23e0c-e911-4c33-bfeb-4c6949bee66f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea6:a4d4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402470, 'tstamp': 402470}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232588, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.219 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[36b4c34f-8e71-44b9-946b-56ba5ec9d11d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232590, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.248 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[163e51c9-41cb-422b-b0b1-dcd4311cde47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.298 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e626cee2-8590-4eea-96fd-4326cff697a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.300 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.300 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.301 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.303 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:44 compute-0 NetworkManager[55691]: <info>  [1764936644.3040] manager: (tap0c025e40-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Dec 05 12:10:44 compute-0 kernel: tap0c025e40-a0: entered promiscuous mode
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.305 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.308 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.309 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:44 compute-0 ovn_controller[95610]: 2025-12-05T12:10:44Z|00746|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.311 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0c025e40-a124-4810-9d75-2a59e91db1b3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0c025e40-a124-4810-9d75-2a59e91db1b3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.312 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78ddcccc-a52d-4745-bb15-92807bbfe470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.313 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/0c025e40-a124-4810-9d75-2a59e91db1b3.pid.haproxy
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:10:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:44.316 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'env', 'PROCESS_TAG=haproxy-0c025e40-a124-4810-9d75-2a59e91db1b3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0c025e40-a124-4810-9d75-2a59e91db1b3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.316 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936644.3162708, 30cb83d4-3a34-4420-bc83-099b266da48c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.317 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] VM Started (Lifecycle Event)
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.322 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.334 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.337 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936644.3167753, 30cb83d4-3a34-4420-bc83-099b266da48c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.337 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] VM Paused (Lifecycle Event)
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.359 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.362 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.384 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:44 compute-0 nova_compute[187208]: 2025-12-05 12:10:44.406 187212 DEBUG nova.network.neutron [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Successfully created port: e30774db-d3d3-4438-b68a-6f7855f55128 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:10:44 compute-0 podman[232628]: 2025-12-05 12:10:44.721435402 +0000 UTC m=+0.073676210 container create 9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:10:44 compute-0 systemd[1]: Started libpod-conmon-9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33.scope.
Dec 05 12:10:44 compute-0 podman[232628]: 2025-12-05 12:10:44.675467936 +0000 UTC m=+0.027708754 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:10:44 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:10:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fff199011b698c5a7285fa6b201980fd6fde3394f58b1348881bea980a3c7d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:10:44 compute-0 podman[232628]: 2025-12-05 12:10:44.819608193 +0000 UTC m=+0.171849081 container init 9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 12:10:44 compute-0 podman[232628]: 2025-12-05 12:10:44.832114461 +0000 UTC m=+0.184355269 container start 9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:10:44 compute-0 neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3[232643]: [NOTICE]   (232647) : New worker (232649) forked
Dec 05 12:10:44 compute-0 neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3[232643]: [NOTICE]   (232647) : Loading success.
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.707 187212 DEBUG nova.compute.manager [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.708 187212 DEBUG oslo_concurrency.lockutils [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.708 187212 DEBUG oslo_concurrency.lockutils [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.708 187212 DEBUG oslo_concurrency.lockutils [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.708 187212 DEBUG nova.compute.manager [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] No waiting events found dispatching network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.708 187212 WARNING nova.compute.manager [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received unexpected event network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 for instance with vm_state active and task_state None.
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.709 187212 DEBUG nova.compute.manager [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Received event network-vif-plugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.709 187212 DEBUG oslo_concurrency.lockutils [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.709 187212 DEBUG oslo_concurrency.lockutils [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.709 187212 DEBUG oslo_concurrency.lockutils [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.709 187212 DEBUG nova.compute.manager [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Processing event network-vif-plugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.709 187212 DEBUG nova.compute.manager [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Received event network-vif-plugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.710 187212 DEBUG oslo_concurrency.lockutils [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.710 187212 DEBUG oslo_concurrency.lockutils [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.710 187212 DEBUG oslo_concurrency.lockutils [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.710 187212 DEBUG nova.compute.manager [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] No waiting events found dispatching network-vif-plugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.710 187212 WARNING nova.compute.manager [req-d540c6da-59b6-4c8f-af60-4edff1415e90 req-0abd2fac-6206-427e-9f97-42bf940eb432 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Received unexpected event network-vif-plugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 for instance with vm_state building and task_state spawning.
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.711 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.716 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936645.715946, e689e2f0-16e9-402a-986e-a769d72fa0bd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.716 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] VM Resumed (Lifecycle Event)
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.718 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.721 187212 INFO nova.virt.libvirt.driver [-] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Instance spawned successfully.
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.721 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.739 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.743 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.743 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.744 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.744 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.744 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.745 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.748 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.775 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.812 187212 INFO nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Took 18.89 seconds to spawn the instance on the hypervisor.
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.812 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.878 187212 INFO nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Took 19.63 seconds to build instance.
Dec 05 12:10:45 compute-0 nova_compute[187208]: 2025-12-05 12:10:45.895 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.447 187212 DEBUG nova.network.neutron [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Updated VIF entry in instance network info cache for port 96dab709-f4e0-48a6-ab76-0b13fdf97017. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.447 187212 DEBUG nova.network.neutron [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Updating instance_info_cache with network_info: [{"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.474 187212 DEBUG oslo_concurrency.lockutils [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-30cb83d4-3a34-4420-bc83-099b266da48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.475 187212 DEBUG nova.compute.manager [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.475 187212 DEBUG oslo_concurrency.lockutils [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.475 187212 DEBUG oslo_concurrency.lockutils [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.475 187212 DEBUG oslo_concurrency.lockutils [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.476 187212 DEBUG nova.compute.manager [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Processing event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.476 187212 DEBUG nova.compute.manager [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.476 187212 DEBUG oslo_concurrency.lockutils [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.476 187212 DEBUG oslo_concurrency.lockutils [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.476 187212 DEBUG oslo_concurrency.lockutils [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.477 187212 DEBUG nova.compute.manager [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] No waiting events found dispatching network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.477 187212 WARNING nova.compute.manager [req-cf7d59b4-5f83-4340-b472-3853d8ffc8b2 req-930deec1-4c1f-4fee-9561-ceb324da98f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received unexpected event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e for instance with vm_state building and task_state spawning.
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.478 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.491 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936646.4892454, 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.491 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] VM Resumed (Lifecycle Event)
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.493 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.509 187212 INFO nova.virt.libvirt.driver [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance spawned successfully.
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.510 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.514 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.519 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.531 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.532 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.532 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.533 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.533 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.533 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.540 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.598 187212 INFO nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Took 21.50 seconds to spawn the instance on the hypervisor.
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.599 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.656 187212 INFO nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Took 21.99 seconds to build instance.
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.671 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.963 187212 DEBUG nova.network.neutron [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Successfully updated port: e30774db-d3d3-4438-b68a-6f7855f55128 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.979 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.979 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquired lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:46 compute-0 nova_compute[187208]: 2025-12-05 12:10:46.979 187212 DEBUG nova.network.neutron [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.086 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.086 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.087 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.142 187212 DEBUG nova.compute.manager [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Received event network-vif-plugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.143 187212 DEBUG oslo_concurrency.lockutils [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.143 187212 DEBUG oslo_concurrency.lockutils [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.143 187212 DEBUG oslo_concurrency.lockutils [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.143 187212 DEBUG nova.compute.manager [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Processing event network-vif-plugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.143 187212 DEBUG nova.compute.manager [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Received event network-vif-plugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.144 187212 DEBUG oslo_concurrency.lockutils [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.144 187212 DEBUG oslo_concurrency.lockutils [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.144 187212 DEBUG oslo_concurrency.lockutils [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.144 187212 DEBUG nova.compute.manager [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] No waiting events found dispatching network-vif-plugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.144 187212 WARNING nova.compute.manager [req-c275c4da-dd1b-4c58-bc37-ea720f0f1dd6 req-102c69c5-c1d3-4be9-b923-77b50426b64d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Received unexpected event network-vif-plugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 for instance with vm_state building and task_state spawning.
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.145 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.148 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936647.1477501, 30cb83d4-3a34-4420-bc83-099b266da48c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.148 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] VM Resumed (Lifecycle Event)
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.150 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.153 187212 INFO nova.virt.libvirt.driver [-] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Instance spawned successfully.
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.153 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.178 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.187 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.187 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.188 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.188 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.188 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.189 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.192 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.195 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.218 187212 DEBUG nova.network.neutron [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.240 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.272 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.273 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.301 187212 INFO nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Took 13.67 seconds to spawn the instance on the hypervisor.
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.302 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.333 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.339 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.389 187212 INFO nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Took 15.20 seconds to build instance.
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.406 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.409 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.409 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.479 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.487 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.546 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.547 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.618 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.624 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.690 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.691 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.750 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.755 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.834 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.834 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:47 compute-0 nova_compute[187208]: 2025-12-05 12:10:47.901 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.102 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.103 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5134MB free_disk=73.08087921142578GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.103 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.103 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.265 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.371 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.371 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.371 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.371 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance e689e2f0-16e9-402a-986e-a769d72fa0bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.371 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 30cb83d4-3a34-4420-bc83-099b266da48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.371 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 28e48516-8665-4d98-a92d-c84b7da9a284 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.372 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 6 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.372 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1280MB phys_disk=79GB used_disk=6GB total_vcpus=8 used_vcpus=6 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.554 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.560 187212 DEBUG nova.compute.manager [req-e22dbf9f-c40d-46f4-88ce-aeb216f2cdc7 req-508a8509-5d02-4b52-b768-81cfcc909283 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.561 187212 DEBUG nova.compute.manager [req-e22dbf9f-c40d-46f4-88ce-aeb216f2cdc7 req-508a8509-5d02-4b52-b768-81cfcc909283 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing instance network info cache due to event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.561 187212 DEBUG oslo_concurrency.lockutils [req-e22dbf9f-c40d-46f4-88ce-aeb216f2cdc7 req-508a8509-5d02-4b52-b768-81cfcc909283 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.561 187212 DEBUG oslo_concurrency.lockutils [req-e22dbf9f-c40d-46f4-88ce-aeb216f2cdc7 req-508a8509-5d02-4b52-b768-81cfcc909283 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.561 187212 DEBUG nova.network.neutron [req-e22dbf9f-c40d-46f4-88ce-aeb216f2cdc7 req-508a8509-5d02-4b52-b768-81cfcc909283 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.578 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.583 187212 DEBUG nova.network.neutron [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.669 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.669 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.679 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Releasing lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.679 187212 DEBUG nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance network_info: |[{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.682 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Start _get_guest_xml network_info=[{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.685 187212 WARNING nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.688 187212 DEBUG nova.virt.libvirt.host [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.689 187212 DEBUG nova.virt.libvirt.host [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.692 187212 DEBUG nova.virt.libvirt.host [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.692 187212 DEBUG nova.virt.libvirt.host [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.693 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.693 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.693 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.694 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.694 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.694 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.694 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.694 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.695 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.695 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.695 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.695 187212 DEBUG nova.virt.hardware [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.699 187212 DEBUG nova.virt.libvirt.vif [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-826937421',display_name='tempest-ServersNegativeTestJSON-server-826937421',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-826937421',id=80,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-snx0qylv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-ServersNegativeTe
stJSON-1063007033-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:43Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=28e48516-8665-4d98-a92d-c84b7da9a284,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.699 187212 DEBUG nova.network.os_vif_util [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.700 187212 DEBUG nova.network.os_vif_util [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.703 187212 DEBUG nova.objects.instance [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'pci_devices' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.732 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <uuid>28e48516-8665-4d98-a92d-c84b7da9a284</uuid>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <name>instance-00000050</name>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersNegativeTestJSON-server-826937421</nova:name>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:10:48</nova:creationTime>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:10:48 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:10:48 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:10:48 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:10:48 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:10:48 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:10:48 compute-0 nova_compute[187208]:         <nova:user uuid="e90fa3a379b4494c84626bb6a761cd30">tempest-ServersNegativeTestJSON-1063007033-project-member</nova:user>
Dec 05 12:10:48 compute-0 nova_compute[187208]:         <nova:project uuid="c5b34686513f4abc8165113eb8c6831e">tempest-ServersNegativeTestJSON-1063007033</nova:project>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:10:48 compute-0 nova_compute[187208]:         <nova:port uuid="e30774db-d3d3-4438-b68a-6f7855f55128">
Dec 05 12:10:48 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <system>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <entry name="serial">28e48516-8665-4d98-a92d-c84b7da9a284</entry>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <entry name="uuid">28e48516-8665-4d98-a92d-c84b7da9a284</entry>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     </system>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <os>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   </os>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <features>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   </features>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.config"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:50:8e:78"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <target dev="tape30774db-d3"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/console.log" append="off"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <video>
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     </video>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:10:48 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:10:48 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:10:48 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:10:48 compute-0 nova_compute[187208]: </domain>
Dec 05 12:10:48 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.733 187212 DEBUG nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Preparing to wait for external event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.733 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.733 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.733 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.734 187212 DEBUG nova.virt.libvirt.vif [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-826937421',display_name='tempest-ServersNegativeTestJSON-server-826937421',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-826937421',id=80,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-snx0qylv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-Servers
NegativeTestJSON-1063007033-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:43Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=28e48516-8665-4d98-a92d-c84b7da9a284,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.734 187212 DEBUG nova.network.os_vif_util [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.735 187212 DEBUG nova.network.os_vif_util [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.735 187212 DEBUG os_vif [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.736 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.736 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.737 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.740 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.740 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape30774db-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.740 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape30774db-d3, col_values=(('external_ids', {'iface-id': 'e30774db-d3d3-4438-b68a-6f7855f55128', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:8e:78', 'vm-uuid': '28e48516-8665-4d98-a92d-c84b7da9a284'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.742 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:48 compute-0 NetworkManager[55691]: <info>  [1764936648.7434] manager: (tape30774db-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.745 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.749 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.750 187212 INFO os_vif [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3')
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.832 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.833 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.833 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] No VIF found with MAC fa:16:3e:50:8e:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:10:48 compute-0 nova_compute[187208]: 2025-12-05 12:10:48.834 187212 INFO nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Using config drive
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.558 187212 DEBUG nova.compute.manager [req-caa3cdf7-2fd2-495d-bc12-29c76724b7d4 req-104a2b8a-ce80-4f0e-b06d-494e26238b12 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-changed-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.558 187212 DEBUG nova.compute.manager [req-caa3cdf7-2fd2-495d-bc12-29c76724b7d4 req-104a2b8a-ce80-4f0e-b06d-494e26238b12 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Refreshing instance network info cache due to event network-changed-e30774db-d3d3-4438-b68a-6f7855f55128. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.559 187212 DEBUG oslo_concurrency.lockutils [req-caa3cdf7-2fd2-495d-bc12-29c76724b7d4 req-104a2b8a-ce80-4f0e-b06d-494e26238b12 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.559 187212 DEBUG oslo_concurrency.lockutils [req-caa3cdf7-2fd2-495d-bc12-29c76724b7d4 req-104a2b8a-ce80-4f0e-b06d-494e26238b12 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.559 187212 DEBUG nova.network.neutron [req-caa3cdf7-2fd2-495d-bc12-29c76724b7d4 req-104a2b8a-ce80-4f0e-b06d-494e26238b12 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Refreshing network info cache for port e30774db-d3d3-4438-b68a-6f7855f55128 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.665 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.665 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.665 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.812 187212 INFO nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Creating config drive at /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.config
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.816 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph9vyfur0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:10:49 compute-0 nova_compute[187208]: 2025-12-05 12:10:49.950 187212 DEBUG oslo_concurrency.processutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph9vyfur0" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:10:50 compute-0 kernel: tape30774db-d3: entered promiscuous mode
Dec 05 12:10:50 compute-0 NetworkManager[55691]: <info>  [1764936650.0408] manager: (tape30774db-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/297)
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.045 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 ovn_controller[95610]: 2025-12-05T12:10:50Z|00747|binding|INFO|Claiming lport e30774db-d3d3-4438-b68a-6f7855f55128 for this chassis.
Dec 05 12:10:50 compute-0 ovn_controller[95610]: 2025-12-05T12:10:50Z|00748|binding|INFO|e30774db-d3d3-4438-b68a-6f7855f55128: Claiming fa:16:3e:50:8e:78 10.100.0.9
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.049 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 ovn_controller[95610]: 2025-12-05T12:10:50Z|00749|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 ovn-installed in OVS
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.063 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 systemd-udevd[232718]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:10:50 compute-0 NetworkManager[55691]: <info>  [1764936650.0945] device (tape30774db-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:10:50 compute-0 NetworkManager[55691]: <info>  [1764936650.0956] device (tape30774db-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:10:50 compute-0 systemd-machined[153543]: New machine qemu-90-instance-00000050.
Dec 05 12:10:50 compute-0 systemd[1]: Started Virtual Machine qemu-90-instance-00000050.
Dec 05 12:10:50 compute-0 podman[232703]: 2025-12-05 12:10:50.123237381 +0000 UTC m=+0.088530055 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:10:50 compute-0 ovn_controller[95610]: 2025-12-05T12:10:50Z|00750|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 up in Southbound
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.218 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8e:78 10.100.0.9'], port_security=['fa:16:3e:50:8e:78 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e30774db-d3d3-4438-b68a-6f7855f55128) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.220 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e30774db-d3d3-4438-b68a-6f7855f55128 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be bound to our chassis
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.223 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.235 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b45274f3-868a-4613-a16d-baf0f5cfb2fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.236 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82130d25-f1 in ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.238 187212 DEBUG oslo_concurrency.lockutils [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "e689e2f0-16e9-402a-986e-a769d72fa0bd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.238 187212 DEBUG oslo_concurrency.lockutils [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.238 187212 DEBUG oslo_concurrency.lockutils [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.238 187212 DEBUG oslo_concurrency.lockutils [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.238 187212 DEBUG oslo_concurrency.lockutils [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.240 187212 INFO nova.compute.manager [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Terminating instance
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.239 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82130d25-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.239 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5005e6ac-f8bb-4dc8-88d2-f867363dbc36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.241 187212 DEBUG nova.compute.manager [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.242 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f38f1d28-8f04-4649-9954-2bd9cd36da2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.259 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[fed9f24e-a933-4db5-ae96-6880082a1a87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 kernel: tap10dc6775-d9 (unregistering): left promiscuous mode
Dec 05 12:10:50 compute-0 NetworkManager[55691]: <info>  [1764936650.2734] device (tap10dc6775-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.276 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a10b5ed-11a4-45a4-b257-e90a9057b2f7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.285 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 ovn_controller[95610]: 2025-12-05T12:10:50Z|00751|binding|INFO|Releasing lport 10dc6775-d9c9-40ca-bd05-41c56cffc744 from this chassis (sb_readonly=0)
Dec 05 12:10:50 compute-0 ovn_controller[95610]: 2025-12-05T12:10:50Z|00752|binding|INFO|Setting lport 10dc6775-d9c9-40ca-bd05-41c56cffc744 down in Southbound
Dec 05 12:10:50 compute-0 ovn_controller[95610]: 2025-12-05T12:10:50Z|00753|binding|INFO|Removing iface tap10dc6775-d9 ovn-installed in OVS
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.300 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.302 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:2c:5b 10.100.0.12'], port_security=['fa:16:3e:63:2c:5b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e689e2f0-16e9-402a-986e-a769d72fa0bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e3f3e747de24befad6008f67eb551ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3ef52474-3611-44ee-97e2-190329a43cf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ad412fa-dc2f-4f56-abb7-47e751784509, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=10dc6775-d9c9-40ca-bd05-41c56cffc744) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.312 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9f7ed482-7115-4bd3-8e12-6a5a1199afa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.318 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[55122ac1-657b-4c02-b2f2-369be6fa2ba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 NetworkManager[55691]: <info>  [1764936650.3209] manager: (tap82130d25-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/298)
Dec 05 12:10:50 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Dec 05 12:10:50 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d0000004e.scope: Consumed 4.985s CPU time.
Dec 05 12:10:50 compute-0 systemd-machined[153543]: Machine qemu-88-instance-0000004e terminated.
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.359 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1084d4f3-e18e-46d9-ad13-0917504c5b30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.362 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab0ea77-56eb-4674-b936-ac4de55c7a7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 NetworkManager[55691]: <info>  [1764936650.3877] device (tap82130d25-f0): carrier: link connected
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.393 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3adf1ac0-95df-4421-b958-4bc7d66ef4bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.415 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[22e3d879-9f93-49a3-9354-22da6144191f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82130d25-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:36:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403095, 'reachable_time': 38277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232779, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.439 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[88998175-5ab0-4c75-a9be-1aa4ace4d82b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:36e4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403095, 'tstamp': 403095}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232781, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.447 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936650.4472654, 28e48516-8665-4d98-a92d-c84b7da9a284 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.448 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Started (Lifecycle Event)
Dec 05 12:10:50 compute-0 NetworkManager[55691]: <info>  [1764936650.4576] manager: (tap10dc6775-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/299)
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.462 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d523495f-c848-467c-9fa5-4830ba117765]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82130d25-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:36:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403095, 'reachable_time': 38277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232782, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.489 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.494 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7b5461db-a8b6-4a44-bed6-a0a384f5b01f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.497 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936650.4492803, 28e48516-8665-4d98-a92d-c84b7da9a284 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.498 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Paused (Lifecycle Event)
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.505 187212 INFO nova.virt.libvirt.driver [-] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Instance destroyed successfully.
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.506 187212 DEBUG nova.objects.instance [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lazy-loading 'resources' on Instance uuid e689e2f0-16e9-402a-986e-a769d72fa0bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.521 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.522 187212 DEBUG nova.virt.libvirt.vif [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-2116425279',display_name='tempest-ServerPasswordTestJSON-server-2116425279',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-2116425279',id=78,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e3f3e747de24befad6008f67eb551ae',ramdisk_id='',reservation_id='r-iw9yq52b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerPasswordTestJSON-1312525266',owner_user_name='tempest-ServerPasswordTestJSON-1312525266-project-member',password_0='',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:48Z,user_data=None,user_id='8b8b32a7fde5424795b54914a14028b5',uuid=e689e2f0-16e9-402a-986e-a769d72fa0bd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.522 187212 DEBUG nova.network.os_vif_util [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Converting VIF {"id": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "address": "fa:16:3e:63:2c:5b", "network": {"id": "f97e8b9d-fb9c-4712-b30e-e03f0b0d85da", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-905639670-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e3f3e747de24befad6008f67eb551ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10dc6775-d9", "ovs_interfaceid": "10dc6775-d9c9-40ca-bd05-41c56cffc744", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.523 187212 DEBUG nova.network.os_vif_util [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:2c:5b,bridge_name='br-int',has_traffic_filtering=True,id=10dc6775-d9c9-40ca-bd05-41c56cffc744,network=Network(f97e8b9d-fb9c-4712-b30e-e03f0b0d85da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10dc6775-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.523 187212 DEBUG os_vif [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:2c:5b,bridge_name='br-int',has_traffic_filtering=True,id=10dc6775-d9c9-40ca-bd05-41c56cffc744,network=Network(f97e8b9d-fb9c-4712-b30e-e03f0b0d85da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10dc6775-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.527 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.528 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap10dc6775-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.530 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.535 187212 INFO os_vif [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:2c:5b,bridge_name='br-int',has_traffic_filtering=True,id=10dc6775-d9c9-40ca-bd05-41c56cffc744,network=Network(f97e8b9d-fb9c-4712-b30e-e03f0b0d85da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10dc6775-d9')
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.535 187212 INFO nova.virt.libvirt.driver [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Deleting instance files /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd_del
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.536 187212 INFO nova.virt.libvirt.driver [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Deletion of /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd_del complete
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.540 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.576 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4c15c6fc-be3a-427f-80e3-d78d7700e895]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.577 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82130d25-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.577 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.578 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82130d25-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:50 compute-0 kernel: tap82130d25-f0: entered promiscuous mode
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.580 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 NetworkManager[55691]: <info>  [1764936650.5809] manager: (tap82130d25-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.583 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.588 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82130d25-f0, col_values=(('external_ids', {'iface-id': 'f81c4a80-27d3-4231-a37a-7c231838aca7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.589 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 ovn_controller[95610]: 2025-12-05T12:10:50Z|00754|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.592 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.595 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82130d25-ff6c-480e-884d-f3d97b6fd9be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82130d25-ff6c-480e-884d-f3d97b6fd9be.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.595 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[22a4e363-79a3-4969-87c9-0646711b96e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.596 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/82130d25-ff6c-480e-884d-f3d97b6fd9be.pid.haproxy
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.597 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'env', 'PROCESS_TAG=haproxy-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82130d25-ff6c-480e-884d-f3d97b6fd9be.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.603 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.665 187212 INFO nova.compute.manager [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Took 0.42 seconds to destroy the instance on the hypervisor.
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.666 187212 DEBUG oslo.service.loopingcall [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.666 187212 DEBUG nova.compute.manager [-] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.667 187212 DEBUG nova.network.neutron [-] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:10:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:50.953 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.958 187212 DEBUG nova.network.neutron [req-e22dbf9f-c40d-46f4-88ce-aeb216f2cdc7 req-508a8509-5d02-4b52-b768-81cfcc909283 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updated VIF entry in instance network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.959 187212 DEBUG nova.network.neutron [req-e22dbf9f-c40d-46f4-88ce-aeb216f2cdc7 req-508a8509-5d02-4b52-b768-81cfcc909283 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.960 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:50 compute-0 nova_compute[187208]: 2025-12-05 12:10:50.985 187212 DEBUG oslo_concurrency.lockutils [req-e22dbf9f-c40d-46f4-88ce-aeb216f2cdc7 req-508a8509-5d02-4b52-b768-81cfcc909283 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:51 compute-0 podman[232829]: 2025-12-05 12:10:51.104748032 +0000 UTC m=+0.061559997 container create 9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:10:51 compute-0 systemd[1]: Started libpod-conmon-9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b.scope.
Dec 05 12:10:51 compute-0 podman[232829]: 2025-12-05 12:10:51.066756042 +0000 UTC m=+0.023568057 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:10:51 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:10:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5da39cf8afffca99931577d63971bef13ebdbc9b82672cc00526d085f11c3c50/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:10:51 compute-0 podman[232829]: 2025-12-05 12:10:51.192630154 +0000 UTC m=+0.149442119 container init 9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 05 12:10:51 compute-0 podman[232829]: 2025-12-05 12:10:51.198824232 +0000 UTC m=+0.155636197 container start 9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 12:10:51 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[232845]: [NOTICE]   (232849) : New worker (232851) forked
Dec 05 12:10:51 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[232845]: [NOTICE]   (232849) : Loading success.
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.258 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 10dc6775-d9c9-40ca-bd05-41c56cffc744 in datapath f97e8b9d-fb9c-4712-b30e-e03f0b0d85da unbound from our chassis
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.260 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f97e8b9d-fb9c-4712-b30e-e03f0b0d85da, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.261 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[59af3f3a-b6c6-4db9-b2b1-5714bddb82be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.261 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da namespace which is not needed anymore
Dec 05 12:10:51 compute-0 neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da[232506]: [NOTICE]   (232510) : haproxy version is 2.8.14-c23fe91
Dec 05 12:10:51 compute-0 neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da[232506]: [NOTICE]   (232510) : path to executable is /usr/sbin/haproxy
Dec 05 12:10:51 compute-0 neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da[232506]: [ALERT]    (232510) : Current worker (232512) exited with code 143 (Terminated)
Dec 05 12:10:51 compute-0 neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da[232506]: [WARNING]  (232510) : All workers exited. Exiting... (0)
Dec 05 12:10:51 compute-0 systemd[1]: libpod-51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d.scope: Deactivated successfully.
Dec 05 12:10:51 compute-0 podman[232878]: 2025-12-05 12:10:51.404820544 +0000 UTC m=+0.058856230 container died 51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 12:10:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-831e2f239e994d4e8099b3d44bb3baf813987ee1ff02110006e4a66e930f8440-merged.mount: Deactivated successfully.
Dec 05 12:10:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d-userdata-shm.mount: Deactivated successfully.
Dec 05 12:10:51 compute-0 podman[232878]: 2025-12-05 12:10:51.461542081 +0000 UTC m=+0.115577747 container cleanup 51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:10:51 compute-0 systemd[1]: libpod-conmon-51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d.scope: Deactivated successfully.
Dec 05 12:10:51 compute-0 podman[232908]: 2025-12-05 12:10:51.525362403 +0000 UTC m=+0.038710452 container remove 51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.535 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b0804120-929d-4e84-9e27-509ee7594b2a]: (4, ('Fri Dec  5 12:10:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da (51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d)\n51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d\nFri Dec  5 12:10:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da (51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d)\n51eb569c746a99ff191df31bc473c714e5a7f66f1a0c229787fd99a63497de2d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.537 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05a2ad8a-1175-415d-923a-10640c85b60b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.538 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf97e8b9d-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:10:51 compute-0 nova_compute[187208]: 2025-12-05 12:10:51.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:51 compute-0 kernel: tapf97e8b9d-f0: left promiscuous mode
Dec 05 12:10:51 compute-0 nova_compute[187208]: 2025-12-05 12:10:51.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:51 compute-0 nova_compute[187208]: 2025-12-05 12:10:51.596 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.597 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[23bd530b-dad3-4bfb-a5ad-42c00fb487c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.610 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b08bffb7-1e90-4553-83ed-f71db30350a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.611 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[981e6dce-e127-4670-a7bf-b742d6db4b07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.627 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[816bcd56-ce91-43be-9c12-36daac0c0731]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402315, 'reachable_time': 41617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232923, 'error': None, 'target': 'ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.629 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f97e8b9d-fb9c-4712-b30e-e03f0b0d85da deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.629 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[3fbdf452-42a6-4c17-b583-039769a662ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:10:51 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:10:51.630 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:10:51 compute-0 systemd[1]: run-netns-ovnmeta\x2df97e8b9d\x2dfb9c\x2d4712\x2db30e\x2de03f0b0d85da.mount: Deactivated successfully.
Dec 05 12:10:52 compute-0 nova_compute[187208]: 2025-12-05 12:10:52.529 187212 DEBUG nova.network.neutron [req-caa3cdf7-2fd2-495d-bc12-29c76724b7d4 req-104a2b8a-ce80-4f0e-b06d-494e26238b12 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updated VIF entry in instance network info cache for port e30774db-d3d3-4438-b68a-6f7855f55128. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:10:52 compute-0 nova_compute[187208]: 2025-12-05 12:10:52.529 187212 DEBUG nova.network.neutron [req-caa3cdf7-2fd2-495d-bc12-29c76724b7d4 req-104a2b8a-ce80-4f0e-b06d-494e26238b12 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:52 compute-0 nova_compute[187208]: 2025-12-05 12:10:52.838 187212 DEBUG oslo_concurrency.lockutils [req-caa3cdf7-2fd2-495d-bc12-29c76724b7d4 req-104a2b8a-ce80-4f0e-b06d-494e26238b12 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.000 187212 DEBUG nova.network.neutron [-] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.035 187212 INFO nova.compute.manager [-] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Took 2.37 seconds to deallocate network for instance.
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.089 187212 DEBUG oslo_concurrency.lockutils [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.090 187212 DEBUG oslo_concurrency.lockutils [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.136 187212 DEBUG nova.compute.manager [req-5e5e5ec9-ff80-48d8-8914-75fed84ae7bf req-a17bda68-a2b4-467d-abeb-cc5fb6c9eb8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.136 187212 DEBUG nova.compute.manager [req-5e5e5ec9-ff80-48d8-8914-75fed84ae7bf req-a17bda68-a2b4-467d-abeb-cc5fb6c9eb8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing instance network info cache due to event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.137 187212 DEBUG oslo_concurrency.lockutils [req-5e5e5ec9-ff80-48d8-8914-75fed84ae7bf req-a17bda68-a2b4-467d-abeb-cc5fb6c9eb8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.137 187212 DEBUG oslo_concurrency.lockutils [req-5e5e5ec9-ff80-48d8-8914-75fed84ae7bf req-a17bda68-a2b4-467d-abeb-cc5fb6c9eb8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.137 187212 DEBUG nova.network.neutron [req-5e5e5ec9-ff80-48d8-8914-75fed84ae7bf req-a17bda68-a2b4-467d-abeb-cc5fb6c9eb8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.246 187212 DEBUG nova.compute.provider_tree [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.272 187212 DEBUG nova.scheduler.client.report [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.317 187212 DEBUG oslo_concurrency.lockutils [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:53 compute-0 nova_compute[187208]: 2025-12-05 12:10:53.524 187212 INFO nova.scheduler.client.report [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Deleted allocations for instance e689e2f0-16e9-402a-986e-a769d72fa0bd
Dec 05 12:10:54 compute-0 nova_compute[187208]: 2025-12-05 12:10:54.163 187212 DEBUG nova.compute.manager [req-1cbdfef5-31d3-4f17-9f50-e94e068c1026 req-8e65b00c-9c48-49f7-b45c-3c065f189ec0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Received event network-vif-deleted-10dc6775-d9c9-40ca-bd05-41c56cffc744 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:54 compute-0 nova_compute[187208]: 2025-12-05 12:10:54.167 187212 DEBUG oslo_concurrency.lockutils [None req-f80e9296-15cc-48fe-a0bf-b11b86a531f1 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:54 compute-0 nova_compute[187208]: 2025-12-05 12:10:54.216 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:54 compute-0 ovn_controller[95610]: 2025-12-05T12:10:54Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a2:40:d1 10.100.0.8
Dec 05 12:10:54 compute-0 ovn_controller[95610]: 2025-12-05T12:10:54Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:40:d1 10.100.0.8
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.521 187212 DEBUG nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.522 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.523 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.523 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.523 187212 DEBUG nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Processing event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.524 187212 DEBUG nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.524 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.524 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.525 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.525 187212 DEBUG nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.525 187212 WARNING nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state building and task_state spawning.
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.525 187212 DEBUG nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Received event network-vif-unplugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.526 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.526 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.526 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.527 187212 DEBUG nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] No waiting events found dispatching network-vif-unplugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.527 187212 WARNING nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Received unexpected event network-vif-unplugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 for instance with vm_state deleted and task_state None.
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.527 187212 DEBUG nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Received event network-vif-plugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.528 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.528 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.528 187212 DEBUG oslo_concurrency.lockutils [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.528 187212 DEBUG nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] No waiting events found dispatching network-vif-plugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.529 187212 WARNING nova.compute.manager [req-87b8a007-2299-49a9-9265-b480da710cd0 req-ebe4f9b1-2f93-44b0-9147-1f53a8840f39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Received unexpected event network-vif-plugged-10dc6775-d9c9-40ca-bd05-41c56cffc744 for instance with vm_state deleted and task_state None.
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.530 187212 DEBUG nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.530 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.540 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.541 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936655.5402358, 28e48516-8665-4d98-a92d-c84b7da9a284 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.542 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Resumed (Lifecycle Event)
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.547 187212 INFO nova.virt.libvirt.driver [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance spawned successfully.
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.547 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.666 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.671 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.675 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.676 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.676 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.677 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.677 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:55 compute-0 nova_compute[187208]: 2025-12-05 12:10:55.677 187212 DEBUG nova.virt.libvirt.driver [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:10:56 compute-0 nova_compute[187208]: 2025-12-05 12:10:56.045 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:10:56 compute-0 podman[232943]: 2025-12-05 12:10:56.216247029 +0000 UTC m=+0.068961440 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:10:56 compute-0 nova_compute[187208]: 2025-12-05 12:10:56.563 187212 INFO nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Took 13.13 seconds to spawn the instance on the hypervisor.
Dec 05 12:10:56 compute-0 nova_compute[187208]: 2025-12-05 12:10:56.564 187212 DEBUG nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:56 compute-0 nova_compute[187208]: 2025-12-05 12:10:56.637 187212 INFO nova.compute.manager [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Took 13.88 seconds to build instance.
Dec 05 12:10:56 compute-0 nova_compute[187208]: 2025-12-05 12:10:56.656 187212 DEBUG oslo_concurrency.lockutils [None req-d0b1b90a-7369-488c-929d-d7f14e473e8b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.989s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.056 187212 DEBUG nova.network.neutron [req-5e5e5ec9-ff80-48d8-8914-75fed84ae7bf req-a17bda68-a2b4-467d-abeb-cc5fb6c9eb8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updated VIF entry in instance network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.057 187212 DEBUG nova.network.neutron [req-5e5e5ec9-ff80-48d8-8914-75fed84ae7bf req-a17bda68-a2b4-467d-abeb-cc5fb6c9eb8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.084 187212 INFO nova.compute.manager [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Rebuilding instance
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.090 187212 DEBUG oslo_concurrency.lockutils [req-5e5e5ec9-ff80-48d8-8914-75fed84ae7bf req-a17bda68-a2b4-467d-abeb-cc5fb6c9eb8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.290 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.422 187212 DEBUG nova.compute.manager [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.563 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_requests' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.574 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_devices' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.586 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'resources' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.597 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'migration_context' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.612 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:10:58 compute-0 nova_compute[187208]: 2025-12-05 12:10:58.616 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:10:58 compute-0 ovn_controller[95610]: 2025-12-05T12:10:58Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:56:54:21 10.100.0.12
Dec 05 12:10:58 compute-0 ovn_controller[95610]: 2025-12-05T12:10:58Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:56:54:21 10.100.0.12
Dec 05 12:10:58 compute-0 ovn_controller[95610]: 2025-12-05T12:10:58Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:a5:e5 10.100.0.9
Dec 05 12:10:58 compute-0 ovn_controller[95610]: 2025-12-05T12:10:58Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:a5:e5 10.100.0.9
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.533 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.708 187212 DEBUG nova.compute.manager [req-524e0876-eca1-482d-b5cd-c136c04c24c5 req-c08cf4d1-10f0-4a82-81e7-a1c5b4a5a06f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.708 187212 DEBUG nova.compute.manager [req-524e0876-eca1-482d-b5cd-c136c04c24c5 req-c08cf4d1-10f0-4a82-81e7-a1c5b4a5a06f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing instance network info cache due to event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.709 187212 DEBUG oslo_concurrency.lockutils [req-524e0876-eca1-482d-b5cd-c136c04c24c5 req-c08cf4d1-10f0-4a82-81e7-a1c5b4a5a06f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.709 187212 DEBUG oslo_concurrency.lockutils [req-524e0876-eca1-482d-b5cd-c136c04c24c5 req-c08cf4d1-10f0-4a82-81e7-a1c5b4a5a06f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.709 187212 DEBUG nova.network.neutron [req-524e0876-eca1-482d-b5cd-c136c04c24c5 req-c08cf4d1-10f0-4a82-81e7-a1c5b4a5a06f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.755 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.755 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.775 187212 DEBUG nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.852 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.853 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.864 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.864 187212 INFO nova.compute.claims [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.981 187212 DEBUG oslo_concurrency.lockutils [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-54d9605a-998b-4492-afc8-f7a5b0dd4e84-7b183eee-c877-4387-a2f2-78923af9a88b" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.981 187212 DEBUG oslo_concurrency.lockutils [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-54d9605a-998b-4492-afc8-f7a5b0dd4e84-7b183eee-c877-4387-a2f2-78923af9a88b" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:00 compute-0 nova_compute[187208]: 2025-12-05 12:11:00.982 187212 DEBUG nova.objects.instance [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid 54d9605a-998b-4492-afc8-f7a5b0dd4e84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.197 187212 DEBUG nova.compute.provider_tree [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.218 187212 DEBUG nova.scheduler.client.report [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.250 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.251 187212 DEBUG nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.272 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "235e69b1-e04e-4d65-92e2-864b105f03cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.272 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.307 187212 DEBUG nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.312 187212 DEBUG nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.313 187212 DEBUG nova.network.neutron [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.349 187212 INFO nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.373 187212 DEBUG nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.401 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.402 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.412 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.412 187212 INFO nova.compute.claims [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.603 187212 DEBUG nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.604 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.605 187212 INFO nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Creating image(s)
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.605 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "/var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.605 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "/var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.606 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "/var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.623 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:01.632 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.682 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.683 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.683 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.694 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.750 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.751 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.791 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.792 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.792 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.852 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.854 187212 DEBUG nova.virt.disk.api [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Checking if we can resize image /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.855 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.915 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.917 187212 DEBUG nova.virt.disk.api [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Cannot resize image /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:11:01 compute-0 nova_compute[187208]: 2025-12-05 12:11:01.917 187212 DEBUG nova.objects.instance [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'migration_context' on Instance uuid 9147fd8a-b772-485b-8dff-7bb1a235c4dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.062 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.062 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Ensure instance console log exists: /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.064 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.064 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.065 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:02 compute-0 kernel: tap8c343187-71 (unregistering): left promiscuous mode
Dec 05 12:11:02 compute-0 NetworkManager[55691]: <info>  [1764936662.0967] device (tap8c343187-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.106 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:02 compute-0 ovn_controller[95610]: 2025-12-05T12:11:02Z|00755|binding|INFO|Releasing lport 8c343187-712d-4aee-9c47-18497ec1042e from this chassis (sb_readonly=0)
Dec 05 12:11:02 compute-0 ovn_controller[95610]: 2025-12-05T12:11:02Z|00756|binding|INFO|Setting lport 8c343187-712d-4aee-9c47-18497ec1042e down in Southbound
Dec 05 12:11:02 compute-0 ovn_controller[95610]: 2025-12-05T12:11:02Z|00757|binding|INFO|Removing iface tap8c343187-71 ovn-installed in OVS
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.112 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.115 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:54:21 10.100.0.12'], port_security=['fa:16:3e:56:54:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '8fe1c6df-f787-4c56-b3e7-899cf5e9f723', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8c343187-712d-4aee-9c47-18497ec1042e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.116 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8c343187-712d-4aee-9c47-18497ec1042e in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c unbound from our chassis
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.118 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.123 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[053a9a9f-aa1a-426a-937a-7f8de2b788b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.124 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace which is not needed anymore
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.132 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.160 187212 DEBUG nova.objects.instance [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid 54d9605a-998b-4492-afc8-f7a5b0dd4e84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.164 187212 DEBUG nova.compute.provider_tree [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:02 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Dec 05 12:11:02 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d0000004d.scope: Consumed 13.467s CPU time.
Dec 05 12:11:02 compute-0 systemd-machined[153543]: Machine qemu-87-instance-0000004d terminated.
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.179 187212 DEBUG nova.network.neutron [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.183 187212 DEBUG nova.scheduler.client.report [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.212 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.213 187212 DEBUG nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:11:02 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[232369]: [NOTICE]   (232392) : haproxy version is 2.8.14-c23fe91
Dec 05 12:11:02 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[232369]: [NOTICE]   (232392) : path to executable is /usr/sbin/haproxy
Dec 05 12:11:02 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[232369]: [WARNING]  (232392) : Exiting Master process...
Dec 05 12:11:02 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[232369]: [ALERT]    (232392) : Current worker (232394) exited with code 143 (Terminated)
Dec 05 12:11:02 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[232369]: [WARNING]  (232392) : All workers exited. Exiting... (0)
Dec 05 12:11:02 compute-0 systemd[1]: libpod-8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740.scope: Deactivated successfully.
Dec 05 12:11:02 compute-0 podman[233033]: 2025-12-05 12:11:02.263280582 +0000 UTC m=+0.049530622 container died 8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.271 187212 DEBUG nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.271 187212 DEBUG nova.network.neutron [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:11:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740-userdata-shm.mount: Deactivated successfully.
Dec 05 12:11:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-f57f6f5ca920f6b052ac4a2d8205d8028c66e8b7337dd5e242b29ed149ec9c59-merged.mount: Deactivated successfully.
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.295 187212 INFO nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:11:02 compute-0 podman[233033]: 2025-12-05 12:11:02.299532563 +0000 UTC m=+0.085782603 container cleanup 8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:11:02 compute-0 systemd[1]: libpod-conmon-8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740.scope: Deactivated successfully.
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.317 187212 DEBUG nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.335 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:02 compute-0 podman[233061]: 2025-12-05 12:11:02.376571863 +0000 UTC m=+0.056045959 container remove 8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.381 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0321a2-ad4d-474b-95b0-cb26eeb6510f]: (4, ('Fri Dec  5 12:11:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740)\n8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740\nFri Dec  5 12:11:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740)\n8ba1d880db173613fdeac6fb0cfee6943b5b871374e8cd3683f014de97496740\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.383 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a549c549-6159-42ef-a919-33848124f778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.384 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:02 compute-0 kernel: tap7be4540a-00: left promiscuous mode
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.386 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.403 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.406 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca63d71-3629-4e45-b44b-206df0784a85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.417 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e07582e4-84d6-4266-abfa-1a0f89e275b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.418 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0599e002-5644-4a3c-935e-d6dfc1258115]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.434 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[014a825d-da9d-4f01-9ad0-d796dfd81a69]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402108, 'reachable_time': 32005, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233092, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d7be4540a\x2d0e0d\x2d45dd\x2d9ed3\x2d2c2701ae3e2c.mount: Deactivated successfully.
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.437 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:11:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:02.437 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[83e8edca-a998-4cdb-bfa4-8cc34e3dbc49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.447 187212 DEBUG nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.448 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.448 187212 INFO nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Creating image(s)
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.449 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "/var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.449 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.449 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.462 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.523 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.524 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.524 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.535 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.598 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.599 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.634 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.636 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.636 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.658 187212 INFO nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance shutdown successfully after 4 seconds.
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.664 187212 INFO nova.virt.libvirt.driver [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance destroyed successfully.
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.670 187212 INFO nova.virt.libvirt.driver [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance destroyed successfully.
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.671 187212 DEBUG nova.virt.libvirt.vif [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2029374639',display_name='tempest-ServerDiskConfigTestJSON-server-2029374639',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2029374639',id=77,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ep0a320q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-m
ember'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:57Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=8fe1c6df-f787-4c56-b3e7-899cf5e9f723,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.672 187212 DEBUG nova.network.os_vif_util [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.673 187212 DEBUG nova.network.os_vif_util [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.673 187212 DEBUG os_vif [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.676 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.676 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c343187-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.696 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.696 187212 DEBUG nova.virt.disk.api [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Checking if we can resize image /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.697 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.726 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.730 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.732 187212 INFO os_vif [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71')
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.733 187212 INFO nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Deleting instance files /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723_del
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.734 187212 INFO nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Deletion of /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723_del complete
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.774 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.775 187212 DEBUG nova.virt.disk.api [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Cannot resize image /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.776 187212 DEBUG nova.objects.instance [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'migration_context' on Instance uuid 235e69b1-e04e-4d65-92e2-864b105f03cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.794 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.794 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Ensure instance console log exists: /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.795 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.795 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.796 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.810 187212 DEBUG nova.policy [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.828 187212 DEBUG nova.policy [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.968 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.969 187212 INFO nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Creating image(s)
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.970 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.970 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.971 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:02 compute-0 nova_compute[187208]: 2025-12-05 12:11:02.990 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:03.016 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:03.016 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:03.017 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.051 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.052 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.052 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.064 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.090 187212 DEBUG nova.policy [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.121 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.121 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.153 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.154 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.155 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.210 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.211 187212 DEBUG nova.virt.disk.api [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Checking if we can resize image /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.211 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.267 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.268 187212 DEBUG nova.virt.disk.api [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Cannot resize image /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.268 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.269 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Ensure instance console log exists: /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.269 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.269 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.270 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.272 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start _get_guest_xml network_info=[{"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.276 187212 WARNING nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.280 187212 DEBUG nova.virt.libvirt.host [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.281 187212 DEBUG nova.virt.libvirt.host [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.284 187212 DEBUG nova.virt.libvirt.host [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.284 187212 DEBUG nova.virt.libvirt.host [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.284 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.285 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.285 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.285 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.286 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.286 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.286 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.286 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.287 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.287 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.287 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.288 187212 DEBUG nova.virt.hardware [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.288 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.327 187212 DEBUG nova.virt.libvirt.vif [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:10:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2029374639',display_name='tempest-ServerDiskConfigTestJSON-server-2029374639',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2029374639',id=77,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ep0a320q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest
-ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:02Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=8fe1c6df-f787-4c56-b3e7-899cf5e9f723,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.327 187212 DEBUG nova.network.os_vif_util [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.328 187212 DEBUG nova.network.os_vif_util [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.329 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <uuid>8fe1c6df-f787-4c56-b3e7-899cf5e9f723</uuid>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <name>instance-0000004d</name>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-2029374639</nova:name>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:11:03</nova:creationTime>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:11:03 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:11:03 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:11:03 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:11:03 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:03 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:11:03 compute-0 nova_compute[187208]:         <nova:user uuid="ef254bb2df0442c6bcadfb3a6861c0e9">tempest-ServerDiskConfigTestJSON-1245488084-project-member</nova:user>
Dec 05 12:11:03 compute-0 nova_compute[187208]:         <nova:project uuid="e836357870d746e49bc783da7cd3accd">tempest-ServerDiskConfigTestJSON-1245488084</nova:project>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:11:03 compute-0 nova_compute[187208]:         <nova:port uuid="8c343187-712d-4aee-9c47-18497ec1042e">
Dec 05 12:11:03 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <entry name="serial">8fe1c6df-f787-4c56-b3e7-899cf5e9f723</entry>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <entry name="uuid">8fe1c6df-f787-4c56-b3e7-899cf5e9f723</entry>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.config"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:56:54:21"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <target dev="tap8c343187-71"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/console.log" append="off"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:11:03 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:11:03 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:03 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:03 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:03 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.330 187212 DEBUG nova.compute.manager [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Preparing to wait for external event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.330 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.331 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.331 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.331 187212 DEBUG nova.virt.libvirt.vif [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:10:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2029374639',display_name='tempest-ServerDiskConfigTestJSON-server-2029374639',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2029374639',id=77,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ep0a320q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:02Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=8fe1c6df-f787-4c56-b3e7-899cf5e9f723,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.332 187212 DEBUG nova.network.os_vif_util [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.332 187212 DEBUG nova.network.os_vif_util [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.332 187212 DEBUG os_vif [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.333 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.334 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.338 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.338 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c343187-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.339 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8c343187-71, col_values=(('external_ids', {'iface-id': '8c343187-712d-4aee-9c47-18497ec1042e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:54:21', 'vm-uuid': '8fe1c6df-f787-4c56-b3e7-899cf5e9f723'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:03 compute-0 NetworkManager[55691]: <info>  [1764936663.3415] manager: (tap8c343187-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.343 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.345 187212 INFO os_vif [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71')
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.435 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.435 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.436 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No VIF found with MAC fa:16:3e:56:54:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.437 187212 INFO nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Using config drive
Dec 05 12:11:03 compute-0 podman[233130]: 2025-12-05 12:11:03.442921675 +0000 UTC m=+0.064172612 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.458 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'ec2_ids' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:03 compute-0 podman[233131]: 2025-12-05 12:11:03.461118657 +0000 UTC m=+0.081621333 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 05 12:11:03 compute-0 nova_compute[187208]: 2025-12-05 12:11:03.492 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'keypairs' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.026 187212 INFO nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Creating config drive at /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.config
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.031 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyyawonoz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.174 187212 DEBUG oslo_concurrency.processutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyyawonoz" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:04 compute-0 kernel: tap8c343187-71: entered promiscuous mode
Dec 05 12:11:04 compute-0 ovn_controller[95610]: 2025-12-05T12:11:04Z|00758|binding|INFO|Claiming lport 8c343187-712d-4aee-9c47-18497ec1042e for this chassis.
Dec 05 12:11:04 compute-0 systemd-udevd[233013]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:04 compute-0 ovn_controller[95610]: 2025-12-05T12:11:04Z|00759|binding|INFO|8c343187-712d-4aee-9c47-18497ec1042e: Claiming fa:16:3e:56:54:21 10.100.0.12
Dec 05 12:11:04 compute-0 NetworkManager[55691]: <info>  [1764936664.2459] manager: (tap8c343187-71): new Tun device (/org/freedesktop/NetworkManager/Devices/302)
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.246 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.255 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:54:21 10.100.0.12'], port_security=['fa:16:3e:56:54:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '8fe1c6df-f787-4c56-b3e7-899cf5e9f723', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8c343187-712d-4aee-9c47-18497ec1042e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:04 compute-0 NetworkManager[55691]: <info>  [1764936664.2572] device (tap8c343187-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:04 compute-0 NetworkManager[55691]: <info>  [1764936664.2584] device (tap8c343187-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.256 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8c343187-712d-4aee-9c47-18497ec1042e in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c bound to our chassis
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.258 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.258 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:04 compute-0 ovn_controller[95610]: 2025-12-05T12:11:04Z|00760|binding|INFO|Setting lport 8c343187-712d-4aee-9c47-18497ec1042e ovn-installed in OVS
Dec 05 12:11:04 compute-0 ovn_controller[95610]: 2025-12-05T12:11:04Z|00761|binding|INFO|Setting lport 8c343187-712d-4aee-9c47-18497ec1042e up in Southbound
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.261 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.265 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.270 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[15e1d8b3-0648-472d-afcd-19055c7b87c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.271 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7be4540a-01 in ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.273 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7be4540a-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.273 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[911089c5-e0fb-4830-80e5-b3be07927911]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.274 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c9526ccc-a910-4f4e-b341-6db536f66957]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.289 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[4359d1b1-e055-481a-b7b4-001f926f8db3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 systemd-machined[153543]: New machine qemu-91-instance-0000004d.
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa422f3-ddd3-4eb0-90be-97add2f75795]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 systemd[1]: Started Virtual Machine qemu-91-instance-0000004d.
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.346 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ed87f210-9f1a-4a76-a1eb-c46bc347cc1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 NetworkManager[55691]: <info>  [1764936664.3558] manager: (tap7be4540a-00): new Veth device (/org/freedesktop/NetworkManager/Devices/303)
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.355 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a1c68d-9d56-4ce9-9f50-04f24dac42e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.390 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2be98467-8588-40e8-8ea6-0ff9d0a25a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.394 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[576f922c-060d-4f6e-82c3-6e29e82ed858]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 NetworkManager[55691]: <info>  [1764936664.4170] device (tap7be4540a-00): carrier: link connected
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.425 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a808cf76-376e-4577-b89b-1595f1ed128e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.443 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5985e0c5-33d4-4737-9754-f78ee07b1d1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404498, 'reachable_time': 21790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233216, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.463 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7081710c-b343-4350-8f68-d8c8658815be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:4893'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404498, 'tstamp': 404498}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233217, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.483 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e4f2127-f1d7-4df5-89b9-e3ce836218fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404498, 'reachable_time': 21790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233218, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.494 187212 DEBUG nova.network.neutron [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Successfully updated port: 7b183eee-c877-4387-a2f2-78923af9a88b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.512 187212 DEBUG oslo_concurrency.lockutils [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.512 187212 DEBUG oslo_concurrency.lockutils [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.512 187212 DEBUG nova.network.neutron [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.526 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7020e9e5-a044-485a-aa9e-b4215ae6e071]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.595 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[22c3abba-8e8e-42de-ac24-c73f2c32bb76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.599 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.599 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.601 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7be4540a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:04 compute-0 NetworkManager[55691]: <info>  [1764936664.6049] manager: (tap7be4540a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Dec 05 12:11:04 compute-0 kernel: tap7be4540a-00: entered promiscuous mode
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.605 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.608 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7be4540a-00, col_values=(('external_ids', {'iface-id': '4dcf8e96-bf04-4914-959a-aad071dfa454'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:04 compute-0 ovn_controller[95610]: 2025-12-05T12:11:04Z|00762|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.634 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.634 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.635 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e08ada68-c254-4ba5-b1b3-6b262d887ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.636 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:11:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:04.638 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'env', 'PROCESS_TAG=haproxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:11:04 compute-0 nova_compute[187208]: 2025-12-05 12:11:04.943 187212 WARNING nova.network.neutron [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.018 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.018 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936665.0177255, 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.018 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] VM Started (Lifecycle Event)
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.044 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.048 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936665.017947, 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.048 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] VM Paused (Lifecycle Event)
Dec 05 12:11:05 compute-0 podman[233256]: 2025-12-05 12:11:04.972934003 +0000 UTC m=+0.018242605 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.072 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.075 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.095 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:11:05 compute-0 podman[233256]: 2025-12-05 12:11:05.140968004 +0000 UTC m=+0.186276586 container create 36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:11:05 compute-0 systemd[1]: Started libpod-conmon-36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59.scope.
Dec 05 12:11:05 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:11:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/590ee7311e2c20992106ed0c9fa1f31e967ba25b2776e9022a31d643ac47a1db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:11:05 compute-0 podman[233256]: 2025-12-05 12:11:05.233235292 +0000 UTC m=+0.278543904 container init 36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 12:11:05 compute-0 podman[233256]: 2025-12-05 12:11:05.238912595 +0000 UTC m=+0.284221187 container start 36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:11:05 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233271]: [NOTICE]   (233275) : New worker (233277) forked
Dec 05 12:11:05 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233271]: [NOTICE]   (233275) : Loading success.
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.594 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936650.5000813, e689e2f0-16e9-402a-986e-a769d72fa0bd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.594 187212 INFO nova.compute.manager [-] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] VM Stopped (Lifecycle Event)
Dec 05 12:11:05 compute-0 nova_compute[187208]: 2025-12-05 12:11:05.624 187212 DEBUG nova.compute.manager [None req-99485a10-4bb2-4590-977d-fccbc50c9b5d - - - - - -] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:06 compute-0 nova_compute[187208]: 2025-12-05 12:11:06.316 187212 DEBUG nova.network.neutron [req-524e0876-eca1-482d-b5cd-c136c04c24c5 req-c08cf4d1-10f0-4a82-81e7-a1c5b4a5a06f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updated VIF entry in instance network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:06 compute-0 nova_compute[187208]: 2025-12-05 12:11:06.317 187212 DEBUG nova.network.neutron [req-524e0876-eca1-482d-b5cd-c136c04c24c5 req-c08cf4d1-10f0-4a82-81e7-a1c5b4a5a06f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:06 compute-0 nova_compute[187208]: 2025-12-05 12:11:06.347 187212 DEBUG oslo_concurrency.lockutils [req-524e0876-eca1-482d-b5cd-c136c04c24c5 req-c08cf4d1-10f0-4a82-81e7-a1c5b4a5a06f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:06 compute-0 nova_compute[187208]: 2025-12-05 12:11:06.658 187212 DEBUG nova.network.neutron [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Successfully created port: e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:11:06 compute-0 nova_compute[187208]: 2025-12-05 12:11:06.790 187212 DEBUG nova.network.neutron [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Successfully created port: 4568b7d9-2870-421a-abd0-1598977ec82d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:11:07 compute-0 ovn_controller[95610]: 2025-12-05T12:11:07Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:50:8e:78 10.100.0.9
Dec 05 12:11:07 compute-0 ovn_controller[95610]: 2025-12-05T12:11:07Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:50:8e:78 10.100.0.9
Dec 05 12:11:07 compute-0 nova_compute[187208]: 2025-12-05 12:11:07.570 187212 DEBUG nova.compute.manager [req-908bf710-9717-44b2-99ba-d1aa1ac1b774 req-3b24dbd6-b0da-4c3b-928d-6b2a2921639a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:07 compute-0 nova_compute[187208]: 2025-12-05 12:11:07.571 187212 DEBUG nova.compute.manager [req-908bf710-9717-44b2-99ba-d1aa1ac1b774 req-3b24dbd6-b0da-4c3b-928d-6b2a2921639a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing instance network info cache due to event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:07 compute-0 nova_compute[187208]: 2025-12-05 12:11:07.572 187212 DEBUG oslo_concurrency.lockutils [req-908bf710-9717-44b2-99ba-d1aa1ac1b774 req-3b24dbd6-b0da-4c3b-928d-6b2a2921639a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:08 compute-0 nova_compute[187208]: 2025-12-05 12:11:08.170 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:08 compute-0 nova_compute[187208]: 2025-12-05 12:11:08.342 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.182 187212 DEBUG nova.network.neutron [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:09 compute-0 podman[233300]: 2025-12-05 12:11:09.201900762 +0000 UTC m=+0.058442648 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.215 187212 DEBUG oslo_concurrency.lockutils [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.216 187212 DEBUG oslo_concurrency.lockutils [req-908bf710-9717-44b2-99ba-d1aa1ac1b774 req-3b24dbd6-b0da-4c3b-928d-6b2a2921639a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.217 187212 DEBUG nova.network.neutron [req-908bf710-9717-44b2-99ba-d1aa1ac1b774 req-3b24dbd6-b0da-4c3b-928d-6b2a2921639a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.219 187212 DEBUG nova.virt.libvirt.vif [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-569275018',display_name='tempest-tempest.common.compute-instance-569275018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-569275018',id=74,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-h1nkz7rb',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=54d9605a-998b-4492-afc8-f7a5b0dd4e84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.219 187212 DEBUG nova.network.os_vif_util [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.220 187212 DEBUG nova.network.os_vif_util [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.220 187212 DEBUG os_vif [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.221 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.221 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.221 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.224 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.224 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b183eee-c8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.224 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b183eee-c8, col_values=(('external_ids', {'iface-id': '7b183eee-c877-4387-a2f2-78923af9a88b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:13:10', 'vm-uuid': '54d9605a-998b-4492-afc8-f7a5b0dd4e84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 NetworkManager[55691]: <info>  [1764936669.2275] manager: (tap7b183eee-c8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.227 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.236 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.238 187212 INFO os_vif [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8')
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.240 187212 DEBUG nova.virt.libvirt.vif [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-569275018',display_name='tempest-tempest.common.compute-instance-569275018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-569275018',id=74,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-h1nkz7rb',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=54d9605a-998b-4492-afc8-f7a5b0dd4e84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.240 187212 DEBUG nova.network.os_vif_util [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.241 187212 DEBUG nova.network.os_vif_util [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.243 187212 DEBUG nova.virt.libvirt.guest [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:58:13:10"/>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <target dev="tap7b183eee-c8"/>
Dec 05 12:11:09 compute-0 nova_compute[187208]: </interface>
Dec 05 12:11:09 compute-0 nova_compute[187208]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 05 12:11:09 compute-0 kernel: tap7b183eee-c8: entered promiscuous mode
Dec 05 12:11:09 compute-0 NetworkManager[55691]: <info>  [1764936669.2552] manager: (tap7b183eee-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/306)
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.256 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 ovn_controller[95610]: 2025-12-05T12:11:09Z|00763|binding|INFO|Claiming lport 7b183eee-c877-4387-a2f2-78923af9a88b for this chassis.
Dec 05 12:11:09 compute-0 ovn_controller[95610]: 2025-12-05T12:11:09Z|00764|binding|INFO|7b183eee-c877-4387-a2f2-78923af9a88b: Claiming fa:16:3e:58:13:10 10.100.0.5
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.264 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:13:10 10.100.0.5'], port_security=['fa:16:3e:58:13:10 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1639127318', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1639127318', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7b183eee-c877-4387-a2f2-78923af9a88b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.266 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7b183eee-c877-4387-a2f2-78923af9a88b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.268 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:11:09 compute-0 podman[233301]: 2025-12-05 12:11:09.271940032 +0000 UTC m=+0.126265664 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:11:09 compute-0 ovn_controller[95610]: 2025-12-05T12:11:09Z|00765|binding|INFO|Setting lport 7b183eee-c877-4387-a2f2-78923af9a88b ovn-installed in OVS
Dec 05 12:11:09 compute-0 ovn_controller[95610]: 2025-12-05T12:11:09Z|00766|binding|INFO|Setting lport 7b183eee-c877-4387-a2f2-78923af9a88b up in Southbound
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.275 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.283 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b63c4c16-a059-4787-a306-56015b7e1f42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:09 compute-0 systemd-udevd[233356]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:09 compute-0 NetworkManager[55691]: <info>  [1764936669.2988] device (tap7b183eee-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:09 compute-0 NetworkManager[55691]: <info>  [1764936669.2997] device (tap7b183eee-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.313 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5e83b7cf-6432-4fe2-ae94-75ac75b390d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.316 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a31e4b-07ff-4e9f-8f29-1f05659b9ee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.342 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[682543cd-fdea-4bd6-98e2-d9dcc3a7aa91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.360 187212 DEBUG nova.virt.libvirt.driver [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.360 187212 DEBUG nova.virt.libvirt.driver [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.361 187212 DEBUG nova.virt.libvirt.driver [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:bd:e5:94, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.361 187212 DEBUG nova.virt.libvirt.driver [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:58:13:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.360 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0747f073-8930-438f-9793-bdedf280342e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233364, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.377 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3a773b3b-b2d6-49a1-9b30-4d8392744fb5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398326, 'tstamp': 398326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233365, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398330, 'tstamp': 398330}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233365, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.379 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.380 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.381 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.382 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.382 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.382 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:09.383 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.397 187212 DEBUG nova.virt.libvirt.guest [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <nova:name>tempest-tempest.common.compute-instance-569275018</nova:name>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:11:09</nova:creationTime>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:11:09 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     <nova:port uuid="ef99bad5-d092-46f6-9b3a-8225cc233d1e">
Dec 05 12:11:09 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     <nova:port uuid="7b183eee-c877-4387-a2f2-78923af9a88b">
Dec 05 12:11:09 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:11:09 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:09 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:11:09 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:11:09 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.429 187212 DEBUG oslo_concurrency.lockutils [None req-49c337fd-e8c9-4a6a-a076-5660e89c6acc 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-54d9605a-998b-4492-afc8-f7a5b0dd4e84-7b183eee-c877-4387-a2f2-78923af9a88b" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.448s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:09 compute-0 ovn_controller[95610]: 2025-12-05T12:11:09Z|00767|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:11:09 compute-0 ovn_controller[95610]: 2025-12-05T12:11:09Z|00768|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:11:09 compute-0 ovn_controller[95610]: 2025-12-05T12:11:09Z|00769|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:11:09 compute-0 ovn_controller[95610]: 2025-12-05T12:11:09Z|00770|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.496 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.659 187212 DEBUG nova.compute.manager [req-f5691496-31b0-4c1d-b414-3f1ca2cc9d68 req-61fb1e39-2142-46d8-970d-648b13f306ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-changed-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.660 187212 DEBUG nova.compute.manager [req-f5691496-31b0-4c1d-b414-3f1ca2cc9d68 req-61fb1e39-2142-46d8-970d-648b13f306ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing instance network info cache due to event network-changed-7b183eee-c877-4387-a2f2-78923af9a88b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.660 187212 DEBUG oslo_concurrency.lockutils [req-f5691496-31b0-4c1d-b414-3f1ca2cc9d68 req-61fb1e39-2142-46d8-970d-648b13f306ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.787 187212 DEBUG nova.network.neutron [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Successfully updated port: 4568b7d9-2870-421a-abd0-1598977ec82d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.805 187212 DEBUG nova.network.neutron [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Successfully updated port: e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.807 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "refresh_cache-235e69b1-e04e-4d65-92e2-864b105f03cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.807 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquired lock "refresh_cache-235e69b1-e04e-4d65-92e2-864b105f03cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.808 187212 DEBUG nova.network.neutron [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.825 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "refresh_cache-9147fd8a-b772-485b-8dff-7bb1a235c4dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.825 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquired lock "refresh_cache-9147fd8a-b772-485b-8dff-7bb1a235c4dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.825 187212 DEBUG nova.network.neutron [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:09 compute-0 nova_compute[187208]: 2025-12-05 12:11:09.995 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:10 compute-0 nova_compute[187208]: 2025-12-05 12:11:10.346 187212 DEBUG nova.network.neutron [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:11:10 compute-0 nova_compute[187208]: 2025-12-05 12:11:10.383 187212 DEBUG nova.network.neutron [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:11:11 compute-0 ovn_controller[95610]: 2025-12-05T12:11:11Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:13:10 10.100.0.5
Dec 05 12:11:11 compute-0 ovn_controller[95610]: 2025-12-05T12:11:11Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:13:10 10.100.0.5
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.420 187212 DEBUG nova.compute.manager [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-vif-unplugged-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.420 187212 DEBUG oslo_concurrency.lockutils [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.420 187212 DEBUG oslo_concurrency.lockutils [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.421 187212 DEBUG oslo_concurrency.lockutils [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.421 187212 DEBUG nova.compute.manager [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] No event matching network-vif-unplugged-8c343187-712d-4aee-9c47-18497ec1042e in dict_keys([('network-vif-plugged', '8c343187-712d-4aee-9c47-18497ec1042e')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.421 187212 WARNING nova.compute.manager [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received unexpected event network-vif-unplugged-8c343187-712d-4aee-9c47-18497ec1042e for instance with vm_state active and task_state rebuild_spawning.
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.421 187212 DEBUG nova.compute.manager [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.422 187212 DEBUG oslo_concurrency.lockutils [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.422 187212 DEBUG oslo_concurrency.lockutils [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.422 187212 DEBUG oslo_concurrency.lockutils [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.422 187212 DEBUG nova.compute.manager [req-346184aa-c8e2-4ecd-8a89-be6f568544d8 req-7e662e72-53c0-4d7f-8523-2aba41d8d3b3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Processing event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.423 187212 DEBUG nova.compute.manager [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.426 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936671.4266655, 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.427 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] VM Resumed (Lifecycle Event)
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.428 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.432 187212 INFO nova.virt.libvirt.driver [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance spawned successfully.
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.432 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.468 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.474 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.478 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.478 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.479 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.479 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.479 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.481 187212 DEBUG nova.virt.libvirt.driver [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.521 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.567 187212 DEBUG nova.compute.manager [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.864 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.864 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.867 187212 DEBUG nova.objects.instance [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:11:11 compute-0 nova_compute[187208]: 2025-12-05 12:11:11.938 187212 DEBUG oslo_concurrency.lockutils [None req-5dea4cf4-2769-4398-a910-633a5f9742e3 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:12 compute-0 podman[233366]: 2025-12-05 12:11:12.229169588 +0000 UTC m=+0.073116729 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 12:11:12 compute-0 nova_compute[187208]: 2025-12-05 12:11:12.862 187212 DEBUG nova.network.neutron [req-908bf710-9717-44b2-99ba-d1aa1ac1b774 req-3b24dbd6-b0da-4c3b-928d-6b2a2921639a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updated VIF entry in instance network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:12 compute-0 nova_compute[187208]: 2025-12-05 12:11:12.863 187212 DEBUG nova.network.neutron [req-908bf710-9717-44b2-99ba-d1aa1ac1b774 req-3b24dbd6-b0da-4c3b-928d-6b2a2921639a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:12 compute-0 nova_compute[187208]: 2025-12-05 12:11:12.886 187212 DEBUG oslo_concurrency.lockutils [req-908bf710-9717-44b2-99ba-d1aa1ac1b774 req-3b24dbd6-b0da-4c3b-928d-6b2a2921639a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:12 compute-0 nova_compute[187208]: 2025-12-05 12:11:12.888 187212 DEBUG oslo_concurrency.lockutils [req-f5691496-31b0-4c1d-b414-3f1ca2cc9d68 req-61fb1e39-2142-46d8-970d-648b13f306ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:12 compute-0 nova_compute[187208]: 2025-12-05 12:11:12.888 187212 DEBUG nova.network.neutron [req-f5691496-31b0-4c1d-b414-3f1ca2cc9d68 req-61fb1e39-2142-46d8-970d-648b13f306ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing network info cache for port 7b183eee-c877-4387-a2f2-78923af9a88b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.645 187212 DEBUG nova.network.neutron [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Updating instance_info_cache with network_info: [{"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.677 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Releasing lock "refresh_cache-9147fd8a-b772-485b-8dff-7bb1a235c4dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.678 187212 DEBUG nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Instance network_info: |[{"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.681 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Start _get_guest_xml network_info=[{"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.685 187212 WARNING nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.692 187212 DEBUG nova.virt.libvirt.host [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.693 187212 DEBUG nova.virt.libvirt.host [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.710 187212 DEBUG nova.virt.libvirt.host [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.711 187212 DEBUG nova.virt.libvirt.host [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.712 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.712 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.712 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.712 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.712 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.713 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.713 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.713 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.713 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.713 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.714 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.714 187212 DEBUG nova.virt.hardware [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.717 187212 DEBUG nova.virt.libvirt.vif [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1940190658',display_name='tempest-ServersNegativeTestJSON-server-1940190658',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1940190658',id=81,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-f3tfdzox',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-ServersNegativ
eTestJSON-1063007033-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:01Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=9147fd8a-b772-485b-8dff-7bb1a235c4dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.718 187212 DEBUG nova.network.os_vif_util [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.718 187212 DEBUG nova.network.os_vif_util [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:22:ff,bridge_name='br-int',has_traffic_filtering=True,id=e0aaf2cf-092f-46ad-90e2-9d960c0b7f06,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0aaf2cf-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.719 187212 DEBUG nova.objects.instance [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'pci_devices' on Instance uuid 9147fd8a-b772-485b-8dff-7bb1a235c4dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.737 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <uuid>9147fd8a-b772-485b-8dff-7bb1a235c4dd</uuid>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <name>instance-00000051</name>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersNegativeTestJSON-server-1940190658</nova:name>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:11:13</nova:creationTime>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:user uuid="e90fa3a379b4494c84626bb6a761cd30">tempest-ServersNegativeTestJSON-1063007033-project-member</nova:user>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:project uuid="c5b34686513f4abc8165113eb8c6831e">tempest-ServersNegativeTestJSON-1063007033</nova:project>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:port uuid="e0aaf2cf-092f-46ad-90e2-9d960c0b7f06">
Dec 05 12:11:13 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="serial">9147fd8a-b772-485b-8dff-7bb1a235c4dd</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="uuid">9147fd8a-b772-485b-8dff-7bb1a235c4dd</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk.config"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:39:22:ff"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <target dev="tape0aaf2cf-09"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/console.log" append="off"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:13 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:13 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.738 187212 DEBUG nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Preparing to wait for external event network-vif-plugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.738 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.738 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.739 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.739 187212 DEBUG nova.virt.libvirt.vif [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1940190658',display_name='tempest-ServersNegativeTestJSON-server-1940190658',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1940190658',id=81,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-f3tfdzox',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-ServersNegativeTestJSON-1063007033-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:01Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=9147fd8a-b772-485b-8dff-7bb1a235c4dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.739 187212 DEBUG nova.network.os_vif_util [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.740 187212 DEBUG nova.network.os_vif_util [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:22:ff,bridge_name='br-int',has_traffic_filtering=True,id=e0aaf2cf-092f-46ad-90e2-9d960c0b7f06,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0aaf2cf-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.740 187212 DEBUG os_vif [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:22:ff,bridge_name='br-int',has_traffic_filtering=True,id=e0aaf2cf-092f-46ad-90e2-9d960c0b7f06,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0aaf2cf-09') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.741 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.741 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.741 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.742 187212 DEBUG nova.network.neutron [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Updating instance_info_cache with network_info: [{"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.748 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape0aaf2cf-09, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.748 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape0aaf2cf-09, col_values=(('external_ids', {'iface-id': 'e0aaf2cf-092f-46ad-90e2-9d960c0b7f06', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:22:ff', 'vm-uuid': '9147fd8a-b772-485b-8dff-7bb1a235c4dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.762 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Releasing lock "refresh_cache-235e69b1-e04e-4d65-92e2-864b105f03cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.763 187212 DEBUG nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Instance network_info: |[{"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.765 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Start _get_guest_xml network_info=[{"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:11:13 compute-0 NetworkManager[55691]: <info>  [1764936673.7857] manager: (tape0aaf2cf-09): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.788 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.790 187212 WARNING nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.793 187212 INFO os_vif [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:22:ff,bridge_name='br-int',has_traffic_filtering=True,id=e0aaf2cf-092f-46ad-90e2-9d960c0b7f06,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0aaf2cf-09')
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.795 187212 DEBUG nova.virt.libvirt.host [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.795 187212 DEBUG nova.virt.libvirt.host [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.799 187212 DEBUG nova.virt.libvirt.host [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.799 187212 DEBUG nova.virt.libvirt.host [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.800 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.800 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.800 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.801 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.801 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.801 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.801 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.801 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.802 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.802 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.802 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.802 187212 DEBUG nova.virt.hardware [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.805 187212 DEBUG nova.virt.libvirt.vif [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-824833708',display_name='tempest-ServersTestJSON-server-824833708',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-824833708',id=82,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-711owwc2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:02Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=235e69b1-e04e-4d65-92e2-864b105f03cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.806 187212 DEBUG nova.network.os_vif_util [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.806 187212 DEBUG nova.network.os_vif_util [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:f0:09,bridge_name='br-int',has_traffic_filtering=True,id=4568b7d9-2870-421a-abd0-1598977ec82d,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4568b7d9-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.807 187212 DEBUG nova.objects.instance [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 235e69b1-e04e-4d65-92e2-864b105f03cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.820 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <uuid>235e69b1-e04e-4d65-92e2-864b105f03cf</uuid>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <name>instance-00000052</name>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestJSON-server-824833708</nova:name>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:11:13</nova:creationTime>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:user uuid="62153b585ecc4e6fa2ad567851d49081">tempest-ServersTestJSON-1492365581-project-member</nova:user>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:project uuid="0c982a61e3fc4c8da9248076bb0361ac">tempest-ServersTestJSON-1492365581</nova:project>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         <nova:port uuid="4568b7d9-2870-421a-abd0-1598977ec82d">
Dec 05 12:11:13 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="serial">235e69b1-e04e-4d65-92e2-864b105f03cf</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="uuid">235e69b1-e04e-4d65-92e2-864b105f03cf</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk.config"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:39:f0:09"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <target dev="tap4568b7d9-28"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/console.log" append="off"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:11:13 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:11:13 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:13 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:13 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:13 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.821 187212 DEBUG nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Preparing to wait for external event network-vif-plugged-4568b7d9-2870-421a-abd0-1598977ec82d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.821 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.821 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.822 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.822 187212 DEBUG nova.virt.libvirt.vif [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-824833708',display_name='tempest-ServersTestJSON-server-824833708',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-824833708',id=82,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-711owwc2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:02Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=235e69b1-e04e-4d65-92e2-864b105f03cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.823 187212 DEBUG nova.network.os_vif_util [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.823 187212 DEBUG nova.network.os_vif_util [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:f0:09,bridge_name='br-int',has_traffic_filtering=True,id=4568b7d9-2870-421a-abd0-1598977ec82d,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4568b7d9-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.824 187212 DEBUG os_vif [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:f0:09,bridge_name='br-int',has_traffic_filtering=True,id=4568b7d9-2870-421a-abd0-1598977ec82d,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4568b7d9-28') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.825 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.825 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.825 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.828 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.828 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4568b7d9-28, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.828 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4568b7d9-28, col_values=(('external_ids', {'iface-id': '4568b7d9-2870-421a-abd0-1598977ec82d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:f0:09', 'vm-uuid': '235e69b1-e04e-4d65-92e2-864b105f03cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:13 compute-0 NetworkManager[55691]: <info>  [1764936673.8313] manager: (tap4568b7d9-28): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.838 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.839 187212 INFO os_vif [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:f0:09,bridge_name='br-int',has_traffic_filtering=True,id=4568b7d9-2870-421a-abd0-1598977ec82d,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4568b7d9-28')
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.855 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.856 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.856 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] No VIF found with MAC fa:16:3e:39:22:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.857 187212 INFO nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Using config drive
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.891 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.891 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.892 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No VIF found with MAC fa:16:3e:39:f0:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.892 187212 INFO nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Using config drive
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.993 187212 DEBUG nova.compute.manager [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Received event network-changed-4568b7d9-2870-421a-abd0-1598977ec82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:13 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.993 187212 DEBUG nova.compute.manager [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Refreshing instance network info cache due to event network-changed-4568b7d9-2870-421a-abd0-1598977ec82d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:14 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.994 187212 DEBUG oslo_concurrency.lockutils [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-235e69b1-e04e-4d65-92e2-864b105f03cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:14 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.994 187212 DEBUG oslo_concurrency.lockutils [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-235e69b1-e04e-4d65-92e2-864b105f03cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:14 compute-0 nova_compute[187208]: 2025-12-05 12:11:13.994 187212 DEBUG nova.network.neutron [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Refreshing network info cache for port 4568b7d9-2870-421a-abd0-1598977ec82d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:14 compute-0 nova_compute[187208]: 2025-12-05 12:11:14.785 187212 INFO nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Creating config drive at /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk.config
Dec 05 12:11:14 compute-0 nova_compute[187208]: 2025-12-05 12:11:14.792 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp28rylsqn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:14 compute-0 nova_compute[187208]: 2025-12-05 12:11:14.855 187212 INFO nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Creating config drive at /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk.config
Dec 05 12:11:14 compute-0 nova_compute[187208]: 2025-12-05 12:11:14.862 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0o28t04n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:14 compute-0 nova_compute[187208]: 2025-12-05 12:11:14.928 187212 DEBUG oslo_concurrency.processutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp28rylsqn" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:14 compute-0 nova_compute[187208]: 2025-12-05 12:11:14.995 187212 DEBUG oslo_concurrency.processutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0o28t04n" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:15 compute-0 NetworkManager[55691]: <info>  [1764936675.0068] manager: (tape0aaf2cf-09): new Tun device (/org/freedesktop/NetworkManager/Devices/309)
Dec 05 12:11:15 compute-0 kernel: tape0aaf2cf-09: entered promiscuous mode
Dec 05 12:11:15 compute-0 ovn_controller[95610]: 2025-12-05T12:11:15Z|00771|binding|INFO|Claiming lport e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 for this chassis.
Dec 05 12:11:15 compute-0 ovn_controller[95610]: 2025-12-05T12:11:15Z|00772|binding|INFO|e0aaf2cf-092f-46ad-90e2-9d960c0b7f06: Claiming fa:16:3e:39:22:ff 10.100.0.7
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.026 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:22:ff 10.100.0.7'], port_security=['fa:16:3e:39:22:ff 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9147fd8a-b772-485b-8dff-7bb1a235c4dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e0aaf2cf-092f-46ad-90e2-9d960c0b7f06) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.027 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be bound to our chassis
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.029 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:11:15 compute-0 ovn_controller[95610]: 2025-12-05T12:11:15Z|00773|binding|INFO|Setting lport e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 ovn-installed in OVS
Dec 05 12:11:15 compute-0 ovn_controller[95610]: 2025-12-05T12:11:15Z|00774|binding|INFO|Setting lport e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 up in Southbound
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.033 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.036 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 systemd-udevd[233415]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.055 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af3c23da-ac8a-4c2b-81d6-635e04be1c0b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 NetworkManager[55691]: <info>  [1764936675.0698] device (tape0aaf2cf-09): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:15 compute-0 NetworkManager[55691]: <info>  [1764936675.0705] device (tape0aaf2cf-09): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:15 compute-0 kernel: tap4568b7d9-28: entered promiscuous mode
Dec 05 12:11:15 compute-0 NetworkManager[55691]: <info>  [1764936675.0903] manager: (tap4568b7d9-28): new Tun device (/org/freedesktop/NetworkManager/Devices/310)
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 ovn_controller[95610]: 2025-12-05T12:11:15Z|00775|binding|INFO|Claiming lport 4568b7d9-2870-421a-abd0-1598977ec82d for this chassis.
Dec 05 12:11:15 compute-0 ovn_controller[95610]: 2025-12-05T12:11:15Z|00776|binding|INFO|4568b7d9-2870-421a-abd0-1598977ec82d: Claiming fa:16:3e:39:f0:09 10.100.0.13
Dec 05 12:11:15 compute-0 systemd-machined[153543]: New machine qemu-92-instance-00000051.
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.103 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:f0:09 10.100.0.13'], port_security=['fa:16:3e:39:f0:09 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '235e69b1-e04e-4d65-92e2-864b105f03cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4568b7d9-2870-421a-abd0-1598977ec82d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:15 compute-0 NetworkManager[55691]: <info>  [1764936675.1046] device (tap4568b7d9-28): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:15 compute-0 NetworkManager[55691]: <info>  [1764936675.1054] device (tap4568b7d9-28): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.107 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[47ef0dfc-c608-42e4-8346-cd17a823a03c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_controller[95610]: 2025-12-05T12:11:15Z|00777|binding|INFO|Setting lport 4568b7d9-2870-421a-abd0-1598977ec82d ovn-installed in OVS
Dec 05 12:11:15 compute-0 ovn_controller[95610]: 2025-12-05T12:11:15Z|00778|binding|INFO|Setting lport 4568b7d9-2870-421a-abd0-1598977ec82d up in Southbound
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 systemd[1]: Started Virtual Machine qemu-92-instance-00000051.
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.114 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3a25d225-6faf-4e2f-bff2-b1d1ac451347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 systemd-machined[153543]: New machine qemu-93-instance-00000052.
Dec 05 12:11:15 compute-0 systemd[1]: Started Virtual Machine qemu-93-instance-00000052.
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.153 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[43bed4db-c4fb-4d6c-8557-24f7973a6013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.174 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d3916c9-42a6-4b6e-9ff9-a30249d255ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82130d25-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:36:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403095, 'reachable_time': 38277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233440, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.204 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bfab3252-89ad-4781-80ba-506248ffb349]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap82130d25-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403109, 'tstamp': 403109}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233444, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap82130d25-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403113, 'tstamp': 403113}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233444, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.207 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82130d25-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.212 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.215 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82130d25-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.217 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82130d25-f0, col_values=(('external_ids', {'iface-id': 'f81c4a80-27d3-4231-a37a-7c231838aca7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.218 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.223 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4568b7d9-2870-421a-abd0-1598977ec82d in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 unbound from our chassis
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.226 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.252 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[34836427-c3f7-4059-9792-a1379b2bb0e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.290 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3abc5c-bbc6-4069-8002-3155b7028447]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.294 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e85545-83fe-410c-9a32-66dcb3fa175a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.331 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[697d4d36-4488-4576-8c34-b1a0981a6b47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.351 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[86ee6b0e-cd9f-4487-8ea2-49f3b936a2ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233457, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.367 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f1cf833-9ead-4e43-a6fb-ba28d95faa51]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233458, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233458, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.369 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.371 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.372 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.373 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.374 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.374 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:15 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:15.374 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.549 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936675.548756, 235e69b1-e04e-4d65-92e2-864b105f03cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.549 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] VM Started (Lifecycle Event)
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.691 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.696 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936675.54889, 235e69b1-e04e-4d65-92e2-864b105f03cf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.697 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] VM Paused (Lifecycle Event)
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.745 187212 DEBUG nova.network.neutron [req-f5691496-31b0-4c1d-b414-3f1ca2cc9d68 req-61fb1e39-2142-46d8-970d-648b13f306ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updated VIF entry in instance network info cache for port 7b183eee-c877-4387-a2f2-78923af9a88b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.746 187212 DEBUG nova.network.neutron [req-f5691496-31b0-4c1d-b414-3f1ca2cc9d68 req-61fb1e39-2142-46d8-970d-648b13f306ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.788 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.791 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.850 187212 DEBUG oslo_concurrency.lockutils [req-f5691496-31b0-4c1d-b414-3f1ca2cc9d68 req-61fb1e39-2142-46d8-970d-648b13f306ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.856 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.887 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936675.8869596, 9147fd8a-b772-485b-8dff-7bb1a235c4dd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.888 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] VM Started (Lifecycle Event)
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.911 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.916 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936675.8871696, 9147fd8a-b772-485b-8dff-7bb1a235c4dd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.916 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] VM Paused (Lifecycle Event)
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.962 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.967 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:15 compute-0 nova_compute[187208]: 2025-12-05 12:11:15.995 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.238 187212 DEBUG nova.compute.manager [req-681725ad-14b8-46f8-b810-ff90a357f07a req-e2458c33-cefd-40c9-b0fb-d19bacefbab1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Received event network-vif-plugged-4568b7d9-2870-421a-abd0-1598977ec82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.239 187212 DEBUG oslo_concurrency.lockutils [req-681725ad-14b8-46f8-b810-ff90a357f07a req-e2458c33-cefd-40c9-b0fb-d19bacefbab1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.239 187212 DEBUG oslo_concurrency.lockutils [req-681725ad-14b8-46f8-b810-ff90a357f07a req-e2458c33-cefd-40c9-b0fb-d19bacefbab1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.239 187212 DEBUG oslo_concurrency.lockutils [req-681725ad-14b8-46f8-b810-ff90a357f07a req-e2458c33-cefd-40c9-b0fb-d19bacefbab1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.239 187212 DEBUG nova.compute.manager [req-681725ad-14b8-46f8-b810-ff90a357f07a req-e2458c33-cefd-40c9-b0fb-d19bacefbab1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Processing event network-vif-plugged-4568b7d9-2870-421a-abd0-1598977ec82d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.240 187212 DEBUG nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.247 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936676.2433875, 235e69b1-e04e-4d65-92e2-864b105f03cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.247 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] VM Resumed (Lifecycle Event)
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.250 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.253 187212 INFO nova.virt.libvirt.driver [-] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Instance spawned successfully.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.254 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.269 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.274 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.277 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.278 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.278 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.278 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.279 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.279 187212 DEBUG nova.virt.libvirt.driver [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.304 187212 DEBUG nova.network.neutron [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Updated VIF entry in instance network info cache for port 4568b7d9-2870-421a-abd0-1598977ec82d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.305 187212 DEBUG nova.network.neutron [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Updating instance_info_cache with network_info: [{"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.307 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.329 187212 DEBUG oslo_concurrency.lockutils [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-235e69b1-e04e-4d65-92e2-864b105f03cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.329 187212 DEBUG nova.compute.manager [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Received event network-changed-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.330 187212 DEBUG nova.compute.manager [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Refreshing instance network info cache due to event network-changed-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.330 187212 DEBUG oslo_concurrency.lockutils [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-9147fd8a-b772-485b-8dff-7bb1a235c4dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.330 187212 DEBUG oslo_concurrency.lockutils [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-9147fd8a-b772-485b-8dff-7bb1a235c4dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.330 187212 DEBUG nova.network.neutron [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Refreshing network info cache for port e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.337 187212 DEBUG nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.337 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.337 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.338 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.338 187212 DEBUG nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] No waiting events found dispatching network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.338 187212 WARNING nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received unexpected event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e for instance with vm_state active and task_state None.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.338 187212 DEBUG nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.339 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.339 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.339 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.339 187212 DEBUG nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] No waiting events found dispatching network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.339 187212 WARNING nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received unexpected event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b for instance with vm_state active and task_state None.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.340 187212 DEBUG nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.340 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.340 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.340 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.340 187212 DEBUG nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] No waiting events found dispatching network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.341 187212 WARNING nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received unexpected event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b for instance with vm_state active and task_state None.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.341 187212 DEBUG nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Received event network-vif-plugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.341 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.341 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.342 187212 DEBUG oslo_concurrency.lockutils [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.342 187212 DEBUG nova.compute.manager [req-eaf8d12e-33ef-40d1-be2b-398d00f7923d req-06304150-9304-452a-be61-2d03ffe00929 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Processing event network-vif-plugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.343 187212 DEBUG nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.345 187212 INFO nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Took 13.90 seconds to spawn the instance on the hypervisor.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.346 187212 DEBUG nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.347 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.347 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936676.3473032, 9147fd8a-b772-485b-8dff-7bb1a235c4dd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.347 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] VM Resumed (Lifecycle Event)
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.359 187212 INFO nova.virt.libvirt.driver [-] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Instance spawned successfully.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.360 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.395 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.402 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.406 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.407 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.407 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.408 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.408 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.409 187212 DEBUG nova.virt.libvirt.driver [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.447 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.459 187212 DEBUG oslo_concurrency.lockutils [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.460 187212 DEBUG oslo_concurrency.lockutils [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.460 187212 DEBUG oslo_concurrency.lockutils [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.460 187212 DEBUG oslo_concurrency.lockutils [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.460 187212 DEBUG oslo_concurrency.lockutils [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.463 187212 INFO nova.compute.manager [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Terminating instance
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.465 187212 DEBUG nova.compute.manager [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.467 187212 INFO nova.compute.manager [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Took 15.09 seconds to build instance.
Dec 05 12:11:16 compute-0 kernel: tap8c343187-71 (unregistering): left promiscuous mode
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.490 187212 DEBUG oslo_concurrency.lockutils [None req-eeb3cff4-c36e-4466-acc4-dbdc4746762d 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:16 compute-0 NetworkManager[55691]: <info>  [1764936676.4911] device (tap8c343187-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:11:16 compute-0 ovn_controller[95610]: 2025-12-05T12:11:16Z|00779|binding|INFO|Releasing lport 8c343187-712d-4aee-9c47-18497ec1042e from this chassis (sb_readonly=0)
Dec 05 12:11:16 compute-0 ovn_controller[95610]: 2025-12-05T12:11:16Z|00780|binding|INFO|Setting lport 8c343187-712d-4aee-9c47-18497ec1042e down in Southbound
Dec 05 12:11:16 compute-0 ovn_controller[95610]: 2025-12-05T12:11:16Z|00781|binding|INFO|Removing iface tap8c343187-71 ovn-installed in OVS
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.499 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:16.505 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:54:21 10.100.0.12'], port_security=['fa:16:3e:56:54:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '8fe1c6df-f787-4c56-b3e7-899cf5e9f723', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8c343187-712d-4aee-9c47-18497ec1042e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:16.506 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8c343187-712d-4aee-9c47-18497ec1042e in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c unbound from our chassis
Dec 05 12:11:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:16.508 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:11:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:16.510 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d14b43f6-306b-4e36-9d06-cde9ea780564]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:16.512 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace which is not needed anymore
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.514 187212 INFO nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Took 14.91 seconds to spawn the instance on the hypervisor.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.515 187212 DEBUG nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.515 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:16 compute-0 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Dec 05 12:11:16 compute-0 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d0000004d.scope: Consumed 5.628s CPU time.
Dec 05 12:11:16 compute-0 systemd-machined[153543]: Machine qemu-91-instance-0000004d terminated.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.563 187212 DEBUG oslo_concurrency.lockutils [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-54d9605a-998b-4492-afc8-f7a5b0dd4e84-7b183eee-c877-4387-a2f2-78923af9a88b" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.563 187212 DEBUG oslo_concurrency.lockutils [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-54d9605a-998b-4492-afc8-f7a5b0dd4e84-7b183eee-c877-4387-a2f2-78923af9a88b" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.589 187212 INFO nova.compute.manager [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Took 15.75 seconds to build instance.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.591 187212 DEBUG nova.objects.instance [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid 54d9605a-998b-4492-afc8-f7a5b0dd4e84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.618 187212 DEBUG nova.virt.libvirt.vif [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-569275018',display_name='tempest-tempest.common.compute-instance-569275018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-569275018',id=74,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-h1nkz7rb',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=54d9605a-998b-4492-afc8-f7a5b0dd4e84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.618 187212 DEBUG nova.network.os_vif_util [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.619 187212 DEBUG nova.network.os_vif_util [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.622 187212 DEBUG oslo_concurrency.lockutils [None req-68cf1724-6af8-4c3b-8c50-5ad138e1d572 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.623 187212 DEBUG nova.virt.libvirt.guest [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.625 187212 DEBUG nova.virt.libvirt.guest [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.628 187212 DEBUG nova.virt.libvirt.driver [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Attempting to detach device tap7b183eee-c8 from instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.628 187212 DEBUG nova.virt.libvirt.guest [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:58:13:10"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <target dev="tap7b183eee-c8"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]: </interface>
Dec 05 12:11:16 compute-0 nova_compute[187208]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.738 187212 INFO nova.virt.libvirt.driver [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance destroyed successfully.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.739 187212 DEBUG nova.objects.instance [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'resources' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.752 187212 DEBUG nova.virt.libvirt.guest [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.759 187212 DEBUG nova.virt.libvirt.vif [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:10:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2029374639',display_name='tempest-ServerDiskConfigTestJSON-server-2029374639',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2029374639',id=77,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:11:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ep0a320q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:11:11Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=8fe1c6df-f787-4c56-b3e7-899cf5e9f723,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.759 187212 DEBUG nova.network.os_vif_util [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.760 187212 DEBUG nova.network.os_vif_util [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.760 187212 DEBUG os_vif [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.764 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.765 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c343187-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.770 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.772 187212 DEBUG nova.virt.libvirt.guest [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface>not found in domain: <domain type='kvm' id='83'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <name>instance-0000004a</name>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <uuid>54d9605a-998b-4492-afc8-f7a5b0dd4e84</uuid>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:name>tempest-tempest.common.compute-instance-569275018</nova:name>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:11:09</nova:creationTime>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:port uuid="ef99bad5-d092-46f6-9b3a-8225cc233d1e">
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:port uuid="7b183eee-c877-4387-a2f2-78923af9a88b">
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:11:16 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <memory unit='KiB'>131072</memory>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <vcpu placement='static'>1</vcpu>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <resource>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <partition>/machine</partition>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </resource>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <sysinfo type='smbios'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='manufacturer'>RDO</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='serial'>54d9605a-998b-4492-afc8-f7a5b0dd4e84</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='uuid'>54d9605a-998b-4492-afc8-f7a5b0dd4e84</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='family'>Virtual Machine</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <boot dev='hd'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <smbios mode='sysinfo'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <vmcoreinfo state='on'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <vendor>AMD</vendor>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='x2apic'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc-deadline'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='hypervisor'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc_adjust'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='spec-ctrl'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='stibp'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='ssbd'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='cmp_legacy'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='overflow-recov'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='succor'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='ibrs'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='amd-ssbd'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='virt-ssbd'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='lbrv'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='tsc-scale'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='vmcb-clean'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='flushbyasid'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='pause-filter'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='pfthreshold'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='xsaves'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='svm'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='topoext'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='npt'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='nrip-save'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <clock offset='utc'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <timer name='hpet' present='no'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <on_poweroff>destroy</on_poweroff>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <on_reboot>restart</on_reboot>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <on_crash>destroy</on_crash>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <disk type='file' device='disk'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk' index='2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <backingStore type='file' index='3'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:         <format type='raw'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:         <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:         <backingStore/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       </backingStore>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target dev='vda' bus='virtio'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='virtio-disk0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <disk type='file' device='cdrom'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config' index='1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <backingStore/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target dev='sda' bus='sata'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <readonly/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='sata0-0-0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pcie.0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='1' port='0x10'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='2' port='0x11'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='3' port='0x12'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.3'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='4' port='0x13'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.4'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='5' port='0x14'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.5'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='6' port='0x15'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.6'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='7' port='0x16'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.7'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='8' port='0x17'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.8'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='9' port='0x18'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.9'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='10' port='0x19'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.10'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='11' port='0x1a'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.11'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='12' port='0x1b'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.12'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='13' port='0x1c'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.13'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='14' port='0x1d'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.14'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='15' port='0x1e'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.15'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='16' port='0x1f'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.16'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='17' port='0x20'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.17'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='18' port='0x21'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.18'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='19' port='0x22'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.19'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='20' port='0x23'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.20'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='21' port='0x24'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.21'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='22' port='0x25'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.22'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='23' port='0x26'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.23'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='24' port='0x27'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.24'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='25' port='0x28'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.25'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-pci-bridge'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.26'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='usb'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='sata' index='0'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='ide'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:bd:e5:94'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target dev='tapef99bad5-d0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='net0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:58:13:10'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target dev='tap7b183eee-c8'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='net1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <serial type='pty'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <source path='/dev/pts/1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/console.log' append='off'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target type='isa-serial' port='0'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:         <model name='isa-serial'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       </target>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <console type='pty' tty='/dev/pts/1'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <source path='/dev/pts/1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/console.log' append='off'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target type='serial' port='0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </console>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <input type='tablet' bus='usb'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='input0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='usb' bus='0' port='1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <input type='mouse' bus='ps2'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='input1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <input type='keyboard' bus='ps2'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='input2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <listen type='address' address='::0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </graphics>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <audio id='1' type='none'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='video0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <watchdog model='itco' action='reset'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='watchdog0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </watchdog>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <memballoon model='virtio'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <stats period='10'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='balloon0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <rng model='virtio'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <backend model='random'>/dev/urandom</backend>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='rng0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <label>system_u:system_r:svirt_t:s0:c259,c823</label>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c259,c823</imagelabel>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <label>+107:+107</label>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <imagelabel>+107:+107</imagelabel>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:11:16 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:16 compute-0 nova_compute[187208]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.772 187212 INFO nova.virt.libvirt.driver [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tap7b183eee-c8 from instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84 from the persistent domain config.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.772 187212 DEBUG nova.virt.libvirt.driver [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] (1/8): Attempting to detach device tap7b183eee-c8 with device alias net1 from instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.772 187212 DEBUG nova.virt.libvirt.guest [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:58:13:10"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <target dev="tap7b183eee-c8"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]: </interface>
Dec 05 12:11:16 compute-0 nova_compute[187208]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.773 187212 INFO os_vif [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71')
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.774 187212 INFO nova.virt.libvirt.driver [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Deleting instance files /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723_del
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.774 187212 INFO nova.virt.libvirt.driver [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Deletion of /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723_del complete
Dec 05 12:11:16 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233271]: [NOTICE]   (233275) : haproxy version is 2.8.14-c23fe91
Dec 05 12:11:16 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233271]: [NOTICE]   (233275) : path to executable is /usr/sbin/haproxy
Dec 05 12:11:16 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233271]: [WARNING]  (233275) : Exiting Master process...
Dec 05 12:11:16 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233271]: [WARNING]  (233275) : Exiting Master process...
Dec 05 12:11:16 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233271]: [ALERT]    (233275) : Current worker (233277) exited with code 143 (Terminated)
Dec 05 12:11:16 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233271]: [WARNING]  (233275) : All workers exited. Exiting... (0)
Dec 05 12:11:16 compute-0 systemd[1]: libpod-36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59.scope: Deactivated successfully.
Dec 05 12:11:16 compute-0 podman[233493]: 2025-12-05 12:11:16.816619955 +0000 UTC m=+0.212067927 container died 36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.856 187212 INFO nova.compute.manager [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Took 0.39 seconds to destroy the instance on the hypervisor.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.857 187212 DEBUG oslo.service.loopingcall [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.857 187212 DEBUG nova.compute.manager [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.857 187212 DEBUG nova.network.neutron [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:11:16 compute-0 kernel: tap7b183eee-c8 (unregistering): left promiscuous mode
Dec 05 12:11:16 compute-0 NetworkManager[55691]: <info>  [1764936676.8956] device (tap7b183eee-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:11:16 compute-0 ovn_controller[95610]: 2025-12-05T12:11:16Z|00782|binding|INFO|Releasing lport 7b183eee-c877-4387-a2f2-78923af9a88b from this chassis (sb_readonly=0)
Dec 05 12:11:16 compute-0 ovn_controller[95610]: 2025-12-05T12:11:16Z|00783|binding|INFO|Setting lport 7b183eee-c877-4387-a2f2-78923af9a88b down in Southbound
Dec 05 12:11:16 compute-0 ovn_controller[95610]: 2025-12-05T12:11:16Z|00784|binding|INFO|Removing iface tap7b183eee-c8 ovn-installed in OVS
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.899 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.913 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Received event <DeviceRemovedEvent: 1764936676.913203, 54d9605a-998b-4492-afc8-f7a5b0dd4e84 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 05 12:11:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:16.915 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:13:10 10.100.0.5'], port_security=['fa:16:3e:58:13:10 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1639127318', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1639127318', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7b183eee-c877-4387-a2f2-78923af9a88b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.920 187212 DEBUG nova.virt.libvirt.driver [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Start waiting for the detach event from libvirt for device tap7b183eee-c8 with device alias net1 for instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.921 187212 DEBUG nova.virt.libvirt.guest [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.925 187212 DEBUG nova.virt.libvirt.guest [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface>not found in domain: <domain type='kvm' id='83'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <name>instance-0000004a</name>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <uuid>54d9605a-998b-4492-afc8-f7a5b0dd4e84</uuid>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:name>tempest-tempest.common.compute-instance-569275018</nova:name>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:11:09</nova:creationTime>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:port uuid="ef99bad5-d092-46f6-9b3a-8225cc233d1e">
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:port uuid="7b183eee-c877-4387-a2f2-78923af9a88b">
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:11:16 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <memory unit='KiB'>131072</memory>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <vcpu placement='static'>1</vcpu>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <resource>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <partition>/machine</partition>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </resource>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <sysinfo type='smbios'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='manufacturer'>RDO</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='serial'>54d9605a-998b-4492-afc8-f7a5b0dd4e84</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='uuid'>54d9605a-998b-4492-afc8-f7a5b0dd4e84</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <entry name='family'>Virtual Machine</entry>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <boot dev='hd'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <smbios mode='sysinfo'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <vmcoreinfo state='on'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <vendor>AMD</vendor>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='x2apic'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc-deadline'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='hypervisor'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc_adjust'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='spec-ctrl'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='stibp'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='ssbd'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='cmp_legacy'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='overflow-recov'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='succor'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='ibrs'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='amd-ssbd'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='virt-ssbd'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='lbrv'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='tsc-scale'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='vmcb-clean'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='flushbyasid'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='pause-filter'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='pfthreshold'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='xsaves'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='svm'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='require' name='topoext'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='npt'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <feature policy='disable' name='nrip-save'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <clock offset='utc'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <timer name='hpet' present='no'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <on_poweroff>destroy</on_poweroff>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <on_reboot>restart</on_reboot>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <on_crash>destroy</on_crash>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <disk type='file' device='disk'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk' index='2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <backingStore type='file' index='3'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:         <format type='raw'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:         <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:         <backingStore/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       </backingStore>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target dev='vda' bus='virtio'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='virtio-disk0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <disk type='file' device='cdrom'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config' index='1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <backingStore/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target dev='sda' bus='sata'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <readonly/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='sata0-0-0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pcie.0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='1' port='0x10'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='2' port='0x11'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='3' port='0x12'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.3'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='4' port='0x13'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.4'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='5' port='0x14'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.5'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='6' port='0x15'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.6'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='7' port='0x16'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.7'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='8' port='0x17'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.8'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='9' port='0x18'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.9'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='10' port='0x19'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.10'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='11' port='0x1a'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.11'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='12' port='0x1b'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.12'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='13' port='0x1c'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.13'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='14' port='0x1d'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.14'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='15' port='0x1e'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.15'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='16' port='0x1f'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.16'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='17' port='0x20'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.17'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='18' port='0x21'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.18'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='19' port='0x22'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.19'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='20' port='0x23'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.20'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='21' port='0x24'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.21'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='22' port='0x25'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.22'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='23' port='0x26'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.23'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='24' port='0x27'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.24'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target chassis='25' port='0x28'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.25'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model name='pcie-pci-bridge'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='pci.26'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='usb'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <controller type='sata' index='0'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='ide'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:bd:e5:94'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target dev='tapef99bad5-d0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='net0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <serial type='pty'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <source path='/dev/pts/1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/console.log' append='off'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target type='isa-serial' port='0'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:         <model name='isa-serial'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       </target>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <console type='pty' tty='/dev/pts/1'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <source path='/dev/pts/1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/console.log' append='off'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <target type='serial' port='0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </console>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <input type='tablet' bus='usb'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='input0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='usb' bus='0' port='1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <input type='mouse' bus='ps2'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='input1'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <input type='keyboard' bus='ps2'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='input2'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <listen type='address' address='::0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </graphics>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <audio id='1' type='none'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='video0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <watchdog model='itco' action='reset'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='watchdog0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </watchdog>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <memballoon model='virtio'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <stats period='10'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='balloon0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <rng model='virtio'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <backend model='random'>/dev/urandom</backend>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <alias name='rng0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <label>system_u:system_r:svirt_t:s0:c259,c823</label>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c259,c823</imagelabel>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <label>+107:+107</label>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <imagelabel>+107:+107</imagelabel>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:11:16 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:16 compute-0 nova_compute[187208]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.925 187212 INFO nova.virt.libvirt.driver [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tap7b183eee-c8 from instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84 from the live domain config.
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.926 187212 DEBUG nova.virt.libvirt.vif [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-569275018',display_name='tempest-tempest.common.compute-instance-569275018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-569275018',id=74,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-h1nkz7rb',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=54d9605a-998b-4492-afc8-f7a5b0dd4e84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.926 187212 DEBUG nova.network.os_vif_util [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.927 187212 DEBUG nova.network.os_vif_util [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.927 187212 DEBUG os_vif [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.929 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.929 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b183eee-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.931 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.934 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.937 187212 INFO os_vif [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8')
Dec 05 12:11:16 compute-0 nova_compute[187208]: 2025-12-05 12:11:16.938 187212 DEBUG nova.virt.libvirt.guest [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:name>tempest-tempest.common.compute-instance-569275018</nova:name>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:11:16</nova:creationTime>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     <nova:port uuid="ef99bad5-d092-46f6-9b3a-8225cc233d1e">
Dec 05 12:11:16 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 05 12:11:16 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:16 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:11:16 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:11:16 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:11:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-590ee7311e2c20992106ed0c9fa1f31e967ba25b2776e9022a31d643ac47a1db-merged.mount: Deactivated successfully.
Dec 05 12:11:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59-userdata-shm.mount: Deactivated successfully.
Dec 05 12:11:17 compute-0 podman[233493]: 2025-12-05 12:11:17.060489333 +0000 UTC m=+0.455937285 container cleanup 36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:11:17 compute-0 systemd[1]: libpod-conmon-36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59.scope: Deactivated successfully.
Dec 05 12:11:17 compute-0 podman[233542]: 2025-12-05 12:11:17.137762831 +0000 UTC m=+0.045472006 container remove 36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.143 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[426ab6cd-3fec-4a1b-8fad-6a0cc2a28c21]: (4, ('Fri Dec  5 12:11:16 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59)\n36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59\nFri Dec  5 12:11:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59)\n36448294ea5b9b0bd6c9c263b4f7b3d6c62f98734f5d26ab58f3afc1e5b64a59\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.145 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a81f7a57-9153-4dab-8723-f82a49d52ca8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.146 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:17 compute-0 kernel: tap7be4540a-00: left promiscuous mode
Dec 05 12:11:17 compute-0 nova_compute[187208]: 2025-12-05 12:11:17.151 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:17 compute-0 nova_compute[187208]: 2025-12-05 12:11:17.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b25d1c6f-c79b-4943-80bb-b34b09b80dcc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.193 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3aad0d-ae82-4961-8d86-db3d36e2d401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.194 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[36c14e1d-8df4-4050-b509-e3dcb4b54737]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.210 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3752fb79-b07c-425c-9666-966753b61fac]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404490, 'reachable_time': 15984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233556, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d7be4540a\x2d0e0d\x2d45dd\x2d9ed3\x2d2c2701ae3e2c.mount: Deactivated successfully.
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.213 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.213 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[4659d7d4-6b00-4bd3-b0b3-089b5f229ce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.214 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7b183eee-c877-4387-a2f2-78923af9a88b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.216 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.230 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[89c23c83-5cfd-4a45-b4e8-6850f1cbf879]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.254 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[46863648-5edb-4538-ad71-195373eb4ec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.258 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab9a4ef-8282-4f51-91be-3389cb3c8571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.287 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b451740a-75a2-4208-9fc8-112dd889abe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.305 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[412f7e5f-817d-4e62-9a24-bc5fecff0407]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 10, 'rx_bytes': 742, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 10, 'rx_bytes': 742, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233563, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.320 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e31720c4-111b-4c8a-82c4-ee924fb78843]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398326, 'tstamp': 398326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233564, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398330, 'tstamp': 398330}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233564, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.322 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:17 compute-0 nova_compute[187208]: 2025-12-05 12:11:17.323 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:17 compute-0 nova_compute[187208]: 2025-12-05 12:11:17.324 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.325 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.325 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.325 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:17.326 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:17 compute-0 nova_compute[187208]: 2025-12-05 12:11:17.873 187212 DEBUG nova.network.neutron [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:17 compute-0 nova_compute[187208]: 2025-12-05 12:11:17.905 187212 INFO nova.compute.manager [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Took 1.05 seconds to deallocate network for instance.
Dec 05 12:11:17 compute-0 nova_compute[187208]: 2025-12-05 12:11:17.988 187212 DEBUG oslo_concurrency.lockutils [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:17 compute-0 nova_compute[187208]: 2025-12-05 12:11:17.988 187212 DEBUG oslo_concurrency.lockutils [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.115 187212 DEBUG nova.network.neutron [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Updated VIF entry in instance network info cache for port e0aaf2cf-092f-46ad-90e2-9d960c0b7f06. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.115 187212 DEBUG nova.network.neutron [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Updating instance_info_cache with network_info: [{"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.137 187212 DEBUG oslo_concurrency.lockutils [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-9147fd8a-b772-485b-8dff-7bb1a235c4dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.137 187212 DEBUG nova.compute.manager [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.138 187212 DEBUG oslo_concurrency.lockutils [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.138 187212 DEBUG oslo_concurrency.lockutils [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.138 187212 DEBUG oslo_concurrency.lockutils [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.138 187212 DEBUG nova.compute.manager [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] No waiting events found dispatching network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.138 187212 WARNING nova.compute.manager [req-a6e39be2-5f24-40c5-96db-c2ca803ff33a req-8fcaac51-7f43-4231-b27f-78575c294579 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received unexpected event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e for instance with vm_state active and task_state None.
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.173 187212 DEBUG nova.compute.provider_tree [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.195 187212 DEBUG nova.scheduler.client.report [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.237 187212 DEBUG oslo_concurrency.lockutils [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.278 187212 INFO nova.scheduler.client.report [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Deleted allocations for instance 8fe1c6df-f787-4c56-b3e7-899cf5e9f723
Dec 05 12:11:18 compute-0 nova_compute[187208]: 2025-12-05 12:11:18.518 187212 DEBUG oslo_concurrency.lockutils [None req-88eaf7ae-b093-4ec1-8a9b-da9b30f64ec7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:19 compute-0 nova_compute[187208]: 2025-12-05 12:11:19.995 187212 DEBUG oslo_concurrency.lockutils [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:19 compute-0 nova_compute[187208]: 2025-12-05 12:11:19.995 187212 DEBUG oslo_concurrency.lockutils [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:19 compute-0 nova_compute[187208]: 2025-12-05 12:11:19.996 187212 DEBUG oslo_concurrency.lockutils [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:19 compute-0 nova_compute[187208]: 2025-12-05 12:11:19.996 187212 DEBUG oslo_concurrency.lockutils [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:19 compute-0 nova_compute[187208]: 2025-12-05 12:11:19.996 187212 DEBUG oslo_concurrency.lockutils [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:19 compute-0 nova_compute[187208]: 2025-12-05 12:11:19.997 187212 INFO nova.compute.manager [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Terminating instance
Dec 05 12:11:19 compute-0 nova_compute[187208]: 2025-12-05 12:11:19.998 187212 DEBUG nova.compute.manager [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:11:20 compute-0 kernel: tape0aaf2cf-09 (unregistering): left promiscuous mode
Dec 05 12:11:20 compute-0 NetworkManager[55691]: <info>  [1764936680.0226] device (tape0aaf2cf-09): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:11:20 compute-0 ovn_controller[95610]: 2025-12-05T12:11:20Z|00785|binding|INFO|Releasing lport e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 from this chassis (sb_readonly=0)
Dec 05 12:11:20 compute-0 ovn_controller[95610]: 2025-12-05T12:11:20Z|00786|binding|INFO|Setting lport e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 down in Southbound
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.028 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 ovn_controller[95610]: 2025-12-05T12:11:20Z|00787|binding|INFO|Removing iface tape0aaf2cf-09 ovn-installed in OVS
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.030 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.049 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d00000051.scope: Deactivated successfully.
Dec 05 12:11:20 compute-0 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d00000051.scope: Consumed 4.403s CPU time.
Dec 05 12:11:20 compute-0 systemd-machined[153543]: Machine qemu-92-instance-00000051 terminated.
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.195 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:22:ff 10.100.0.7'], port_security=['fa:16:3e:39:22:ff 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9147fd8a-b772-485b-8dff-7bb1a235c4dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e0aaf2cf-092f-46ad-90e2-9d960c0b7f06) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.196 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be unbound from our chassis
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.198 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.200 187212 DEBUG nova.compute.manager [req-a2fe4df7-4fa5-4894-bdcd-27fcd1fc9a68 req-def65f15-f537-4c00-943e-4b4d13545b79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Received event network-vif-plugged-4568b7d9-2870-421a-abd0-1598977ec82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.201 187212 DEBUG oslo_concurrency.lockutils [req-a2fe4df7-4fa5-4894-bdcd-27fcd1fc9a68 req-def65f15-f537-4c00-943e-4b4d13545b79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.201 187212 DEBUG oslo_concurrency.lockutils [req-a2fe4df7-4fa5-4894-bdcd-27fcd1fc9a68 req-def65f15-f537-4c00-943e-4b4d13545b79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.201 187212 DEBUG oslo_concurrency.lockutils [req-a2fe4df7-4fa5-4894-bdcd-27fcd1fc9a68 req-def65f15-f537-4c00-943e-4b4d13545b79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.202 187212 DEBUG nova.compute.manager [req-a2fe4df7-4fa5-4894-bdcd-27fcd1fc9a68 req-def65f15-f537-4c00-943e-4b4d13545b79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] No waiting events found dispatching network-vif-plugged-4568b7d9-2870-421a-abd0-1598977ec82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.202 187212 WARNING nova.compute.manager [req-a2fe4df7-4fa5-4894-bdcd-27fcd1fc9a68 req-def65f15-f537-4c00-943e-4b4d13545b79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Received unexpected event network-vif-plugged-4568b7d9-2870-421a-abd0-1598977ec82d for instance with vm_state active and task_state None.
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.202 187212 DEBUG nova.compute.manager [req-a2fe4df7-4fa5-4894-bdcd-27fcd1fc9a68 req-def65f15-f537-4c00-943e-4b4d13545b79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-vif-deleted-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.217 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c0de0c-72c7-458e-a13b-24a36c65c68a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.250 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef628e8-3c5d-48f4-bb86-dcd8c623f230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.254 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bb03e5-7245-4354-9916-e06074b89b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.270 187212 INFO nova.virt.libvirt.driver [-] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Instance destroyed successfully.
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.271 187212 DEBUG nova.objects.instance [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'resources' on Instance uuid 9147fd8a-b772-485b-8dff-7bb1a235c4dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.285 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d2194475-7a16-4c3b-b8d3-9e40522f1056]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.302 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc73154-9798-4a10-9b11-98aff83b6769]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82130d25-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:36:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403095, 'reachable_time': 38277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233600, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.321 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[df3944eb-8721-4daa-81e5-4e1cbde20941]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap82130d25-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403109, 'tstamp': 403109}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233605, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap82130d25-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403113, 'tstamp': 403113}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233605, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.322 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82130d25-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.324 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.332 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 podman[233589]: 2025-12-05 12:11:20.332898052 +0000 UTC m=+0.053894257 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.333 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82130d25-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.333 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.333 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82130d25-f0, col_values=(('external_ids', {'iface-id': 'f81c4a80-27d3-4231-a37a-7c231838aca7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:20.333 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.356 187212 DEBUG nova.virt.libvirt.vif [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1940190658',display_name='tempest-ServersNegativeTestJSON-server-1940190658',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1940190658',id=81,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:11:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-f3tfdzox',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-ServersNegativeTestJSON-1063007033-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:11:16Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=9147fd8a-b772-485b-8dff-7bb1a235c4dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.357 187212 DEBUG nova.network.os_vif_util [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "address": "fa:16:3e:39:22:ff", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0aaf2cf-09", "ovs_interfaceid": "e0aaf2cf-092f-46ad-90e2-9d960c0b7f06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.357 187212 DEBUG nova.network.os_vif_util [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:22:ff,bridge_name='br-int',has_traffic_filtering=True,id=e0aaf2cf-092f-46ad-90e2-9d960c0b7f06,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0aaf2cf-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.357 187212 DEBUG os_vif [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:22:ff,bridge_name='br-int',has_traffic_filtering=True,id=e0aaf2cf-092f-46ad-90e2-9d960c0b7f06,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0aaf2cf-09') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.359 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.359 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape0aaf2cf-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.361 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.362 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.364 187212 INFO os_vif [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:22:ff,bridge_name='br-int',has_traffic_filtering=True,id=e0aaf2cf-092f-46ad-90e2-9d960c0b7f06,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0aaf2cf-09')
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.364 187212 INFO nova.virt.libvirt.driver [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Deleting instance files /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd_del
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.365 187212 INFO nova.virt.libvirt.driver [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Deletion of /var/lib/nova/instances/9147fd8a-b772-485b-8dff-7bb1a235c4dd_del complete
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.382 187212 DEBUG nova.compute.manager [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Received event network-vif-plugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.382 187212 DEBUG oslo_concurrency.lockutils [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.383 187212 DEBUG oslo_concurrency.lockutils [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.383 187212 DEBUG oslo_concurrency.lockutils [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.383 187212 DEBUG nova.compute.manager [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] No waiting events found dispatching network-vif-plugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.383 187212 WARNING nova.compute.manager [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Received unexpected event network-vif-plugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 for instance with vm_state active and task_state deleting.
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.383 187212 DEBUG nova.compute.manager [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-vif-unplugged-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.384 187212 DEBUG oslo_concurrency.lockutils [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.384 187212 DEBUG oslo_concurrency.lockutils [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.384 187212 DEBUG oslo_concurrency.lockutils [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.384 187212 DEBUG nova.compute.manager [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] No waiting events found dispatching network-vif-unplugged-8c343187-712d-4aee-9c47-18497ec1042e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.384 187212 WARNING nova.compute.manager [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received unexpected event network-vif-unplugged-8c343187-712d-4aee-9c47-18497ec1042e for instance with vm_state deleted and task_state None.
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.385 187212 DEBUG nova.compute.manager [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.385 187212 DEBUG oslo_concurrency.lockutils [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.385 187212 DEBUG oslo_concurrency.lockutils [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.385 187212 DEBUG oslo_concurrency.lockutils [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.385 187212 DEBUG nova.compute.manager [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] No waiting events found dispatching network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.386 187212 WARNING nova.compute.manager [req-e6fde87d-14ad-41d6-9589-e8252abe8b34 req-1bff603f-aec1-489e-9039-2e4106e8a314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received unexpected event network-vif-plugged-8c343187-712d-4aee-9c47-18497ec1042e for instance with vm_state deleted and task_state None.
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.527 187212 INFO nova.compute.manager [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Took 0.53 seconds to destroy the instance on the hypervisor.
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.527 187212 DEBUG oslo.service.loopingcall [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.528 187212 DEBUG nova.compute.manager [-] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.528 187212 DEBUG nova.network.neutron [-] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "235e69b1-e04e-4d65-92e2-864b105f03cf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.781 187212 DEBUG oslo_concurrency.lockutils [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.781 187212 DEBUG oslo_concurrency.lockutils [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.782 187212 INFO nova.compute.manager [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Terminating instance
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.783 187212 DEBUG nova.compute.manager [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:11:20 compute-0 kernel: tap4568b7d9-28 (unregistering): left promiscuous mode
Dec 05 12:11:20 compute-0 NetworkManager[55691]: <info>  [1764936680.8053] device (tap4568b7d9-28): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.810 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 ovn_controller[95610]: 2025-12-05T12:11:20Z|00788|binding|INFO|Releasing lport 4568b7d9-2870-421a-abd0-1598977ec82d from this chassis (sb_readonly=0)
Dec 05 12:11:20 compute-0 ovn_controller[95610]: 2025-12-05T12:11:20Z|00789|binding|INFO|Setting lport 4568b7d9-2870-421a-abd0-1598977ec82d down in Southbound
Dec 05 12:11:20 compute-0 ovn_controller[95610]: 2025-12-05T12:11:20Z|00790|binding|INFO|Removing iface tap4568b7d9-28 ovn-installed in OVS
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.811 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 nova_compute[187208]: 2025-12-05 12:11:20.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:20 compute-0 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d00000052.scope: Deactivated successfully.
Dec 05 12:11:20 compute-0 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d00000052.scope: Consumed 4.877s CPU time.
Dec 05 12:11:20 compute-0 systemd-machined[153543]: Machine qemu-93-instance-00000052 terminated.
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.049 187212 INFO nova.virt.libvirt.driver [-] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Instance destroyed successfully.
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.049 187212 DEBUG nova.objects.instance [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'resources' on Instance uuid 235e69b1-e04e-4d65-92e2-864b105f03cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.071 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:f0:09 10.100.0.13'], port_security=['fa:16:3e:39:f0:09 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '235e69b1-e04e-4d65-92e2-864b105f03cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4568b7d9-2870-421a-abd0-1598977ec82d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.073 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4568b7d9-2870-421a-abd0-1598977ec82d in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 unbound from our chassis
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.075 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.091 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cf050f99-3f1a-48fd-b838-075caa61d7cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.120 187212 DEBUG nova.virt.libvirt.vif [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-824833708',display_name='tempest-ServersTestJSON-server-824833708',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-824833708',id=82,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:11:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-711owwc2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:11:16Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=235e69b1-e04e-4d65-92e2-864b105f03cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.120 187212 DEBUG nova.network.os_vif_util [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "4568b7d9-2870-421a-abd0-1598977ec82d", "address": "fa:16:3e:39:f0:09", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4568b7d9-28", "ovs_interfaceid": "4568b7d9-2870-421a-abd0-1598977ec82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.121 187212 DEBUG nova.network.os_vif_util [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:f0:09,bridge_name='br-int',has_traffic_filtering=True,id=4568b7d9-2870-421a-abd0-1598977ec82d,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4568b7d9-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.122 187212 DEBUG os_vif [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:f0:09,bridge_name='br-int',has_traffic_filtering=True,id=4568b7d9-2870-421a-abd0-1598977ec82d,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4568b7d9-28') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.122 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8de39a98-af37-455f-8425-e53aa6058a50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.124 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4568b7d9-28, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.125 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.126 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[69d6ef87-148f-49eb-b9dc-dfe9cd1e5bde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.126 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.129 187212 INFO os_vif [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:f0:09,bridge_name='br-int',has_traffic_filtering=True,id=4568b7d9-2870-421a-abd0-1598977ec82d,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4568b7d9-28')
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.129 187212 INFO nova.virt.libvirt.driver [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Deleting instance files /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf_del
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.130 187212 INFO nova.virt.libvirt.driver [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Deletion of /var/lib/nova/instances/235e69b1-e04e-4d65-92e2-864b105f03cf_del complete
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.155 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4b50a514-3bfc-4493-906d-7543e531d6fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ffd472-7062-4515-b037-c6b4db54e275]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233645, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.194 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b6c324-139f-4892-b248-762ea0b83049]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233646, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233646, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.196 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.197 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.198 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.199 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.199 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.200 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:21.200 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.291 187212 INFO nova.compute.manager [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Took 0.51 seconds to destroy the instance on the hypervisor.
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.292 187212 DEBUG oslo.service.loopingcall [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.292 187212 DEBUG nova.compute.manager [-] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:11:21 compute-0 nova_compute[187208]: 2025-12-05 12:11:21.292 187212 DEBUG nova.network.neutron [-] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.450 187212 DEBUG nova.compute.manager [req-42061af8-a3ac-42c6-adee-c884937dbbbd req-3b4e4d1d-7b18-43a9-b554-6b5b784b5347 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Received event network-vif-unplugged-4568b7d9-2870-421a-abd0-1598977ec82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.451 187212 DEBUG oslo_concurrency.lockutils [req-42061af8-a3ac-42c6-adee-c884937dbbbd req-3b4e4d1d-7b18-43a9-b554-6b5b784b5347 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.451 187212 DEBUG oslo_concurrency.lockutils [req-42061af8-a3ac-42c6-adee-c884937dbbbd req-3b4e4d1d-7b18-43a9-b554-6b5b784b5347 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.451 187212 DEBUG oslo_concurrency.lockutils [req-42061af8-a3ac-42c6-adee-c884937dbbbd req-3b4e4d1d-7b18-43a9-b554-6b5b784b5347 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.451 187212 DEBUG nova.compute.manager [req-42061af8-a3ac-42c6-adee-c884937dbbbd req-3b4e4d1d-7b18-43a9-b554-6b5b784b5347 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] No waiting events found dispatching network-vif-unplugged-4568b7d9-2870-421a-abd0-1598977ec82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.452 187212 DEBUG nova.compute.manager [req-42061af8-a3ac-42c6-adee-c884937dbbbd req-3b4e4d1d-7b18-43a9-b554-6b5b784b5347 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Received event network-vif-unplugged-4568b7d9-2870-421a-abd0-1598977ec82d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.508 187212 DEBUG oslo_concurrency.lockutils [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.508 187212 DEBUG oslo_concurrency.lockutils [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.508 187212 DEBUG nova.network.neutron [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.553 187212 DEBUG nova.compute.manager [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-unplugged-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.553 187212 DEBUG oslo_concurrency.lockutils [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.553 187212 DEBUG oslo_concurrency.lockutils [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.554 187212 DEBUG oslo_concurrency.lockutils [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.555 187212 DEBUG nova.compute.manager [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] No waiting events found dispatching network-vif-unplugged-7b183eee-c877-4387-a2f2-78923af9a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.556 187212 WARNING nova.compute.manager [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received unexpected event network-vif-unplugged-7b183eee-c877-4387-a2f2-78923af9a88b for instance with vm_state active and task_state None.
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.556 187212 DEBUG nova.compute.manager [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.556 187212 DEBUG oslo_concurrency.lockutils [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.556 187212 DEBUG oslo_concurrency.lockutils [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.557 187212 DEBUG oslo_concurrency.lockutils [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.557 187212 DEBUG nova.compute.manager [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] No waiting events found dispatching network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.557 187212 WARNING nova.compute.manager [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received unexpected event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b for instance with vm_state active and task_state None.
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.557 187212 DEBUG nova.compute.manager [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Received event network-vif-unplugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.557 187212 DEBUG oslo_concurrency.lockutils [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.558 187212 DEBUG oslo_concurrency.lockutils [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.558 187212 DEBUG oslo_concurrency.lockutils [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.558 187212 DEBUG nova.compute.manager [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] No waiting events found dispatching network-vif-unplugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.558 187212 DEBUG nova.compute.manager [req-bc9eb9e2-6651-4c3a-8897-4e118f44b7e8 req-96a8caf4-0686-4413-8883-61679debaf95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Received event network-vif-unplugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.587 187212 DEBUG nova.network.neutron [-] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.632 187212 INFO nova.compute.manager [-] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Took 2.10 seconds to deallocate network for instance.
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.705 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "a70dccfb-2a89-4283-aba2-934af2667db3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.706 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.710 187212 DEBUG oslo_concurrency.lockutils [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.711 187212 DEBUG oslo_concurrency.lockutils [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.874 187212 DEBUG nova.compute.provider_tree [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.944 187212 DEBUG nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.948 187212 DEBUG nova.scheduler.client.report [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:22 compute-0 nova_compute[187208]: 2025-12-05 12:11:22.988 187212 DEBUG oslo_concurrency.lockutils [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.013 187212 INFO nova.scheduler.client.report [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Deleted allocations for instance 9147fd8a-b772-485b-8dff-7bb1a235c4dd
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.017 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.017 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.024 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.025 187212 INFO nova.compute.claims [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.123 187212 DEBUG oslo_concurrency.lockutils [None req-578eda78-bacd-40aa-a7d8-c439a524ec17 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.176 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.199 187212 DEBUG nova.network.neutron [-] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.223 187212 INFO nova.compute.manager [-] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Took 1.93 seconds to deallocate network for instance.
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.286 187212 DEBUG oslo_concurrency.lockutils [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.297 187212 DEBUG nova.compute.provider_tree [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.325 187212 DEBUG nova.scheduler.client.report [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.348 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.349 187212 DEBUG nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.352 187212 DEBUG oslo_concurrency.lockutils [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.404 187212 DEBUG nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.405 187212 DEBUG nova.network.neutron [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.442 187212 INFO nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.463 187212 DEBUG nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.563 187212 DEBUG nova.compute.provider_tree [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.583 187212 DEBUG nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.584 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.585 187212 INFO nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Creating image(s)
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.586 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "/var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.586 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.587 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.601 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.643 187212 DEBUG nova.scheduler.client.report [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.671 187212 DEBUG oslo_concurrency.lockutils [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.705 187212 INFO nova.scheduler.client.report [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Deleted allocations for instance 235e69b1-e04e-4d65-92e2-864b105f03cf
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.707 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.707 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.708 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.719 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.749 187212 DEBUG nova.policy [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef254bb2df0442c6bcadfb3a6861c0e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e836357870d746e49bc783da7cd3accd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.775 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.776 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.798 187212 DEBUG oslo_concurrency.lockutils [None req-a21d15c5-d9fd-4287-acac-d8ef156054d1 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.815 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.816 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.817 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.874 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.875 187212 DEBUG nova.virt.disk.api [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Checking if we can resize image /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.876 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.934 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.935 187212 DEBUG nova.virt.disk.api [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Cannot resize image /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.936 187212 DEBUG nova.objects.instance [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'migration_context' on Instance uuid a70dccfb-2a89-4283-aba2-934af2667db3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.978 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.979 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Ensure instance console log exists: /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.979 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.980 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:23 compute-0 nova_compute[187208]: 2025-12-05 12:11:23.980 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:24 compute-0 nova_compute[187208]: 2025-12-05 12:11:24.985 187212 DEBUG nova.network.neutron [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Successfully created port: 4870c9e1-8549-42a1-a77d-f9824cc38f59 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.049 187212 DEBUG nova.compute.manager [req-38d22882-cb51-4458-a93e-6048e78cfc88 req-13fc82b7-1eed-45d3-89e0-9418cddb25e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Received event network-vif-plugged-4568b7d9-2870-421a-abd0-1598977ec82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.050 187212 DEBUG oslo_concurrency.lockutils [req-38d22882-cb51-4458-a93e-6048e78cfc88 req-13fc82b7-1eed-45d3-89e0-9418cddb25e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.050 187212 DEBUG oslo_concurrency.lockutils [req-38d22882-cb51-4458-a93e-6048e78cfc88 req-13fc82b7-1eed-45d3-89e0-9418cddb25e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.050 187212 DEBUG oslo_concurrency.lockutils [req-38d22882-cb51-4458-a93e-6048e78cfc88 req-13fc82b7-1eed-45d3-89e0-9418cddb25e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "235e69b1-e04e-4d65-92e2-864b105f03cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.051 187212 DEBUG nova.compute.manager [req-38d22882-cb51-4458-a93e-6048e78cfc88 req-13fc82b7-1eed-45d3-89e0-9418cddb25e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] No waiting events found dispatching network-vif-plugged-4568b7d9-2870-421a-abd0-1598977ec82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.051 187212 WARNING nova.compute.manager [req-38d22882-cb51-4458-a93e-6048e78cfc88 req-13fc82b7-1eed-45d3-89e0-9418cddb25e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Received unexpected event network-vif-plugged-4568b7d9-2870-421a-abd0-1598977ec82d for instance with vm_state deleted and task_state None.
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.051 187212 DEBUG nova.compute.manager [req-38d22882-cb51-4458-a93e-6048e78cfc88 req-13fc82b7-1eed-45d3-89e0-9418cddb25e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Received event network-vif-deleted-4568b7d9-2870-421a-abd0-1598977ec82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.214 187212 DEBUG nova.compute.manager [req-5dc8d585-bd62-4263-976b-e075bfbada45 req-de76a09a-5ed3-4a07-8f92-d1e5dd03b7ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Received event network-vif-plugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.214 187212 DEBUG oslo_concurrency.lockutils [req-5dc8d585-bd62-4263-976b-e075bfbada45 req-de76a09a-5ed3-4a07-8f92-d1e5dd03b7ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.216 187212 DEBUG oslo_concurrency.lockutils [req-5dc8d585-bd62-4263-976b-e075bfbada45 req-de76a09a-5ed3-4a07-8f92-d1e5dd03b7ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.216 187212 DEBUG oslo_concurrency.lockutils [req-5dc8d585-bd62-4263-976b-e075bfbada45 req-de76a09a-5ed3-4a07-8f92-d1e5dd03b7ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9147fd8a-b772-485b-8dff-7bb1a235c4dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.217 187212 DEBUG nova.compute.manager [req-5dc8d585-bd62-4263-976b-e075bfbada45 req-de76a09a-5ed3-4a07-8f92-d1e5dd03b7ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] No waiting events found dispatching network-vif-plugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.217 187212 WARNING nova.compute.manager [req-5dc8d585-bd62-4263-976b-e075bfbada45 req-de76a09a-5ed3-4a07-8f92-d1e5dd03b7ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Received unexpected event network-vif-plugged-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 for instance with vm_state deleted and task_state None.
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.218 187212 DEBUG nova.compute.manager [req-5dc8d585-bd62-4263-976b-e075bfbada45 req-de76a09a-5ed3-4a07-8f92-d1e5dd03b7ad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Received event network-vif-deleted-e0aaf2cf-092f-46ad-90e2-9d960c0b7f06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.220 187212 INFO nova.network.neutron [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Port 7b183eee-c877-4387-a2f2-78923af9a88b from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.220 187212 DEBUG nova.network.neutron [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.245 187212 DEBUG oslo_concurrency.lockutils [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.269 187212 DEBUG oslo_concurrency.lockutils [None req-912a0dcd-8bd9-48e8-aacb-5d6be9192016 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-54d9605a-998b-4492-afc8-f7a5b0dd4e84-7b183eee-c877-4387-a2f2-78923af9a88b" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 8.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:25 compute-0 nova_compute[187208]: 2025-12-05 12:11:25.499 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:26 compute-0 nova_compute[187208]: 2025-12-05 12:11:26.126 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:26 compute-0 nova_compute[187208]: 2025-12-05 12:11:26.586 187212 DEBUG nova.network.neutron [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Successfully updated port: 4870c9e1-8549-42a1-a77d-f9824cc38f59 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:26 compute-0 nova_compute[187208]: 2025-12-05 12:11:26.623 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "refresh_cache-a70dccfb-2a89-4283-aba2-934af2667db3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:26 compute-0 nova_compute[187208]: 2025-12-05 12:11:26.624 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquired lock "refresh_cache-a70dccfb-2a89-4283-aba2-934af2667db3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:26 compute-0 nova_compute[187208]: 2025-12-05 12:11:26.624 187212 DEBUG nova.network.neutron [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:26 compute-0 nova_compute[187208]: 2025-12-05 12:11:26.929 187212 DEBUG nova.network.neutron [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:11:27 compute-0 podman[233663]: 2025-12-05 12:11:27.200949356 +0000 UTC m=+0.053445035 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:11:27 compute-0 nova_compute[187208]: 2025-12-05 12:11:27.452 187212 DEBUG nova.compute.manager [req-e9709137-40af-4bdc-b779-efeb427f18bc req-26492240-29fb-4cc0-9ad5-750279d34d76 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Received event network-changed-4870c9e1-8549-42a1-a77d-f9824cc38f59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:27 compute-0 nova_compute[187208]: 2025-12-05 12:11:27.452 187212 DEBUG nova.compute.manager [req-e9709137-40af-4bdc-b779-efeb427f18bc req-26492240-29fb-4cc0-9ad5-750279d34d76 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Refreshing instance network info cache due to event network-changed-4870c9e1-8549-42a1-a77d-f9824cc38f59. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:27 compute-0 nova_compute[187208]: 2025-12-05 12:11:27.453 187212 DEBUG oslo_concurrency.lockutils [req-e9709137-40af-4bdc-b779-efeb427f18bc req-26492240-29fb-4cc0-9ad5-750279d34d76 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-a70dccfb-2a89-4283-aba2-934af2667db3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:28 compute-0 nova_compute[187208]: 2025-12-05 12:11:28.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:28 compute-0 nova_compute[187208]: 2025-12-05 12:11:28.902 187212 DEBUG nova.network.neutron [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Updating instance_info_cache with network_info: [{"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.135 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Releasing lock "refresh_cache-a70dccfb-2a89-4283-aba2-934af2667db3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.136 187212 DEBUG nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Instance network_info: |[{"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.136 187212 DEBUG oslo_concurrency.lockutils [req-e9709137-40af-4bdc-b779-efeb427f18bc req-26492240-29fb-4cc0-9ad5-750279d34d76 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-a70dccfb-2a89-4283-aba2-934af2667db3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.136 187212 DEBUG nova.network.neutron [req-e9709137-40af-4bdc-b779-efeb427f18bc req-26492240-29fb-4cc0-9ad5-750279d34d76 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Refreshing network info cache for port 4870c9e1-8549-42a1-a77d-f9824cc38f59 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.141 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Start _get_guest_xml network_info=[{"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.146 187212 WARNING nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.152 187212 DEBUG nova.virt.libvirt.host [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.153 187212 DEBUG nova.virt.libvirt.host [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.160 187212 DEBUG nova.virt.libvirt.host [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.161 187212 DEBUG nova.virt.libvirt.host [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.162 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.162 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.163 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.163 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.163 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.164 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.164 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.164 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.165 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.165 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.165 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.166 187212 DEBUG nova.virt.hardware [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.171 187212 DEBUG nova.virt.libvirt.vif [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1890664302',display_name='tempest-ServerDiskConfigTestJSON-server-1890664302',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1890664302',id=83,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ijylq08c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:23Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=a70dccfb-2a89-4283-aba2-934af2667db3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.171 187212 DEBUG nova.network.os_vif_util [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.172 187212 DEBUG nova.network.os_vif_util [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:5a:9c,bridge_name='br-int',has_traffic_filtering=True,id=4870c9e1-8549-42a1-a77d-f9824cc38f59,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4870c9e1-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.173 187212 DEBUG nova.objects.instance [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_devices' on Instance uuid a70dccfb-2a89-4283-aba2-934af2667db3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.328 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <uuid>a70dccfb-2a89-4283-aba2-934af2667db3</uuid>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <name>instance-00000053</name>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-1890664302</nova:name>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:11:29</nova:creationTime>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:11:29 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:11:29 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:11:29 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:11:29 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:29 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:11:29 compute-0 nova_compute[187208]:         <nova:user uuid="ef254bb2df0442c6bcadfb3a6861c0e9">tempest-ServerDiskConfigTestJSON-1245488084-project-member</nova:user>
Dec 05 12:11:29 compute-0 nova_compute[187208]:         <nova:project uuid="e836357870d746e49bc783da7cd3accd">tempest-ServerDiskConfigTestJSON-1245488084</nova:project>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:11:29 compute-0 nova_compute[187208]:         <nova:port uuid="4870c9e1-8549-42a1-a77d-f9824cc38f59">
Dec 05 12:11:29 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <entry name="serial">a70dccfb-2a89-4283-aba2-934af2667db3</entry>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <entry name="uuid">a70dccfb-2a89-4283-aba2-934af2667db3</entry>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk.config"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:f1:5a:9c"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <target dev="tap4870c9e1-85"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/console.log" append="off"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:11:29 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:11:29 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:29 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:29 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:29 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.330 187212 DEBUG nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Preparing to wait for external event network-vif-plugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.330 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.330 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.330 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.331 187212 DEBUG nova.virt.libvirt.vif [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1890664302',display_name='tempest-ServerDiskConfigTestJSON-server-1890664302',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1890664302',id=83,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ijylq08c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:23Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=a70dccfb-2a89-4283-aba2-934af2667db3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.331 187212 DEBUG nova.network.os_vif_util [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.332 187212 DEBUG nova.network.os_vif_util [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:5a:9c,bridge_name='br-int',has_traffic_filtering=True,id=4870c9e1-8549-42a1-a77d-f9824cc38f59,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4870c9e1-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.332 187212 DEBUG os_vif [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:5a:9c,bridge_name='br-int',has_traffic_filtering=True,id=4870c9e1-8549-42a1-a77d-f9824cc38f59,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4870c9e1-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.333 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.334 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.336 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4870c9e1-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.337 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4870c9e1-85, col_values=(('external_ids', {'iface-id': '4870c9e1-8549-42a1-a77d-f9824cc38f59', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:5a:9c', 'vm-uuid': 'a70dccfb-2a89-4283-aba2-934af2667db3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.338 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:29 compute-0 NetworkManager[55691]: <info>  [1764936689.3399] manager: (tap4870c9e1-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.341 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.344 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.346 187212 INFO os_vif [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:5a:9c,bridge_name='br-int',has_traffic_filtering=True,id=4870c9e1-8549-42a1-a77d-f9824cc38f59,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4870c9e1-85')
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.490 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.491 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.503 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.503 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.504 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No VIF found with MAC fa:16:3e:f1:5a:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.504 187212 INFO nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Using config drive
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.507 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.983 187212 DEBUG nova.compute.manager [req-a26f5fd5-f9ad-445e-9419-b0c97d73b171 req-ef4d3d0e-9fef-4ba3-8d97-1c3465e7153b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.984 187212 DEBUG nova.compute.manager [req-a26f5fd5-f9ad-445e-9419-b0c97d73b171 req-ef4d3d0e-9fef-4ba3-8d97-1c3465e7153b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing instance network info cache due to event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.985 187212 DEBUG oslo_concurrency.lockutils [req-a26f5fd5-f9ad-445e-9419-b0c97d73b171 req-ef4d3d0e-9fef-4ba3-8d97-1c3465e7153b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.985 187212 DEBUG oslo_concurrency.lockutils [req-a26f5fd5-f9ad-445e-9419-b0c97d73b171 req-ef4d3d0e-9fef-4ba3-8d97-1c3465e7153b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:29 compute-0 nova_compute[187208]: 2025-12-05 12:11:29.985 187212 DEBUG nova.network.neutron [req-a26f5fd5-f9ad-445e-9419-b0c97d73b171 req-ef4d3d0e-9fef-4ba3-8d97-1c3465e7153b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.091 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.092 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.097 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.097 187212 INFO nova.compute.claims [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.747 187212 INFO nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Creating config drive at /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk.config
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.752 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbbgolh0n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.881 187212 DEBUG oslo_concurrency.processutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbbgolh0n" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.913 187212 DEBUG nova.compute.provider_tree [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:30 compute-0 kernel: tap4870c9e1-85: entered promiscuous mode
Dec 05 12:11:30 compute-0 NetworkManager[55691]: <info>  [1764936690.9488] manager: (tap4870c9e1-85): new Tun device (/org/freedesktop/NetworkManager/Devices/312)
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.949 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:30 compute-0 ovn_controller[95610]: 2025-12-05T12:11:30Z|00791|binding|INFO|Claiming lport 4870c9e1-8549-42a1-a77d-f9824cc38f59 for this chassis.
Dec 05 12:11:30 compute-0 ovn_controller[95610]: 2025-12-05T12:11:30Z|00792|binding|INFO|4870c9e1-8549-42a1-a77d-f9824cc38f59: Claiming fa:16:3e:f1:5a:9c 10.100.0.13
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.963 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:30 compute-0 ovn_controller[95610]: 2025-12-05T12:11:30Z|00793|binding|INFO|Setting lport 4870c9e1-8549-42a1-a77d-f9824cc38f59 ovn-installed in OVS
Dec 05 12:11:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:30.964 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:5a:9c 10.100.0.13'], port_security=['fa:16:3e:f1:5a:9c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a70dccfb-2a89-4283-aba2-934af2667db3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4870c9e1-8549-42a1-a77d-f9824cc38f59) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:30 compute-0 ovn_controller[95610]: 2025-12-05T12:11:30Z|00794|binding|INFO|Setting lport 4870c9e1-8549-42a1-a77d-f9824cc38f59 up in Southbound
Dec 05 12:11:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:30.965 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4870c9e1-8549-42a1-a77d-f9824cc38f59 in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c bound to our chassis
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.965 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:30.968 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:11:30 compute-0 nova_compute[187208]: 2025-12-05 12:11:30.971 187212 DEBUG nova.scheduler.client.report [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:30 compute-0 systemd-udevd[233706]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:30.980 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e6c771-dbab-4cfa-8056-cebcd4c1ade1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:30.981 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7be4540a-01 in ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:11:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:30.984 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7be4540a-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:11:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:30.984 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[53b716e7-8990-4d08-9717-41bbf428e1fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:30.985 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[32518b89-f9c3-4d5a-aa24-fdca31701730]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:30 compute-0 NetworkManager[55691]: <info>  [1764936690.9945] device (tap4870c9e1-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:30 compute-0 NetworkManager[55691]: <info>  [1764936690.9953] device (tap4870c9e1-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:30 compute-0 systemd-machined[153543]: New machine qemu-94-instance-00000053.
Dec 05 12:11:30 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:30.997 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a4dd2b4b-8eb1-4387-a72d-c4370f8adc06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.021 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8cbe2d-4cc9-442a-abd9-faff29ca2d43]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 systemd[1]: Started Virtual Machine qemu-94-instance-00000053.
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.052 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a773c71e-dec1-4155-a3cc-fa1749217d71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 systemd-udevd[233710]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:31 compute-0 NetworkManager[55691]: <info>  [1764936691.0588] manager: (tap7be4540a-00): new Veth device (/org/freedesktop/NetworkManager/Devices/313)
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.060 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[13e2bfdc-56d0-4f1a-9610-e49b2ecc59c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.097 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[85204051-ee42-45de-8a6f-51932e830ac4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.100 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9d27aac7-b21e-46c4-b371-2ff09c3219a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 NetworkManager[55691]: <info>  [1764936691.1291] device (tap7be4540a-00): carrier: link connected
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.136 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4b326beb-febe-4b7c-b403-7ed70faeef25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.153 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[76346279-6918-48bc-9010-41136b042208]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 407169, 'reachable_time': 21673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233739, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.172 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[82b3b74f-f784-47f6-9a4e-54f8c5f73ccb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:4893'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 407169, 'tstamp': 407169}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233740, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.192 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91a1dff8-9219-4e2c-90ea-5c20f6af7cef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 407169, 'reachable_time': 21673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233741, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.226 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0d7004b4-d725-45eb-8366-bf167832535a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.301 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.303 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.302 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[48de92f5-8c3f-4740-9b52-9e6a43d9a153]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.304 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.304 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.305 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7be4540a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.306 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:31 compute-0 NetworkManager[55691]: <info>  [1764936691.3076] manager: (tap7be4540a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Dec 05 12:11:31 compute-0 kernel: tap7be4540a-00: entered promiscuous mode
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.310 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.312 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7be4540a-00, col_values=(('external_ids', {'iface-id': '4dcf8e96-bf04-4914-959a-aad071dfa454'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:31 compute-0 ovn_controller[95610]: 2025-12-05T12:11:31Z|00795|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.329 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.331 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.333 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca81afe-5755-4997-968f-04029cc02d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.335 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:11:31 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:31.335 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'env', 'PROCESS_TAG=haproxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.353 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.354 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.383 187212 INFO nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.410 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.577 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.579 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.579 187212 INFO nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Creating image(s)
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.580 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "/var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.581 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "/var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.582 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "/var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.598 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.660 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.661 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.662 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.678 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.737 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936676.7358205, 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.738 187212 INFO nova.compute.manager [-] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] VM Stopped (Lifecycle Event)
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.741 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.742 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.764 187212 DEBUG nova.compute.manager [None req-45be81c5-e9ba-4b3b-9704-a3a19a3dc672 - - - - - -] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.776 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936691.776057, a70dccfb-2a89-4283-aba2-934af2667db3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.776 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] VM Started (Lifecycle Event)
Dec 05 12:11:31 compute-0 podman[233778]: 2025-12-05 12:11:31.695508148 +0000 UTC m=+0.024556946 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.798 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.803 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936691.7767563, a70dccfb-2a89-4283-aba2-934af2667db3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.803 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] VM Paused (Lifecycle Event)
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.828 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.833 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.856 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.888 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk 1073741824" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.889 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.889 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:31 compute-0 podman[233778]: 2025-12-05 12:11:31.907193313 +0000 UTC m=+0.236242111 container create 995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:11:31 compute-0 systemd[1]: Started libpod-conmon-995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c.scope.
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.965 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.967 187212 DEBUG nova.virt.disk.api [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Checking if we can resize image /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:11:31 compute-0 nova_compute[187208]: 2025-12-05 12:11:31.967 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:31 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:11:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eea16742dc7dafaa5415894d6e2f2bbe9e168c04b0874ff39379e45f2a2874c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:11:32 compute-0 podman[233778]: 2025-12-05 12:11:32.012836925 +0000 UTC m=+0.341885713 container init 995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 12:11:32 compute-0 podman[233778]: 2025-12-05 12:11:32.019278699 +0000 UTC m=+0.348327497 container start 995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.032 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.033 187212 DEBUG nova.virt.disk.api [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Cannot resize image /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.034 187212 DEBUG nova.objects.instance [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lazy-loading 'migration_context' on Instance uuid ef8fcf55-e147-4baf-b506-1d99af05d330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:32 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233805]: [NOTICE]   (233812) : New worker (233816) forked
Dec 05 12:11:32 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233805]: [NOTICE]   (233812) : Loading success.
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.054 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.054 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Ensure instance console log exists: /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.055 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.055 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.056 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.202 187212 DEBUG nova.network.neutron [req-e9709137-40af-4bdc-b779-efeb427f18bc req-26492240-29fb-4cc0-9ad5-750279d34d76 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Updated VIF entry in instance network info cache for port 4870c9e1-8549-42a1-a77d-f9824cc38f59. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.203 187212 DEBUG nova.network.neutron [req-e9709137-40af-4bdc-b779-efeb427f18bc req-26492240-29fb-4cc0-9ad5-750279d34d76 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Updating instance_info_cache with network_info: [{"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.274 187212 DEBUG nova.policy [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '430719002c284cd28237859ea6061eef', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.289 187212 DEBUG oslo_concurrency.lockutils [req-e9709137-40af-4bdc-b779-efeb427f18bc req-26492240-29fb-4cc0-9ad5-750279d34d76 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-a70dccfb-2a89-4283-aba2-934af2667db3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.350 187212 DEBUG oslo_concurrency.lockutils [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-7b183eee-c877-4387-a2f2-78923af9a88b" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.350 187212 DEBUG oslo_concurrency.lockutils [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-7b183eee-c877-4387-a2f2-78923af9a88b" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.351 187212 DEBUG nova.objects.instance [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid ecc25cb4-5b3a-43f7-949d-ca9a1a19056a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.610 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.611 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.695 187212 DEBUG nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.935 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.936 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.945 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:11:32 compute-0 nova_compute[187208]: 2025-12-05 12:11:32.946 187212 INFO nova.compute.claims [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.201 187212 DEBUG nova.objects.instance [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid ecc25cb4-5b3a-43f7-949d-ca9a1a19056a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.232 187212 DEBUG nova.network.neutron [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.292 187212 DEBUG nova.compute.provider_tree [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.315 187212 DEBUG nova.scheduler.client.report [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.474 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.474 187212 DEBUG nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.533 187212 DEBUG nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.533 187212 DEBUG nova.network.neutron [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.558 187212 INFO nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.627 187212 DEBUG nova.policy [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:11:33 compute-0 nova_compute[187208]: 2025-12-05 12:11:33.662 187212 DEBUG nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.075 187212 DEBUG nova.compute.manager [req-a4b3341c-001a-4806-8d22-cad0ef113cdc req-52c2c0c0-74c8-4efd-a9d6-0d87ec26647e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.075 187212 DEBUG nova.compute.manager [req-a4b3341c-001a-4806-8d22-cad0ef113cdc req-52c2c0c0-74c8-4efd-a9d6-0d87ec26647e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing instance network info cache due to event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.076 187212 DEBUG oslo_concurrency.lockutils [req-a4b3341c-001a-4806-8d22-cad0ef113cdc req-52c2c0c0-74c8-4efd-a9d6-0d87ec26647e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.076 187212 DEBUG oslo_concurrency.lockutils [req-a4b3341c-001a-4806-8d22-cad0ef113cdc req-52c2c0c0-74c8-4efd-a9d6-0d87ec26647e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.076 187212 DEBUG nova.network.neutron [req-a4b3341c-001a-4806-8d22-cad0ef113cdc req-52c2c0c0-74c8-4efd-a9d6-0d87ec26647e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:34 compute-0 podman[233826]: 2025-12-05 12:11:34.204936812 +0000 UTC m=+0.053297621 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 12:11:34 compute-0 podman[233825]: 2025-12-05 12:11:34.223842544 +0000 UTC m=+0.073581052 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.324 187212 DEBUG nova.policy [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.339 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.901 187212 DEBUG nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.903 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.903 187212 INFO nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Creating image(s)
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.904 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "/var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.904 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.905 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.920 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.992 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.994 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:34 compute-0 nova_compute[187208]: 2025-12-05 12:11:34.995 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.007 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.076 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.078 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.105 187212 DEBUG nova.network.neutron [req-a26f5fd5-f9ad-445e-9419-b0c97d73b171 req-ef4d3d0e-9fef-4ba3-8d97-1c3465e7153b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updated VIF entry in instance network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.106 187212 DEBUG nova.network.neutron [req-a26f5fd5-f9ad-445e-9419-b0c97d73b171 req-ef4d3d0e-9fef-4ba3-8d97-1c3465e7153b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.147 187212 DEBUG oslo_concurrency.lockutils [req-a26f5fd5-f9ad-445e-9419-b0c97d73b171 req-ef4d3d0e-9fef-4ba3-8d97-1c3465e7153b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.269 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936680.2682638, 9147fd8a-b772-485b-8dff-7bb1a235c4dd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.270 187212 INFO nova.compute.manager [-] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] VM Stopped (Lifecycle Event)
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.293 187212 DEBUG nova.compute.manager [None req-4d9110a8-55ec-4197-895d-45611cd0e58c - - - - - -] [instance: 9147fd8a-b772-485b-8dff-7bb1a235c4dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.321 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk 1073741824" returned: 0 in 0.243s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.322 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.322 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.379 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.380 187212 DEBUG nova.virt.disk.api [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Checking if we can resize image /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.380 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.401 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Successfully created port: 5f1f909d-4147-44de-9adf-829a12fc8bfa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.442 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.443 187212 DEBUG nova.virt.disk.api [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Cannot resize image /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.444 187212 DEBUG nova.objects.instance [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'migration_context' on Instance uuid c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.459 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.460 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Ensure instance console log exists: /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.460 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.461 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.461 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.860 187212 DEBUG nova.network.neutron [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Successfully updated port: 7b183eee-c877-4387-a2f2-78923af9a88b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:35 compute-0 nova_compute[187208]: 2025-12-05 12:11:35.883 187212 DEBUG oslo_concurrency.lockutils [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:36 compute-0 nova_compute[187208]: 2025-12-05 12:11:36.048 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936681.0475948, 235e69b1-e04e-4d65-92e2-864b105f03cf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:36 compute-0 nova_compute[187208]: 2025-12-05 12:11:36.049 187212 INFO nova.compute.manager [-] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] VM Stopped (Lifecycle Event)
Dec 05 12:11:36 compute-0 nova_compute[187208]: 2025-12-05 12:11:36.276 187212 DEBUG nova.compute.manager [None req-271f7511-50d8-41fc-8ab8-503626e55a80 - - - - - -] [instance: 235e69b1-e04e-4d65-92e2-864b105f03cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:37 compute-0 nova_compute[187208]: 2025-12-05 12:11:37.010 187212 DEBUG nova.network.neutron [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Successfully created port: 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:11:37 compute-0 nova_compute[187208]: 2025-12-05 12:11:37.139 187212 DEBUG nova.compute.manager [req-a640452f-9fa7-4562-a760-b65e300be332 req-bb793df2-90d7-499a-ae82-5bcb41192ba1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-changed-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:37 compute-0 nova_compute[187208]: 2025-12-05 12:11:37.140 187212 DEBUG nova.compute.manager [req-a640452f-9fa7-4562-a760-b65e300be332 req-bb793df2-90d7-499a-ae82-5bcb41192ba1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing instance network info cache due to event network-changed-7b183eee-c877-4387-a2f2-78923af9a88b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:37 compute-0 nova_compute[187208]: 2025-12-05 12:11:37.140 187212 DEBUG oslo_concurrency.lockutils [req-a640452f-9fa7-4562-a760-b65e300be332 req-bb793df2-90d7-499a-ae82-5bcb41192ba1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:37 compute-0 nova_compute[187208]: 2025-12-05 12:11:37.569 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Successfully created port: 67a9975d-5d22-4a6a-af5f-83ab6b080d9a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:11:37 compute-0 nova_compute[187208]: 2025-12-05 12:11:37.585 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.029 187212 DEBUG nova.compute.manager [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Received event network-vif-plugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.030 187212 DEBUG oslo_concurrency.lockutils [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.032 187212 DEBUG oslo_concurrency.lockutils [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.032 187212 DEBUG oslo_concurrency.lockutils [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.032 187212 DEBUG nova.compute.manager [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Processing event network-vif-plugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.032 187212 DEBUG nova.compute.manager [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Received event network-vif-plugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.033 187212 DEBUG oslo_concurrency.lockutils [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.033 187212 DEBUG oslo_concurrency.lockutils [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.033 187212 DEBUG oslo_concurrency.lockutils [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.033 187212 DEBUG nova.compute.manager [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] No waiting events found dispatching network-vif-plugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.034 187212 WARNING nova.compute.manager [req-e4b9618d-743f-4d73-86b9-81d750785ee6 req-56536aed-0b61-4a1a-ac1a-17ca8bb6da70 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Received unexpected event network-vif-plugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 for instance with vm_state building and task_state spawning.
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.034 187212 DEBUG nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.039 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936698.0385406, a70dccfb-2a89-4283-aba2-934af2667db3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.039 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] VM Resumed (Lifecycle Event)
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.041 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.044 187212 INFO nova.virt.libvirt.driver [-] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Instance spawned successfully.
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.045 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.069 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.075 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.078 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.078 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.079 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.079 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.079 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.080 187212 DEBUG nova.virt.libvirt.driver [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.125 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.136 187212 DEBUG nova.network.neutron [req-a4b3341c-001a-4806-8d22-cad0ef113cdc req-52c2c0c0-74c8-4efd-a9d6-0d87ec26647e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updated VIF entry in instance network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.136 187212 DEBUG nova.network.neutron [req-a4b3341c-001a-4806-8d22-cad0ef113cdc req-52c2c0c0-74c8-4efd-a9d6-0d87ec26647e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.147 187212 INFO nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Took 14.56 seconds to spawn the instance on the hypervisor.
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.148 187212 DEBUG nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.160 187212 DEBUG oslo_concurrency.lockutils [req-a4b3341c-001a-4806-8d22-cad0ef113cdc req-52c2c0c0-74c8-4efd-a9d6-0d87ec26647e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.161 187212 DEBUG oslo_concurrency.lockutils [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.161 187212 DEBUG nova.network.neutron [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.182 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.218 187212 INFO nova.compute.manager [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Took 15.22 seconds to build instance.
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.249 187212 DEBUG oslo_concurrency.lockutils [None req-04bddc3e-2bf0-4398-ad94-13ab8b12af6c ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.553 187212 WARNING nova.network.neutron [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.593 187212 DEBUG nova.network.neutron [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Successfully updated port: 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.614 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "refresh_cache-c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.614 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquired lock "refresh_cache-c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.614 187212 DEBUG nova.network.neutron [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.841 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Successfully created port: 04328ce4-34a7-41df-8b27-e1d5b7f3f280 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:11:38 compute-0 nova_compute[187208]: 2025-12-05 12:11:38.899 187212 DEBUG nova.network.neutron [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:11:39 compute-0 nova_compute[187208]: 2025-12-05 12:11:39.342 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:40 compute-0 podman[233876]: 2025-12-05 12:11:40.20853519 +0000 UTC m=+0.057829941 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 12:11:40 compute-0 podman[233877]: 2025-12-05 12:11:40.263548199 +0000 UTC m=+0.105980433 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.293 187212 DEBUG nova.network.neutron [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Updating instance_info_cache with network_info: [{"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.340 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Releasing lock "refresh_cache-c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.341 187212 DEBUG nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Instance network_info: |[{"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.343 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Start _get_guest_xml network_info=[{"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.348 187212 WARNING nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.356 187212 DEBUG nova.virt.libvirt.host [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.357 187212 DEBUG nova.virt.libvirt.host [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.360 187212 DEBUG nova.virt.libvirt.host [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.361 187212 DEBUG nova.virt.libvirt.host [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.362 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.362 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.363 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.363 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.363 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.363 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.364 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.364 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.364 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.365 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.365 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.365 187212 DEBUG nova.virt.hardware [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.372 187212 DEBUG nova.virt.libvirt.vif [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1213220860',display_name='tempest-ServersTestJSON-server-1213220860',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1213220860',id=85,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBqXxjjZG3zRpoVdmm3gug/kuSI6oQzY8us2IsTu/9Zy7Ex4wlL/7VzaOtZ9WYCzOWxb9KMd4tJR1k9CVgqnqR3cmI+vR+6Obgtfj46DxP9OaK/oOG4PK0jIDIbrTs0g7A==',key_name='tempest-key-1731656875',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-iecebzas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:34Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.372 187212 DEBUG nova.network.os_vif_util [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.373 187212 DEBUG nova.network.os_vif_util [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:c3:62,bridge_name='br-int',has_traffic_filtering=True,id=9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e4c5b24-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.374 187212 DEBUG nova.objects.instance [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'pci_devices' on Instance uuid c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.391 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <uuid>c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7</uuid>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <name>instance-00000055</name>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestJSON-server-1213220860</nova:name>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:11:41</nova:creationTime>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:11:41 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:11:41 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:11:41 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:11:41 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:41 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:11:41 compute-0 nova_compute[187208]:         <nova:user uuid="62153b585ecc4e6fa2ad567851d49081">tempest-ServersTestJSON-1492365581-project-member</nova:user>
Dec 05 12:11:41 compute-0 nova_compute[187208]:         <nova:project uuid="0c982a61e3fc4c8da9248076bb0361ac">tempest-ServersTestJSON-1492365581</nova:project>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:11:41 compute-0 nova_compute[187208]:         <nova:port uuid="9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f">
Dec 05 12:11:41 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <entry name="serial">c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7</entry>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <entry name="uuid">c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7</entry>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk.config"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:6f:c3:62"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <target dev="tap9e4c5b24-e3"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/console.log" append="off"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:11:41 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:11:41 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:41 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:41 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:41 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.392 187212 DEBUG nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Preparing to wait for external event network-vif-plugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.392 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.392 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.392 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.393 187212 DEBUG nova.virt.libvirt.vif [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1213220860',display_name='tempest-ServersTestJSON-server-1213220860',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1213220860',id=85,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBqXxjjZG3zRpoVdmm3gug/kuSI6oQzY8us2IsTu/9Zy7Ex4wlL/7VzaOtZ9WYCzOWxb9KMd4tJR1k9CVgqnqR3cmI+vR+6Obgtfj46DxP9OaK/oOG4PK0jIDIbrTs0g7A==',key_name='tempest-key-1731656875',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-iecebzas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:34Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.393 187212 DEBUG nova.network.os_vif_util [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.394 187212 DEBUG nova.network.os_vif_util [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:c3:62,bridge_name='br-int',has_traffic_filtering=True,id=9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e4c5b24-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.394 187212 DEBUG os_vif [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:c3:62,bridge_name='br-int',has_traffic_filtering=True,id=9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e4c5b24-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.395 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.395 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.396 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.398 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.399 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e4c5b24-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.399 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9e4c5b24-e3, col_values=(('external_ids', {'iface-id': '9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:c3:62', 'vm-uuid': 'c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:41 compute-0 NetworkManager[55691]: <info>  [1764936701.4026] manager: (tap9e4c5b24-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.404 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.411 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.412 187212 INFO os_vif [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:c3:62,bridge_name='br-int',has_traffic_filtering=True,id=9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e4c5b24-e3')
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.494 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.495 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.495 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No VIF found with MAC fa:16:3e:6f:c3:62, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:41 compute-0 nova_compute[187208]: 2025-12-05 12:11:41.496 187212 INFO nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Using config drive
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.089 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.089 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.269 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Successfully updated port: 5f1f909d-4147-44de-9adf-829a12fc8bfa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.271 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.271 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.272 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.272 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 54d9605a-998b-4492-afc8-f7a5b0dd4e84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.509 187212 INFO nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Creating config drive at /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk.config
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.515 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8e2p42ng execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.643 187212 DEBUG oslo_concurrency.processutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8e2p42ng" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:42 compute-0 NetworkManager[55691]: <info>  [1764936702.7325] manager: (tap9e4c5b24-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/316)
Dec 05 12:11:42 compute-0 kernel: tap9e4c5b24-e3: entered promiscuous mode
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.738 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 ovn_controller[95610]: 2025-12-05T12:11:42Z|00796|binding|INFO|Claiming lport 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f for this chassis.
Dec 05 12:11:42 compute-0 ovn_controller[95610]: 2025-12-05T12:11:42Z|00797|binding|INFO|9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f: Claiming fa:16:3e:6f:c3:62 10.100.0.14
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.748 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:c3:62 10.100.0.14'], port_security=['fa:16:3e:6f:c3:62 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.749 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 bound to our chassis
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.752 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:11:42 compute-0 ovn_controller[95610]: 2025-12-05T12:11:42Z|00798|binding|INFO|Setting lport 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f ovn-installed in OVS
Dec 05 12:11:42 compute-0 ovn_controller[95610]: 2025-12-05T12:11:42Z|00799|binding|INFO|Setting lport 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f up in Southbound
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.761 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.767 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[01cd56fe-1d9e-4272-9ffc-bd4142b035f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:42 compute-0 systemd-machined[153543]: New machine qemu-95-instance-00000055.
Dec 05 12:11:42 compute-0 systemd-udevd[233961]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:42 compute-0 systemd[1]: Started Virtual Machine qemu-95-instance-00000055.
Dec 05 12:11:42 compute-0 NetworkManager[55691]: <info>  [1764936702.7963] device (tap9e4c5b24-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:42 compute-0 NetworkManager[55691]: <info>  [1764936702.8004] device (tap9e4c5b24-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.800 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa83bd9-7d96-41a8-afc3-2736479ea1d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.804 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f4a27f68-369b-4ea2-8861-9a1e8a380e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.814 187212 DEBUG nova.network.neutron [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:42 compute-0 podman[233937]: 2025-12-05 12:11:42.814635098 +0000 UTC m=+0.101348960 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.834 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[93c21a57-908c-4e46-a0e2-c72af18e21a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.849 187212 DEBUG oslo_concurrency.lockutils [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.850 187212 DEBUG oslo_concurrency.lockutils [req-a640452f-9fa7-4562-a760-b65e300be332 req-bb793df2-90d7-499a-ae82-5bcb41192ba1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.850 187212 DEBUG nova.network.neutron [req-a640452f-9fa7-4562-a760-b65e300be332 req-bb793df2-90d7-499a-ae82-5bcb41192ba1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing network info cache for port 7b183eee-c877-4387-a2f2-78923af9a88b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.854 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[814cd1e5-535e-4889-915d-23d84120c650]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233977, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.862 187212 DEBUG nova.virt.libvirt.vif [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.862 187212 DEBUG nova.network.os_vif_util [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.863 187212 DEBUG nova.network.os_vif_util [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.863 187212 DEBUG os_vif [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.864 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.864 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.864 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.867 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.868 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b183eee-c8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.868 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b183eee-c8, col_values=(('external_ids', {'iface-id': '7b183eee-c877-4387-a2f2-78923af9a88b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:13:10', 'vm-uuid': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:42 compute-0 NetworkManager[55691]: <info>  [1764936702.8706] manager: (tap7b183eee-c8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.871 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.878 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.880 187212 INFO os_vif [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8')
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.880 187212 DEBUG nova.virt.libvirt.vif [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.880 187212 DEBUG nova.network.os_vif_util [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.881 187212 DEBUG nova.network.os_vif_util [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.883 187212 DEBUG nova.virt.libvirt.guest [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec 05 12:11:42 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:58:13:10"/>
Dec 05 12:11:42 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:11:42 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:42 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:11:42 compute-0 nova_compute[187208]:   <target dev="tap7b183eee-c8"/>
Dec 05 12:11:42 compute-0 nova_compute[187208]: </interface>
Dec 05 12:11:42 compute-0 nova_compute[187208]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.883 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fc717f-c837-4b27-8b40-be8221690350]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233980, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233980, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.885 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.887 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.891 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.892 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.892 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.892 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:42 compute-0 kernel: tap7b183eee-c8: entered promiscuous mode
Dec 05 12:11:42 compute-0 NetworkManager[55691]: <info>  [1764936702.8965] manager: (tap7b183eee-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Dec 05 12:11:42 compute-0 systemd-udevd[233968]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:42 compute-0 ovn_controller[95610]: 2025-12-05T12:11:42Z|00800|binding|INFO|Claiming lport 7b183eee-c877-4387-a2f2-78923af9a88b for this chassis.
Dec 05 12:11:42 compute-0 ovn_controller[95610]: 2025-12-05T12:11:42Z|00801|binding|INFO|7b183eee-c877-4387-a2f2-78923af9a88b: Claiming fa:16:3e:58:13:10 10.100.0.5
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.899 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 NetworkManager[55691]: <info>  [1764936702.9119] device (tap7b183eee-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:42 compute-0 NetworkManager[55691]: <info>  [1764936702.9138] device (tap7b183eee-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.914 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:13:10 10.100.0.5'], port_security=['fa:16:3e:58:13:10 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1639127318', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1639127318', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7b183eee-c877-4387-a2f2-78923af9a88b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.915 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7b183eee-c877-4387-a2f2-78923af9a88b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis
Dec 05 12:11:42 compute-0 ovn_controller[95610]: 2025-12-05T12:11:42Z|00802|binding|INFO|Setting lport 7b183eee-c877-4387-a2f2-78923af9a88b ovn-installed in OVS
Dec 05 12:11:42 compute-0 ovn_controller[95610]: 2025-12-05T12:11:42Z|00803|binding|INFO|Setting lport 7b183eee-c877-4387-a2f2-78923af9a88b up in Southbound
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.920 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.920 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.938 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ff39dd15-9b77-4d01-bca2-e6fec4767611]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.975 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[08c9cac3-60e0-4f7b-b4de-aee904a509dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:42.978 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ebfddfe9-d37a-4e03-9e1a-3206464f4fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.998 187212 DEBUG nova.virt.libvirt.driver [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.999 187212 DEBUG nova.virt.libvirt.driver [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.999 187212 DEBUG nova.virt.libvirt.driver [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:a2:40:d1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:42 compute-0 nova_compute[187208]: 2025-12-05 12:11:42.999 187212 DEBUG nova.virt.libvirt.driver [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:58:13:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:43.007 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[24b98e9b-546d-4ce5-a74f-384ec4530616]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:43.023 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e99f40-ea0f-4b15-bd3f-18b6542dfc8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 12, 'rx_bytes': 742, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 12, 'rx_bytes': 742, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233994, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.034 187212 DEBUG nova.virt.libvirt.guest [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:43 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:43 compute-0 nova_compute[187208]:   <nova:name>tempest-tempest.common.compute-instance-914539058</nova:name>
Dec 05 12:11:43 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:11:43</nova:creationTime>
Dec 05 12:11:43 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:11:43 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:43 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:11:43 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:11:43 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:11:43 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:43 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     <nova:port uuid="88e41011-3ebc-4215-ad20-58a49d31a6d4">
Dec 05 12:11:43 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     <nova:port uuid="7b183eee-c877-4387-a2f2-78923af9a88b">
Dec 05 12:11:43 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:11:43 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:43 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:11:43 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:11:43 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:11:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:43.043 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6f9452af-34f9-4ead-ae2f-8367fe21a8ee]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398326, 'tstamp': 398326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233995, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398330, 'tstamp': 398330}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233995, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:43.044 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.046 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.047 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:43.047 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:43.047 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:43.047 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:43 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:43.047 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.059 187212 DEBUG oslo_concurrency.lockutils [None req-44510f2a-11c4-4534-9394-716f2676afdd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-7b183eee-c877-4387-a2f2-78923af9a88b" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 10.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.185 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.574 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936703.5736582, c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.575 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] VM Started (Lifecycle Event)
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.605 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.609 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936703.573859, c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.610 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] VM Paused (Lifecycle Event)
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.639 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.643 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.672 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Successfully updated port: 67a9975d-5d22-4a6a-af5f-83ab6b080d9a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.674 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.759 187212 DEBUG nova.compute.manager [req-2ebb21ea-ecc0-4044-8e0c-2d121884759a req-8232a411-c900-4def-8b28-6a3aafbc636f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Received event network-changed-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.759 187212 DEBUG nova.compute.manager [req-2ebb21ea-ecc0-4044-8e0c-2d121884759a req-8232a411-c900-4def-8b28-6a3aafbc636f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Refreshing instance network info cache due to event network-changed-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.760 187212 DEBUG oslo_concurrency.lockutils [req-2ebb21ea-ecc0-4044-8e0c-2d121884759a req-8232a411-c900-4def-8b28-6a3aafbc636f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.760 187212 DEBUG oslo_concurrency.lockutils [req-2ebb21ea-ecc0-4044-8e0c-2d121884759a req-8232a411-c900-4def-8b28-6a3aafbc636f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:43 compute-0 nova_compute[187208]: 2025-12-05 12:11:43.760 187212 DEBUG nova.network.neutron [req-2ebb21ea-ecc0-4044-8e0c-2d121884759a req-8232a411-c900-4def-8b28-6a3aafbc636f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Refreshing network info cache for port 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.153 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.154 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.394 187212 DEBUG nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.483 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.484 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.493 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.493 187212 INFO nova.compute.claims [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.773 187212 DEBUG nova.compute.provider_tree [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.914 187212 DEBUG nova.scheduler.client.report [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:44 compute-0 nova_compute[187208]: 2025-12-05 12:11:44.989 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.084 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.086 187212 DEBUG nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.090 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.090 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.091 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.091 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.091 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.091 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.093 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Successfully updated port: 04328ce4-34a7-41df-8b27-e1d5b7f3f280 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.130 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.131 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquired lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.131 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.165 187212 DEBUG nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.165 187212 DEBUG nova.network.neutron [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.186 187212 INFO nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.211 187212 DEBUG nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.330 187212 DEBUG nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.331 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.331 187212 INFO nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Creating image(s)
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.332 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.332 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.332 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.347 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:45 compute-0 ovn_controller[95610]: 2025-12-05T12:11:45Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:13:10 10.100.0.5
Dec 05 12:11:45 compute-0 ovn_controller[95610]: 2025-12-05T12:11:45Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:13:10 10.100.0.5
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.426 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.427 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.428 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.439 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.509 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.511 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.564 187212 DEBUG nova.policy [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.567 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.567 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.568 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.634 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.636 187212 DEBUG nova.virt.disk.api [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Checking if we can resize image /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.636 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.699 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.700 187212 DEBUG nova.virt.disk.api [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Cannot resize image /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.700 187212 DEBUG nova.objects.instance [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'migration_context' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.722 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.722 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Ensure instance console log exists: /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.723 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.723 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.723 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.732 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.875 187212 DEBUG nova.network.neutron [req-a640452f-9fa7-4562-a760-b65e300be332 req-bb793df2-90d7-499a-ae82-5bcb41192ba1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updated VIF entry in instance network info cache for port 7b183eee-c877-4387-a2f2-78923af9a88b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.875 187212 DEBUG nova.network.neutron [req-a640452f-9fa7-4562-a760-b65e300be332 req-bb793df2-90d7-499a-ae82-5bcb41192ba1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:45 compute-0 nova_compute[187208]: 2025-12-05 12:11:45.896 187212 DEBUG oslo_concurrency.lockutils [req-a640452f-9fa7-4562-a760-b65e300be332 req-bb793df2-90d7-499a-ae82-5bcb41192ba1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.038 187212 DEBUG nova.compute.manager [req-dd6d4c83-0ef4-47fe-be36-611187804548 req-e367da7b-f416-46bb-b8ca-dd807d5ae220 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-changed-67a9975d-5d22-4a6a-af5f-83ab6b080d9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.039 187212 DEBUG nova.compute.manager [req-dd6d4c83-0ef4-47fe-be36-611187804548 req-e367da7b-f416-46bb-b8ca-dd807d5ae220 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Refreshing instance network info cache due to event network-changed-67a9975d-5d22-4a6a-af5f-83ab6b080d9a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.039 187212 DEBUG oslo_concurrency.lockutils [req-dd6d4c83-0ef4-47fe-be36-611187804548 req-e367da7b-f416-46bb-b8ca-dd807d5ae220 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.679 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-changed-5f1f909d-4147-44de-9adf-829a12fc8bfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.679 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Refreshing instance network info cache due to event network-changed-5f1f909d-4147-44de-9adf-829a12fc8bfa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.680 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.926 187212 DEBUG nova.network.neutron [req-2ebb21ea-ecc0-4044-8e0c-2d121884759a req-8232a411-c900-4def-8b28-6a3aafbc636f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Updated VIF entry in instance network info cache for port 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.927 187212 DEBUG nova.network.neutron [req-2ebb21ea-ecc0-4044-8e0c-2d121884759a req-8232a411-c900-4def-8b28-6a3aafbc636f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Updating instance_info_cache with network_info: [{"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.946 187212 DEBUG nova.network.neutron [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Successfully created port: 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:11:46 compute-0 nova_compute[187208]: 2025-12-05 12:11:46.966 187212 DEBUG oslo_concurrency.lockutils [req-2ebb21ea-ecc0-4044-8e0c-2d121884759a req-8232a411-c900-4def-8b28-6a3aafbc636f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.090 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.114 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.115 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.115 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.115 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.239 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.308 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.310 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.374 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.381 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.445 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.447 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.516 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.524 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.725 187212 DEBUG oslo_concurrency.lockutils [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-7b183eee-c877-4387-a2f2-78923af9a88b" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.725 187212 DEBUG oslo_concurrency.lockutils [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-7b183eee-c877-4387-a2f2-78923af9a88b" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.730 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk --force-share --output=json" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.731 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.765 187212 DEBUG nova.objects.instance [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid ecc25cb4-5b3a-43f7-949d-ca9a1a19056a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.802 187212 DEBUG nova.virt.libvirt.vif [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.804 187212 DEBUG nova.network.os_vif_util [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.805 187212 DEBUG nova.network.os_vif_util [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.808 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.814 187212 DEBUG nova.virt.libvirt.guest [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.817 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.840 187212 DEBUG nova.virt.libvirt.guest [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.843 187212 DEBUG nova.virt.libvirt.driver [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Attempting to detach device tap7b183eee-c8 from instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.844 187212 DEBUG nova.virt.libvirt.guest [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:58:13:10"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <target dev="tap7b183eee-c8"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]: </interface>
Dec 05 12:11:47 compute-0 nova_compute[187208]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.850 187212 DEBUG nova.virt.libvirt.guest [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.853 187212 DEBUG nova.virt.libvirt.guest [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface>not found in domain: <domain type='kvm' id='86'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <name>instance-0000004b</name>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <uuid>ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</uuid>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <nova:name>tempest-tempest.common.compute-instance-914539058</nova:name>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:11:43</nova:creationTime>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:port uuid="88e41011-3ebc-4215-ad20-58a49d31a6d4">
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <nova:port uuid="7b183eee-c877-4387-a2f2-78923af9a88b">
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:11:47 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <memory unit='KiB'>131072</memory>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <vcpu placement='static'>1</vcpu>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <resource>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <partition>/machine</partition>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </resource>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <sysinfo type='smbios'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <entry name='manufacturer'>RDO</entry>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <entry name='serial'>ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</entry>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <entry name='uuid'>ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</entry>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <entry name='family'>Virtual Machine</entry>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <boot dev='hd'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <smbios mode='sysinfo'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <vmcoreinfo state='on'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <vendor>AMD</vendor>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='x2apic'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc-deadline'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='hypervisor'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc_adjust'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='spec-ctrl'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='stibp'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='ssbd'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='cmp_legacy'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='overflow-recov'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='succor'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='ibrs'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='amd-ssbd'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='virt-ssbd'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='lbrv'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='tsc-scale'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='vmcb-clean'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='flushbyasid'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='pause-filter'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='pfthreshold'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='xsaves'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='svm'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='require' name='topoext'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='npt'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <feature policy='disable' name='nrip-save'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <clock offset='utc'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <timer name='hpet' present='no'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <on_poweroff>destroy</on_poweroff>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <on_reboot>restart</on_reboot>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <on_crash>destroy</on_crash>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <disk type='file' device='disk'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk' index='2'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <backingStore type='file' index='3'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:         <format type='raw'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:         <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:         <backingStore/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       </backingStore>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target dev='vda' bus='virtio'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='virtio-disk0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <disk type='file' device='cdrom'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config' index='1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <backingStore/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target dev='sda' bus='sata'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <readonly/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='sata0-0-0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pcie.0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='1' port='0x10'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='2' port='0x11'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.2'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='3' port='0x12'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.3'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='4' port='0x13'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.4'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='5' port='0x14'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.5'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='6' port='0x15'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.6'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='7' port='0x16'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.7'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='8' port='0x17'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.8'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='9' port='0x18'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.9'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='10' port='0x19'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.10'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='11' port='0x1a'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.11'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='12' port='0x1b'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.12'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='13' port='0x1c'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.13'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='14' port='0x1d'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.14'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='15' port='0x1e'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.15'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='16' port='0x1f'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.16'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='17' port='0x20'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.17'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='18' port='0x21'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.18'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='19' port='0x22'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.19'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='20' port='0x23'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.20'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='21' port='0x24'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.21'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='22' port='0x25'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.22'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='23' port='0x26'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.23'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='24' port='0x27'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.24'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target chassis='25' port='0x28'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.25'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model name='pcie-pci-bridge'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='pci.26'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='usb'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <controller type='sata' index='0'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='ide'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:a2:40:d1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target dev='tap88e41011-3e'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='net0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:58:13:10'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target dev='tap7b183eee-c8'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='net1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <serial type='pty'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <source path='/dev/pts/0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/console.log' append='off'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target type='isa-serial' port='0'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:         <model name='isa-serial'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       </target>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <console type='pty' tty='/dev/pts/0'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <source path='/dev/pts/0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/console.log' append='off'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <target type='serial' port='0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </console>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <input type='tablet' bus='usb'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='input0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='usb' bus='0' port='1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <input type='mouse' bus='ps2'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='input1'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <input type='keyboard' bus='ps2'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='input2'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <listen type='address' address='::0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </graphics>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <audio id='1' type='none'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='video0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <watchdog model='itco' action='reset'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='watchdog0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </watchdog>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <memballoon model='virtio'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <stats period='10'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='balloon0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <rng model='virtio'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <backend model='random'>/dev/urandom</backend>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <alias name='rng0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <label>system_u:system_r:svirt_t:s0:c307,c851</label>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c307,c851</imagelabel>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <label>+107:+107</label>
Dec 05 12:11:47 compute-0 nova_compute[187208]:     <imagelabel>+107:+107</imagelabel>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:11:47 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:47 compute-0 nova_compute[187208]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.871 187212 INFO nova.virt.libvirt.driver [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tap7b183eee-c8 from instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a from the persistent domain config.
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.871 187212 DEBUG nova.virt.libvirt.driver [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] (1/8): Attempting to detach device tap7b183eee-c8 with device alias net1 from instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.871 187212 DEBUG nova.virt.libvirt.guest [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <mac address="fa:16:3e:58:13:10"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <model type="virtio"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <mtu size="1442"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]:   <target dev="tap7b183eee-c8"/>
Dec 05 12:11:47 compute-0 nova_compute[187208]: </interface>
Dec 05 12:11:47 compute-0 nova_compute[187208]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.873 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.884 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.885 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.955 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:47 compute-0 kernel: tap7b183eee-c8 (unregistering): left promiscuous mode
Dec 05 12:11:47 compute-0 NetworkManager[55691]: <info>  [1764936707.9726] device (tap7b183eee-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.989 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:47 compute-0 ovn_controller[95610]: 2025-12-05T12:11:47Z|00804|binding|INFO|Releasing lport 7b183eee-c877-4387-a2f2-78923af9a88b from this chassis (sb_readonly=0)
Dec 05 12:11:47 compute-0 ovn_controller[95610]: 2025-12-05T12:11:47Z|00805|binding|INFO|Setting lport 7b183eee-c877-4387-a2f2-78923af9a88b down in Southbound
Dec 05 12:11:47 compute-0 ovn_controller[95610]: 2025-12-05T12:11:47Z|00806|binding|INFO|Removing iface tap7b183eee-c8 ovn-installed in OVS
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.992 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Received event <DeviceRemovedEvent: 1764936707.9923866, ecc25cb4-5b3a-43f7-949d-ca9a1a19056a => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.993 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.996 187212 DEBUG nova.virt.libvirt.driver [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Start waiting for the detach event from libvirt for device tap7b183eee-c8 with device alias net1 for instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 05 12:11:47 compute-0 nova_compute[187208]: 2025-12-05 12:11:47.996 187212 DEBUG nova.virt.libvirt.guest [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.001 187212 DEBUG nova.virt.libvirt.guest [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:58:13:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b183eee-c8"/></interface>not found in domain: <domain type='kvm' id='86'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <name>instance-0000004b</name>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <uuid>ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</uuid>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:name>tempest-tempest.common.compute-instance-914539058</nova:name>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:11:43</nova:creationTime>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:port uuid="88e41011-3ebc-4215-ad20-58a49d31a6d4">
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:port uuid="7b183eee-c877-4387-a2f2-78923af9a88b">
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:11:48 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <memory unit='KiB'>131072</memory>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <vcpu placement='static'>1</vcpu>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <resource>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <partition>/machine</partition>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </resource>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <sysinfo type='smbios'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <entry name='manufacturer'>RDO</entry>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <entry name='serial'>ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</entry>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <entry name='uuid'>ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</entry>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <entry name='family'>Virtual Machine</entry>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <boot dev='hd'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <smbios mode='sysinfo'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <vmcoreinfo state='on'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <vendor>AMD</vendor>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='x2apic'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc-deadline'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='hypervisor'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='tsc_adjust'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='spec-ctrl'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='stibp'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='ssbd'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='cmp_legacy'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='overflow-recov'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='succor'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='ibrs'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='amd-ssbd'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='virt-ssbd'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='lbrv'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='tsc-scale'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='vmcb-clean'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='flushbyasid'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='pause-filter'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='pfthreshold'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='xsaves'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='svm'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='require' name='topoext'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='npt'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <feature policy='disable' name='nrip-save'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <clock offset='utc'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <timer name='hpet' present='no'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <on_poweroff>destroy</on_poweroff>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <on_reboot>restart</on_reboot>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <on_crash>destroy</on_crash>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <disk type='file' device='disk'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk' index='2'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <backingStore type='file' index='3'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:         <format type='raw'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:         <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:         <backingStore/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       </backingStore>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target dev='vda' bus='virtio'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='virtio-disk0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <disk type='file' device='cdrom'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <source file='/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config' index='1'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <backingStore/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target dev='sda' bus='sata'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <readonly/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='sata0-0-0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pcie.0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='1' port='0x10'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.1'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='2' port='0x11'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.2'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='3' port='0x12'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.3'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='4' port='0x13'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.4'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='5' port='0x14'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.5'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='6' port='0x15'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.6'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='7' port='0x16'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.7'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='8' port='0x17'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.8'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='9' port='0x18'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.9'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='10' port='0x19'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.10'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='11' port='0x1a'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.11'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='12' port='0x1b'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.12'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='13' port='0x1c'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.13'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='14' port='0x1d'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.14'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='15' port='0x1e'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.15'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='16' port='0x1f'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.16'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='17' port='0x20'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.17'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='18' port='0x21'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.18'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='19' port='0x22'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.19'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='20' port='0x23'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.20'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='21' port='0x24'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.21'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='22' port='0x25'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.22'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='23' port='0x26'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.23'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='24' port='0x27'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.24'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-root-port'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target chassis='25' port='0x28'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.25'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model name='pcie-pci-bridge'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='pci.26'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='usb'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <controller type='sata' index='0'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='ide'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </controller>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <interface type='ethernet'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <mac address='fa:16:3e:a2:40:d1'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target dev='tap88e41011-3e'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model type='virtio'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <mtu size='1442'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='net0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <serial type='pty'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <source path='/dev/pts/0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/console.log' append='off'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target type='isa-serial' port='0'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:         <model name='isa-serial'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       </target>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <console type='pty' tty='/dev/pts/0'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <source path='/dev/pts/0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <log file='/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/console.log' append='off'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <target type='serial' port='0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='serial0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </console>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <input type='tablet' bus='usb'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='input0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='usb' bus='0' port='1'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <input type='mouse' bus='ps2'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='input1'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <input type='keyboard' bus='ps2'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='input2'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </input>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <listen type='address' address='::0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </graphics>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <audio id='1' type='none'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='video0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <watchdog model='itco' action='reset'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='watchdog0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </watchdog>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <memballoon model='virtio'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <stats period='10'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='balloon0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <rng model='virtio'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <backend model='random'>/dev/urandom</backend>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <alias name='rng0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <label>system_u:system_r:svirt_t:s0:c307,c851</label>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c307,c851</imagelabel>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <label>+107:+107</label>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <imagelabel>+107:+107</imagelabel>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </seclabel>
Dec 05 12:11:48 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:48 compute-0 nova_compute[187208]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.001 187212 INFO nova.virt.libvirt.driver [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tap7b183eee-c8 from instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a from the live domain config.
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.002 187212 DEBUG nova.virt.libvirt.vif [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.002 187212 DEBUG nova.network.os_vif_util [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.003 187212 DEBUG nova.network.os_vif_util [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.003 187212 DEBUG os_vif [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.008 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:13:10 10.100.0.5'], port_security=['fa:16:3e:58:13:10 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1639127318', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1639127318', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7b183eee-c877-4387-a2f2-78923af9a88b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.005 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.005 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b183eee-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.008 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.010 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7b183eee-c877-4387-a2f2-78923af9a88b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.014 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.033 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f2f9f3-74ea-4b67-ac56-d133263d29d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.037 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.041 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.043 187212 INFO os_vif [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8')
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.044 187212 DEBUG nova.virt.libvirt.guest [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:name>tempest-tempest.common.compute-instance-914539058</nova:name>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:creationTime>2025-12-05 12:11:48</nova:creationTime>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:flavor name="m1.nano">
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:memory>128</nova:memory>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:disk>1</nova:disk>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:swap>0</nova:swap>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </nova:flavor>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:owner>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </nova:owner>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   <nova:ports>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     <nova:port uuid="88e41011-3ebc-4215-ad20-58a49d31a6d4">
Dec 05 12:11:48 compute-0 nova_compute[187208]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:11:48 compute-0 nova_compute[187208]:     </nova:port>
Dec 05 12:11:48 compute-0 nova_compute[187208]:   </nova:ports>
Dec 05 12:11:48 compute-0 nova_compute[187208]: </nova:instance>
Dec 05 12:11:48 compute-0 nova_compute[187208]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a236be-e7de-4c96-a38c-1d6fac6680b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.069 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[dc14508d-d2d1-4e5a-b826-a1d7463ee42f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.080 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.080 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.098 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[76c516dd-37c0-4482-9655-c508001ddec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.108 187212 DEBUG nova.network.neutron [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Successfully updated port: 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.116 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ce592f-77f0-42e8-ab1e-2baa8b0f2ba6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 14, 'rx_bytes': 742, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 14, 'rx_bytes': 742, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234059, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.130 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "refresh_cache-f2a101e0-138f-404e-b6e0-e1359272f560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.130 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquired lock "refresh_cache-f2a101e0-138f-404e-b6e0-e1359272f560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.131 187212 DEBUG nova.network.neutron [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.131 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[39dc7dec-d8b1-40ed-9d2c-4785b5bd96ae]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398326, 'tstamp': 398326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234060, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398330, 'tstamp': 398330}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234060, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.132 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.155 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.165 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.166 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.166 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.166 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:48.167 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.189 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.225 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.226 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.284 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.507 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.508 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4798MB free_disk=72.99586486816406GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.508 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.508 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.565 187212 DEBUG nova.network.neutron [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.609 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.609 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.609 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 30cb83d4-3a34-4420-bc83-099b266da48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.610 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 28e48516-8665-4d98-a92d-c84b7da9a284 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.610 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance a70dccfb-2a89-4283-aba2-934af2667db3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.610 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance ef8fcf55-e147-4baf-b506-1d99af05d330 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.610 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.611 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance f2a101e0-138f-404e-b6e0-e1359272f560 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.612 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 8 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.612 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1536MB phys_disk=79GB used_disk=8GB total_vcpus=8 used_vcpus=8 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.853 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:48 compute-0 nova_compute[187208]: 2025-12-05 12:11:48.872 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:49 compute-0 nova_compute[187208]: 2025-12-05 12:11:49.185 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:11:49 compute-0 nova_compute[187208]: 2025-12-05 12:11:49.186 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:49 compute-0 nova_compute[187208]: 2025-12-05 12:11:49.188 187212 DEBUG nova.compute.manager [req-47cc6d8f-1754-4df4-a0f3-756f7785293e req-06834155-a981-4e53-9a76-ee40f034e3bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-changed-04328ce4-34a7-41df-8b27-e1d5b7f3f280 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:49 compute-0 nova_compute[187208]: 2025-12-05 12:11:49.188 187212 DEBUG nova.compute.manager [req-47cc6d8f-1754-4df4-a0f3-756f7785293e req-06834155-a981-4e53-9a76-ee40f034e3bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Refreshing instance network info cache due to event network-changed-04328ce4-34a7-41df-8b27-e1d5b7f3f280. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:49 compute-0 nova_compute[187208]: 2025-12-05 12:11:49.188 187212 DEBUG oslo_concurrency.lockutils [req-47cc6d8f-1754-4df4-a0f3-756f7785293e req-06834155-a981-4e53-9a76-ee40f034e3bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:49 compute-0 nova_compute[187208]: 2025-12-05 12:11:49.214 187212 DEBUG nova.compute.manager [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received event network-changed-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:49 compute-0 nova_compute[187208]: 2025-12-05 12:11:49.214 187212 DEBUG nova.compute.manager [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Refreshing instance network info cache due to event network-changed-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:11:49 compute-0 nova_compute[187208]: 2025-12-05 12:11:49.215 187212 DEBUG oslo_concurrency.lockutils [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f2a101e0-138f-404e-b6e0-e1359272f560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:49 compute-0 ovn_controller[95610]: 2025-12-05T12:11:49Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f1:5a:9c 10.100.0.13
Dec 05 12:11:49 compute-0 ovn_controller[95610]: 2025-12-05T12:11:49Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f1:5a:9c 10.100.0.13
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.518 187212 DEBUG nova.network.neutron [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Updating instance_info_cache with network_info: [{"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.711 187212 DEBUG oslo_concurrency.lockutils [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.711 187212 DEBUG oslo_concurrency.lockutils [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.712 187212 DEBUG nova.network.neutron [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.714 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Releasing lock "refresh_cache-f2a101e0-138f-404e-b6e0-e1359272f560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.714 187212 DEBUG nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance network_info: |[{"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.715 187212 DEBUG oslo_concurrency.lockutils [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f2a101e0-138f-404e-b6e0-e1359272f560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.715 187212 DEBUG nova.network.neutron [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Refreshing network info cache for port 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.719 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Start _get_guest_xml network_info=[{"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.724 187212 WARNING nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.733 187212 DEBUG nova.virt.libvirt.host [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.734 187212 DEBUG nova.virt.libvirt.host [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.743 187212 DEBUG nova.virt.libvirt.host [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.744 187212 DEBUG nova.virt.libvirt.host [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.744 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.744 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.745 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.745 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.745 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.746 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.746 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.746 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.746 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.746 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.747 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.747 187212 DEBUG nova.virt.hardware [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.751 187212 DEBUG nova.virt.libvirt.vif [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-234668707',display_name='tempest-ServerRescueTestJSON-server-234668707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-234668707',id=86,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f73626a62534c97a06b6ec98d749111',ramdisk_id='',reservation_id='r-p6hppypz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-122605385',owner_user_name='tempest-ServerRescueTestJSON-122605385
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:45Z,user_data=None,user_id='d12bb49c0ca84e8dad933b49753c7b24',uuid=f2a101e0-138f-404e-b6e0-e1359272f560,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.752 187212 DEBUG nova.network.os_vif_util [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converting VIF {"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.753 187212 DEBUG nova.network.os_vif_util [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a0:97,bridge_name='br-int',has_traffic_filtering=True,id=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bab1586-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.754 187212 DEBUG nova.objects.instance [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.772 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <uuid>f2a101e0-138f-404e-b6e0-e1359272f560</uuid>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <name>instance-00000056</name>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerRescueTestJSON-server-234668707</nova:name>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:11:50</nova:creationTime>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:11:50 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:11:50 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:11:50 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:11:50 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:50 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:11:50 compute-0 nova_compute[187208]:         <nova:user uuid="d12bb49c0ca84e8dad933b49753c7b24">tempest-ServerRescueTestJSON-122605385-project-member</nova:user>
Dec 05 12:11:50 compute-0 nova_compute[187208]:         <nova:project uuid="8f73626a62534c97a06b6ec98d749111">tempest-ServerRescueTestJSON-122605385</nova:project>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:11:50 compute-0 nova_compute[187208]:         <nova:port uuid="0bab1586-b06a-4ae9-a0f9-9fbea816c5c2">
Dec 05 12:11:50 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <entry name="serial">f2a101e0-138f-404e-b6e0-e1359272f560</entry>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <entry name="uuid">f2a101e0-138f-404e-b6e0-e1359272f560</entry>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.config"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:d5:a0:97"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <target dev="tap0bab1586-b0"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/console.log" append="off"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:11:50 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:11:50 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:50 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:50 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:50 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.772 187212 DEBUG nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Preparing to wait for external event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.772 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.773 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.773 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.773 187212 DEBUG nova.virt.libvirt.vif [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-234668707',display_name='tempest-ServerRescueTestJSON-server-234668707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-234668707',id=86,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f73626a62534c97a06b6ec98d749111',ramdisk_id='',reservation_id='r-p6hppypz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-122605385',owner_user_name='tempest-ServerRescueTestJSON
-122605385-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:45Z,user_data=None,user_id='d12bb49c0ca84e8dad933b49753c7b24',uuid=f2a101e0-138f-404e-b6e0-e1359272f560,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.774 187212 DEBUG nova.network.os_vif_util [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converting VIF {"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.774 187212 DEBUG nova.network.os_vif_util [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a0:97,bridge_name='br-int',has_traffic_filtering=True,id=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bab1586-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.775 187212 DEBUG os_vif [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a0:97,bridge_name='br-int',has_traffic_filtering=True,id=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bab1586-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.775 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.775 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.776 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.779 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.779 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0bab1586-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.779 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0bab1586-b0, col_values=(('external_ids', {'iface-id': '0bab1586-b06a-4ae9-a0f9-9fbea816c5c2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:a0:97', 'vm-uuid': 'f2a101e0-138f-404e-b6e0-e1359272f560'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.781 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:50 compute-0 NetworkManager[55691]: <info>  [1764936710.7819] manager: (tap0bab1586-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.783 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.787 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.788 187212 INFO os_vif [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a0:97,bridge_name='br-int',has_traffic_filtering=True,id=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bab1586-b0')
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.873 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.873 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.873 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No VIF found with MAC fa:16:3e:d5:a0:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:50 compute-0 nova_compute[187208]: 2025-12-05 12:11:50.874 187212 INFO nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Using config drive
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.156 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.158 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.158 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:11:51 compute-0 podman[234089]: 2025-12-05 12:11:51.221957694 +0000 UTC m=+0.065743098 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.233 187212 DEBUG nova.network.neutron [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Updating instance_info_cache with network_info: [{"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.265 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Releasing lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.265 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Instance network_info: |[{"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.266 187212 DEBUG oslo_concurrency.lockutils [req-dd6d4c83-0ef4-47fe-be36-611187804548 req-e367da7b-f416-46bb-b8ca-dd807d5ae220 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.266 187212 DEBUG nova.network.neutron [req-dd6d4c83-0ef4-47fe-be36-611187804548 req-e367da7b-f416-46bb-b8ca-dd807d5ae220 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Refreshing network info cache for port 67a9975d-5d22-4a6a-af5f-83ab6b080d9a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.270 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Start _get_guest_xml network_info=[{"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.275 187212 WARNING nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.280 187212 DEBUG nova.virt.libvirt.host [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.281 187212 DEBUG nova.virt.libvirt.host [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.294 187212 DEBUG nova.virt.libvirt.host [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.295 187212 DEBUG nova.virt.libvirt.host [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.295 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.296 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.296 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.296 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.297 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.297 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.297 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.298 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.298 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.298 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.299 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.299 187212 DEBUG nova.virt.hardware [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.303 187212 DEBUG nova.virt.libvirt.vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-614377642',display_name='tempest-ServersTestMultiNic-server-614377642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-614377642',id=84,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-3hddow00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:31Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=ef8fcf55-e147-4baf-b506-1d99af05d330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.304 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.305 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:ff:d8,bridge_name='br-int',has_traffic_filtering=True,id=5f1f909d-4147-44de-9adf-829a12fc8bfa,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f1f909d-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.306 187212 DEBUG nova.virt.libvirt.vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-614377642',display_name='tempest-ServersTestMultiNic-server-614377642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-614377642',id=84,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-3hddow00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:31Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=ef8fcf55-e147-4baf-b506-1d99af05d330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.306 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.307 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:92:66,bridge_name='br-int',has_traffic_filtering=True,id=67a9975d-5d22-4a6a-af5f-83ab6b080d9a,network=Network(951a1e05-7d1e-4c9c-9143-3e651fb29978),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67a9975d-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.308 187212 DEBUG nova.virt.libvirt.vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-614377642',display_name='tempest-ServersTestMultiNic-server-614377642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-614377642',id=84,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-3hddow00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:31Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=ef8fcf55-e147-4baf-b506-1d99af05d330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.308 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.309 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e0:ee,bridge_name='br-int',has_traffic_filtering=True,id=04328ce4-34a7-41df-8b27-e1d5b7f3f280,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04328ce4-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.310 187212 DEBUG nova.objects.instance [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lazy-loading 'pci_devices' on Instance uuid ef8fcf55-e147-4baf-b506-1d99af05d330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.397 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <uuid>ef8fcf55-e147-4baf-b506-1d99af05d330</uuid>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <name>instance-00000054</name>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestMultiNic-server-614377642</nova:name>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:11:51</nova:creationTime>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:user uuid="430719002c284cd28237859ea6061eef">tempest-ServersTestMultiNic-1621990639-project-member</nova:user>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:project uuid="f7aededcaee54c4bbb7cba6007565f65">tempest-ServersTestMultiNic-1621990639</nova:project>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:port uuid="5f1f909d-4147-44de-9adf-829a12fc8bfa">
Dec 05 12:11:51 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:port uuid="67a9975d-5d22-4a6a-af5f-83ab6b080d9a">
Dec 05 12:11:51 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.1.97" ipVersion="4"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         <nova:port uuid="04328ce4-34a7-41df-8b27-e1d5b7f3f280">
Dec 05 12:11:51 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.205" ipVersion="4"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <system>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <entry name="serial">ef8fcf55-e147-4baf-b506-1d99af05d330</entry>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <entry name="uuid">ef8fcf55-e147-4baf-b506-1d99af05d330</entry>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </system>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <os>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   </os>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <features>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   </features>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk.config"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:3c:ff:d8"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <target dev="tap5f1f909d-41"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:c6:92:66"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <target dev="tap67a9975d-5d"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:1e:e0:ee"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <target dev="tap04328ce4-34"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/console.log" append="off"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <video>
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </video>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:11:51 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:11:51 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:11:51 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:11:51 compute-0 nova_compute[187208]: </domain>
Dec 05 12:11:51 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.398 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Preparing to wait for external event network-vif-plugged-5f1f909d-4147-44de-9adf-829a12fc8bfa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.398 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.398 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.399 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.399 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Preparing to wait for external event network-vif-plugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.399 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.400 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.400 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.400 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Preparing to wait for external event network-vif-plugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.400 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.401 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.401 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.402 187212 DEBUG nova.virt.libvirt.vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-614377642',display_name='tempest-ServersTestMultiNic-server-614377642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-614377642',id=84,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-3hddow00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:31Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=ef8fcf55-e147-4baf-b506-1d99af05d330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.402 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.403 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:ff:d8,bridge_name='br-int',has_traffic_filtering=True,id=5f1f909d-4147-44de-9adf-829a12fc8bfa,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f1f909d-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.403 187212 DEBUG os_vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:ff:d8,bridge_name='br-int',has_traffic_filtering=True,id=5f1f909d-4147-44de-9adf-829a12fc8bfa,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f1f909d-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.404 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.404 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.405 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.407 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.407 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f1f909d-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.408 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5f1f909d-41, col_values=(('external_ids', {'iface-id': '5f1f909d-4147-44de-9adf-829a12fc8bfa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3c:ff:d8', 'vm-uuid': 'ef8fcf55-e147-4baf-b506-1d99af05d330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.410 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 NetworkManager[55691]: <info>  [1764936711.4109] manager: (tap5f1f909d-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.412 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.418 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.419 187212 INFO os_vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:ff:d8,bridge_name='br-int',has_traffic_filtering=True,id=5f1f909d-4147-44de-9adf-829a12fc8bfa,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f1f909d-41')
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.420 187212 DEBUG nova.virt.libvirt.vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-614377642',display_name='tempest-ServersTestMultiNic-server-614377642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-614377642',id=84,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-3hddow00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:31Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=ef8fcf55-e147-4baf-b506-1d99af05d330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.420 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.421 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:92:66,bridge_name='br-int',has_traffic_filtering=True,id=67a9975d-5d22-4a6a-af5f-83ab6b080d9a,network=Network(951a1e05-7d1e-4c9c-9143-3e651fb29978),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67a9975d-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.421 187212 DEBUG os_vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:92:66,bridge_name='br-int',has_traffic_filtering=True,id=67a9975d-5d22-4a6a-af5f-83ab6b080d9a,network=Network(951a1e05-7d1e-4c9c-9143-3e651fb29978),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67a9975d-5d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.422 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.422 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.422 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.425 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.425 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap67a9975d-5d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.425 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap67a9975d-5d, col_values=(('external_ids', {'iface-id': '67a9975d-5d22-4a6a-af5f-83ab6b080d9a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:92:66', 'vm-uuid': 'ef8fcf55-e147-4baf-b506-1d99af05d330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.426 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 NetworkManager[55691]: <info>  [1764936711.4278] manager: (tap67a9975d-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.429 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.434 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.435 187212 INFO os_vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:92:66,bridge_name='br-int',has_traffic_filtering=True,id=67a9975d-5d22-4a6a-af5f-83ab6b080d9a,network=Network(951a1e05-7d1e-4c9c-9143-3e651fb29978),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67a9975d-5d')
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.436 187212 DEBUG nova.virt.libvirt.vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-614377642',display_name='tempest-ServersTestMultiNic-server-614377642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-614377642',id=84,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-3hddow00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:31Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=ef8fcf55-e147-4baf-b506-1d99af05d330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.436 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.437 187212 DEBUG nova.network.os_vif_util [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e0:ee,bridge_name='br-int',has_traffic_filtering=True,id=04328ce4-34a7-41df-8b27-e1d5b7f3f280,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04328ce4-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.437 187212 DEBUG os_vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e0:ee,bridge_name='br-int',has_traffic_filtering=True,id=04328ce4-34a7-41df-8b27-e1d5b7f3f280,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04328ce4-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.437 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.438 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.438 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.439 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.439 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04328ce4-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.440 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap04328ce4-34, col_values=(('external_ids', {'iface-id': '04328ce4-34a7-41df-8b27-e1d5b7f3f280', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:e0:ee', 'vm-uuid': 'ef8fcf55-e147-4baf-b506-1d99af05d330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.441 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 NetworkManager[55691]: <info>  [1764936711.4424] manager: (tap04328ce4-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.444 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.451 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.452 187212 INFO os_vif [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e0:ee,bridge_name='br-int',has_traffic_filtering=True,id=04328ce4-34a7-41df-8b27-e1d5b7f3f280,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04328ce4-34')
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.566 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.567 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.567 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] No VIF found with MAC fa:16:3e:3c:ff:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.567 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] No VIF found with MAC fa:16:3e:c6:92:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.567 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] No VIF found with MAC fa:16:3e:1e:e0:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.568 187212 INFO nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Using config drive
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.871 187212 INFO nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Creating config drive at /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.config
Dec 05 12:11:51 compute-0 nova_compute[187208]: 2025-12-05 12:11:51.875 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2syaaylg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.001 187212 DEBUG oslo_concurrency.processutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2syaaylg" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.0809] manager: (tap0bab1586-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/323)
Dec 05 12:11:52 compute-0 kernel: tap0bab1586-b0: entered promiscuous mode
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00807|binding|INFO|Claiming lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 for this chassis.
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00808|binding|INFO|0bab1586-b06a-4ae9-a0f9-9fbea816c5c2: Claiming fa:16:3e:d5:a0:97 10.100.0.11
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00809|binding|INFO|Setting lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 ovn-installed in OVS
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.108 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 systemd-udevd[234146]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 systemd-machined[153543]: New machine qemu-96-instance-00000056.
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.1298] device (tap0bab1586-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.1313] device (tap0bab1586-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:52 compute-0 systemd[1]: Started Virtual Machine qemu-96-instance-00000056.
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.289 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a0:97 10.100.0.11'], port_security=['fa:16:3e:d5:a0:97 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '2', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00810|binding|INFO|Setting lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 up in Southbound
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.290 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 bound to our chassis
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.291 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.292 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f39c34c5-9a29-4aab-a18e-bcb2ea0316d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.406 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936712.4063294, f2a101e0-138f-404e-b6e0-e1359272f560 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.407 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] VM Started (Lifecycle Event)
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.461 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.466 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936712.410209, f2a101e0-138f-404e-b6e0-e1359272f560 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.466 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] VM Paused (Lifecycle Event)
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.490 187212 INFO nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Creating config drive at /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk.config
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.495 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1s_8jjb2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.527 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.534 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.602 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.623 187212 DEBUG oslo_concurrency.processutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1s_8jjb2" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.629 187212 DEBUG oslo_concurrency.lockutils [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.630 187212 DEBUG oslo_concurrency.lockutils [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.630 187212 DEBUG oslo_concurrency.lockutils [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.631 187212 DEBUG oslo_concurrency.lockutils [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.631 187212 DEBUG oslo_concurrency.lockutils [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.633 187212 INFO nova.compute.manager [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Terminating instance
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.647 187212 DEBUG nova.compute.manager [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:11:52 compute-0 kernel: tap88e41011-3e (unregistering): left promiscuous mode
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.6771] device (tap88e41011-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.683 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.685 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.684 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.689 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00811|binding|INFO|Releasing lport 88e41011-3ebc-4215-ad20-58a49d31a6d4 from this chassis (sb_readonly=1)
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00812|binding|INFO|Removing iface tap88e41011-3e ovn-installed in OVS
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00813|if_status|INFO|Dropped 1 log messages in last 192 seconds (most recently, 192 seconds ago) due to excessive rate
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00814|if_status|INFO|Not setting lport 88e41011-3ebc-4215-ad20-58a49d31a6d4 down as sb is readonly
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.7091] manager: (tap5f1f909d-41): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Dec 05 12:11:52 compute-0 systemd-udevd[234149]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:52 compute-0 kernel: tap5f1f909d-41: entered promiscuous mode
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00815|binding|INFO|Setting lport 88e41011-3ebc-4215-ad20-58a49d31a6d4 down in Southbound
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.719 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00816|if_status|INFO|Not updating pb chassis for 5f1f909d-4147-44de-9adf-829a12fc8bfa now as sb is readonly
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00817|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00818|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00819|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00820|binding|INFO|Claiming lport 5f1f909d-4147-44de-9adf-829a12fc8bfa for this chassis.
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00821|binding|INFO|5f1f909d-4147-44de-9adf-829a12fc8bfa: Claiming fa:16:3e:3c:ff:d8 10.100.0.12
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00822|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:11:52 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.721 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:40:d1 10.100.0.8'], port_security=['fa:16:3e:a2:40:d1 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cbdd2780-9e2b-4e10-8d0a-98de936cf6ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.241'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=88e41011-3ebc-4215-ad20-58a49d31a6d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.722 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 88e41011-3ebc-4215-ad20-58a49d31a6d4 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:11:52 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d0000004b.scope: Consumed 14.699s CPU time.
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.724 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.7264] device (tap5f1f909d-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.7272] device (tap5f1f909d-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.7350] manager: (tap67a9975d-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/325)
Dec 05 12:11:52 compute-0 systemd-machined[153543]: Machine qemu-86-instance-0000004b terminated.
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.745 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[79e775a4-6735-40ff-86b7-8c84cce68465]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.7589] manager: (tap04328ce4-34): new Tun device (/org/freedesktop/NetworkManager/Devices/326)
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.7687] device (tap67a9975d-5d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:52 compute-0 kernel: tap67a9975d-5d: entered promiscuous mode
Dec 05 12:11:52 compute-0 kernel: tap04328ce4-34: entered promiscuous mode
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.7698] device (tap04328ce4-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.7712] device (tap67a9975d-5d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.7716] device (tap04328ce4-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.773 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.781 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.782 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.788 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd1cf2e-3a1a-4df9-9d90-75ace500d551]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.792 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[279fa322-f9d6-4494-ad1c-41df80b62bbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 systemd-machined[153543]: New machine qemu-97-instance-00000054.
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00823|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00824|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00825|binding|INFO|Claiming lport 67a9975d-5d22-4a6a-af5f-83ab6b080d9a for this chassis.
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00826|binding|INFO|67a9975d-5d22-4a6a-af5f-83ab6b080d9a: Claiming fa:16:3e:c6:92:66 10.100.1.97
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00827|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00828|binding|INFO|Claiming lport 04328ce4-34a7-41df-8b27-e1d5b7f3f280 for this chassis.
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00829|binding|INFO|04328ce4-34a7-41df-8b27-e1d5b7f3f280: Claiming fa:16:3e:1e:e0:ee 10.100.0.205
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00830|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.812 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:ff:d8 10.100.0.12'], port_security=['fa:16:3e:3c:ff:d8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/24', 'neutron:device_id': 'ef8fcf55-e147-4baf-b506-1d99af05d330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1ff0073-21f4-4b01-9595-8418e7a5cc2c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5f1f909d-4147-44de-9adf-829a12fc8bfa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:52 compute-0 systemd[1]: Started Virtual Machine qemu-97-instance-00000054.
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.829 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0b1385d3-cc9b-4190-92f8-c884dab9cdbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00831|binding|INFO|Setting lport 5f1f909d-4147-44de-9adf-829a12fc8bfa ovn-installed in OVS
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00832|binding|INFO|Setting lport 5f1f909d-4147-44de-9adf-829a12fc8bfa up in Southbound
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.835 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.850 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff7e37c-d735-4e84-bb60-b257d937d074]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 16, 'rx_bytes': 742, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 16, 'rx_bytes': 742, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234216, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.855 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:92:66 10.100.1.97'], port_security=['fa:16:3e:c6:92:66 10.100.1.97'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.97/24', 'neutron:device_id': 'ef8fcf55-e147-4baf-b506-1d99af05d330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-951a1e05-7d1e-4c9c-9143-3e651fb29978', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4593a165-aa4c-4b2e-939b-76fea57e8290, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=67a9975d-5d22-4a6a-af5f-83ab6b080d9a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.857 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:e0:ee 10.100.0.205'], port_security=['fa:16:3e:1e:e0:ee 10.100.0.205'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.205/24', 'neutron:device_id': 'ef8fcf55-e147-4baf-b506-1d99af05d330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1ff0073-21f4-4b01-9595-8418e7a5cc2c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=04328ce4-34a7-41df-8b27-e1d5b7f3f280) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.871 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dfbe2029-a3b9-41bf-8526-fbce7640f417]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398326, 'tstamp': 398326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234221, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398330, 'tstamp': 398330}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234221, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.873 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.874 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00833|binding|INFO|Setting lport 67a9975d-5d22-4a6a-af5f-83ab6b080d9a ovn-installed in OVS
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00834|binding|INFO|Setting lport 67a9975d-5d22-4a6a-af5f-83ab6b080d9a up in Southbound
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00835|binding|INFO|Setting lport 04328ce4-34a7-41df-8b27-e1d5b7f3f280 ovn-installed in OVS
Dec 05 12:11:52 compute-0 ovn_controller[95610]: 2025-12-05T12:11:52Z|00836|binding|INFO|Setting lport 04328ce4-34a7-41df-8b27-e1d5b7f3f280 up in Southbound
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.877 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.8793] manager: (tap88e41011-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/327)
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.881 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.882 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.882 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.882 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.883 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.885 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5f1f909d-4147-44de-9adf-829a12fc8bfa in datapath 3f3e0335-bf9c-44f0-9272-60c74c37b38b unbound from our chassis
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.887 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f3e0335-bf9c-44f0-9272-60c74c37b38b
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.899 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62dae70c-d104-497c-adc9-415ce1425409]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.901 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3f3e0335-b1 in ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.903 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3f3e0335-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.903 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[90b8fc4c-52bf-43fb-823b-300b8ea1969a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.904 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[63aaef16-2657-4e9c-9756-b712f3b8c959]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.922 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[60cbfe0d-bab9-4295-8c6c-ae98bb9db147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.937 187212 INFO nova.virt.libvirt.driver [-] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Instance destroyed successfully.
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.937 187212 DEBUG nova.objects.instance [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'resources' on Instance uuid ecc25cb4-5b3a-43f7-949d-ca9a1a19056a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.939 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[57316b1c-16d2-4471-b591-806b124ab71a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.965 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed5e4c0-d20a-42a9-92f6-cc08852843f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.968 187212 DEBUG nova.virt.libvirt.vif [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.969 187212 DEBUG nova.network.os_vif_util [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.970 187212 DEBUG nova.network.os_vif_util [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.970 187212 DEBUG os_vif [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:52.974 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4c96e5e7-5412-43bf-8846-c35c9bb3af62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.974 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.974 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88e41011-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.978 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.985 187212 INFO os_vif [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e')
Dec 05 12:11:52 compute-0 NetworkManager[55691]: <info>  [1764936712.9857] manager: (tap3f3e0335-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/328)
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.986 187212 DEBUG nova.virt.libvirt.vif [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.986 187212 DEBUG nova.network.os_vif_util [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "7b183eee-c877-4387-a2f2-78923af9a88b", "address": "fa:16:3e:58:13:10", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b183eee-c8", "ovs_interfaceid": "7b183eee-c877-4387-a2f2-78923af9a88b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.986 187212 DEBUG nova.network.os_vif_util [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.987 187212 DEBUG os_vif [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.988 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.988 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b183eee-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.988 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.989 187212 INFO os_vif [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:13:10,bridge_name='br-int',has_traffic_filtering=True,id=7b183eee-c877-4387-a2f2-78923af9a88b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b183eee-c8')
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.990 187212 INFO nova.virt.libvirt.driver [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Deleting instance files /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a_del
Dec 05 12:11:52 compute-0 nova_compute[187208]: 2025-12-05 12:11:52.991 187212 INFO nova.virt.libvirt.driver [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Deletion of /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a_del complete
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.016 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f50c21-01e9-4fa8-bede-10da3307ecd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.020 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ae7a90d1-94f2-4bd9-834d-35072bf303ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:53 compute-0 NetworkManager[55691]: <info>  [1764936713.0450] device (tap3f3e0335-b0): carrier: link connected
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.049 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b9e770-4555-4a2a-961b-d8fbc4109155]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.066 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a0834c74-57f4-4377-acf1-c03ce950d0c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f3e0335-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:40:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409360, 'reachable_time': 29730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234269, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.083 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7d59bc84-40f3-4e04-b10c-c4b179864bd2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe01:4085'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409360, 'tstamp': 409360}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234274, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.092 187212 INFO nova.compute.manager [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Took 0.44 seconds to destroy the instance on the hypervisor.
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.093 187212 DEBUG oslo.service.loopingcall [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.093 187212 DEBUG nova.compute.manager [-] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.093 187212 DEBUG nova.network.neutron [-] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.101 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aaff5fa0-984f-4aa9-b228-d1c3ed658018]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f3e0335-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:40:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409360, 'reachable_time': 29730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234277, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.134 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eabd0253-3dc5-4b6d-b5e0-bcba886126e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.171 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936713.1704233, ef8fcf55-e147-4baf-b506-1d99af05d330 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.172 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] VM Started (Lifecycle Event)
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.189 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.196 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7ad2496d-ab34-432f-ae4a-023a21f3c6e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.197 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f3e0335-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.197 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.197 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f3e0335-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:53 compute-0 NetworkManager[55691]: <info>  [1764936713.2006] manager: (tap3f3e0335-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.200 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:53 compute-0 kernel: tap3f3e0335-b0: entered promiscuous mode
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.205 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f3e0335-b0, col_values=(('external_ids', {'iface-id': '5043c1f9-031c-414d-b5e2-35f6436be4ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:53 compute-0 ovn_controller[95610]: 2025-12-05T12:11:53Z|00837|binding|INFO|Releasing lport 5043c1f9-031c-414d-b5e2-35f6436be4ab from this chassis (sb_readonly=0)
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.206 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.215 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.219 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936713.170682, ef8fcf55-e147-4baf-b506-1d99af05d330 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.219 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] VM Paused (Lifecycle Event)
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.220 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3f3e0335-bf9c-44f0-9272-60c74c37b38b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3f3e0335-bf9c-44f0-9272-60c74c37b38b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.221 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.222 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[160c9a1c-8762-4aae-bc9b-15c0324e9280]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.222 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-3f3e0335-bf9c-44f0-9272-60c74c37b38b
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/3f3e0335-bf9c-44f0-9272-60c74c37b38b.pid.haproxy
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 3f3e0335-bf9c-44f0-9272-60c74c37b38b
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:11:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:53.223 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'env', 'PROCESS_TAG=haproxy-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3f3e0335-bf9c-44f0-9272-60c74c37b38b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.253 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.256 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.275 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:53 compute-0 podman[234311]: 2025-12-05 12:11:53.559339669 +0000 UTC m=+0.021451086 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:11:53 compute-0 podman[234311]: 2025-12-05 12:11:53.740629532 +0000 UTC m=+0.202740919 container create b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:11:53 compute-0 systemd[1]: Started libpod-conmon-b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3.scope.
Dec 05 12:11:53 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:11:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f721315c1793784aa6632ad222f35fca4cfc77d10f44c6dbf10bf3342a133f25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.942 187212 INFO nova.network.neutron [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Port 7b183eee-c877-4387-a2f2-78923af9a88b from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.944 187212 DEBUG nova.network.neutron [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:53 compute-0 nova_compute[187208]: 2025-12-05 12:11:53.977 187212 DEBUG oslo_concurrency.lockutils [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:54 compute-0 podman[234311]: 2025-12-05 12:11:54.006532413 +0000 UTC m=+0.468643830 container init b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 12:11:54 compute-0 nova_compute[187208]: 2025-12-05 12:11:54.011 187212 DEBUG oslo_concurrency.lockutils [None req-0565735c-92a2-4530-91eb-ccacfcab7c4d 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-7b183eee-c877-4387-a2f2-78923af9a88b" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 6.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:54 compute-0 podman[234311]: 2025-12-05 12:11:54.012765522 +0000 UTC m=+0.474876909 container start b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:11:54 compute-0 neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b[234326]: [NOTICE]   (234330) : New worker (234332) forked
Dec 05 12:11:54 compute-0 neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b[234326]: [NOTICE]   (234330) : Loading success.
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.072 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 67a9975d-5d22-4a6a-af5f-83ab6b080d9a in datapath 951a1e05-7d1e-4c9c-9143-3e651fb29978 unbound from our chassis
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.075 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 951a1e05-7d1e-4c9c-9143-3e651fb29978
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.083 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7d62f123-36c7-4b63-8343-761286ba7d46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.084 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap951a1e05-71 in ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.086 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap951a1e05-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.086 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[33d588cf-7591-4724-a03b-d8a42c80306b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.086 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[29441883-2f5b-4325-a29c-941d818e80ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.097 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[34265e2c-0af9-4783-b5a4-66e8b7b451ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.111 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e63393d4-1aa8-4103-a6c1-5239dfe5b21e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.135 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c7fc93e9-19d9-4682-b062-754ef307a648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 systemd-udevd[234253]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.142 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ea92b704-8a4c-4f69-a802-b4a766c0203f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 NetworkManager[55691]: <info>  [1764936714.1437] manager: (tap951a1e05-70): new Veth device (/org/freedesktop/NetworkManager/Devices/330)
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.172 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b99e49da-16f1-4f4f-894e-af389cebd2a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.175 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3f80c9-79ff-4a90-8947-8537605f68ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 NetworkManager[55691]: <info>  [1764936714.1979] device (tap951a1e05-70): carrier: link connected
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.203 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9a3022d2-1b24-4180-943c-17073e54ea49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.227 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[831d6e4a-40b1-48d9-bb17-0e13b57e9772]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap951a1e05-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:9f:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409476, 'reachable_time': 41046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234351, 'error': None, 'target': 'ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.248 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aeef782a-3217-4a3b-b56d-ebee74ac8ecb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe85:9f5e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409476, 'tstamp': 409476}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234352, 'error': None, 'target': 'ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.270 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a83ce5cf-cfcd-484f-b07c-7f420f8c4cc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap951a1e05-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:9f:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409476, 'reachable_time': 41046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234353, 'error': None, 'target': 'ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.302 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b253a1b0-c321-4170-b7d6-6530ad05d057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.378 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[031b5e82-311f-44cb-8220-28e925ad0e5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.380 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap951a1e05-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.380 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.381 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap951a1e05-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:54 compute-0 nova_compute[187208]: 2025-12-05 12:11:54.383 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:54 compute-0 NetworkManager[55691]: <info>  [1764936714.3839] manager: (tap951a1e05-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Dec 05 12:11:54 compute-0 kernel: tap951a1e05-70: entered promiscuous mode
Dec 05 12:11:54 compute-0 nova_compute[187208]: 2025-12-05 12:11:54.385 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.386 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap951a1e05-70, col_values=(('external_ids', {'iface-id': 'd0c3b543-be48-4c24-8c16-9ccdd92a378b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:54 compute-0 nova_compute[187208]: 2025-12-05 12:11:54.387 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:54 compute-0 ovn_controller[95610]: 2025-12-05T12:11:54Z|00838|binding|INFO|Releasing lport d0c3b543-be48-4c24-8c16-9ccdd92a378b from this chassis (sb_readonly=0)
Dec 05 12:11:54 compute-0 nova_compute[187208]: 2025-12-05 12:11:54.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.403 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/951a1e05-7d1e-4c9c-9143-3e651fb29978.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/951a1e05-7d1e-4c9c-9143-3e651fb29978.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.404 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3a3cda0f-903f-4a13-a382-18d093300a32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.405 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-951a1e05-7d1e-4c9c-9143-3e651fb29978
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/951a1e05-7d1e-4c9c-9143-3e651fb29978.pid.haproxy
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 951a1e05-7d1e-4c9c-9143-3e651fb29978
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:11:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:54.406 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978', 'env', 'PROCESS_TAG=haproxy-951a1e05-7d1e-4c9c-9143-3e651fb29978', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/951a1e05-7d1e-4c9c-9143-3e651fb29978.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:11:54 compute-0 nova_compute[187208]: 2025-12-05 12:11:54.726 187212 DEBUG nova.network.neutron [req-dd6d4c83-0ef4-47fe-be36-611187804548 req-e367da7b-f416-46bb-b8ca-dd807d5ae220 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Updated VIF entry in instance network info cache for port 67a9975d-5d22-4a6a-af5f-83ab6b080d9a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:54 compute-0 nova_compute[187208]: 2025-12-05 12:11:54.726 187212 DEBUG nova.network.neutron [req-dd6d4c83-0ef4-47fe-be36-611187804548 req-e367da7b-f416-46bb-b8ca-dd807d5ae220 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Updating instance_info_cache with network_info: [{"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:54 compute-0 nova_compute[187208]: 2025-12-05 12:11:54.729 187212 DEBUG nova.network.neutron [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Updated VIF entry in instance network info cache for port 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:54 compute-0 nova_compute[187208]: 2025-12-05 12:11:54.729 187212 DEBUG nova.network.neutron [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Updating instance_info_cache with network_info: [{"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:54 compute-0 podman[234386]: 2025-12-05 12:11:54.838737025 +0000 UTC m=+0.103842231 container create 5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:11:54 compute-0 podman[234386]: 2025-12-05 12:11:54.769774376 +0000 UTC m=+0.034879612 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:11:54 compute-0 systemd[1]: Started libpod-conmon-5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e.scope.
Dec 05 12:11:54 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:11:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7965c7475ebcb3c9788b77fdf12f4ae89fbcfe344b768ac69adf31da8b83f1a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:11:54 compute-0 podman[234386]: 2025-12-05 12:11:54.930778217 +0000 UTC m=+0.195883443 container init 5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:11:54 compute-0 podman[234386]: 2025-12-05 12:11:54.937409977 +0000 UTC m=+0.202515183 container start 5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:11:54 compute-0 neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978[234402]: [NOTICE]   (234406) : New worker (234408) forked
Dec 05 12:11:54 compute-0 neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978[234402]: [NOTICE]   (234406) : Loading success.
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.026 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 04328ce4-34a7-41df-8b27-e1d5b7f3f280 in datapath 3f3e0335-bf9c-44f0-9272-60c74c37b38b unbound from our chassis
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.031 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f3e0335-bf9c-44f0-9272-60c74c37b38b
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.048 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3e319974-b438-4fe7-b50c-4faa14e07356]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.072 187212 DEBUG oslo_concurrency.lockutils [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f2a101e0-138f-404e-b6e0-e1359272f560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.072 187212 DEBUG nova.compute.manager [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-unplugged-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.073 187212 DEBUG oslo_concurrency.lockutils [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.073 187212 DEBUG oslo_concurrency.lockutils [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.073 187212 DEBUG oslo_concurrency.lockutils [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.073 187212 DEBUG nova.compute.manager [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] No waiting events found dispatching network-vif-unplugged-7b183eee-c877-4387-a2f2-78923af9a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.073 187212 WARNING nova.compute.manager [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received unexpected event network-vif-unplugged-7b183eee-c877-4387-a2f2-78923af9a88b for instance with vm_state active and task_state None.
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.074 187212 DEBUG nova.compute.manager [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.074 187212 DEBUG oslo_concurrency.lockutils [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.074 187212 DEBUG oslo_concurrency.lockutils [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.074 187212 DEBUG oslo_concurrency.lockutils [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.074 187212 DEBUG nova.compute.manager [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] No waiting events found dispatching network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.074 187212 WARNING nova.compute.manager [req-0aad869d-46bc-4d65-941e-c15671b60a46 req-299f6a6b-dcb8-4374-864d-51e8590c2d56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received unexpected event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b for instance with vm_state active and task_state None.
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.075 187212 DEBUG oslo_concurrency.lockutils [req-dd6d4c83-0ef4-47fe-be36-611187804548 req-e367da7b-f416-46bb-b8ca-dd807d5ae220 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.076 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.076 187212 DEBUG nova.network.neutron [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Refreshing network info cache for port 5f1f909d-4147-44de-9adf-829a12fc8bfa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.081 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[438a716b-6241-4378-af45-4ffd86022863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.085 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fc611916-2a7c-4549-8022-c0f3e4fa3a4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.121 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b573d749-5885-459f-8494-dfc00c667552]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.143 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[95bd9ed1-4823-4eba-9582-6bec2fb2ee84]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f3e0335-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:40:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409360, 'reachable_time': 29730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234422, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.164 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[65fc732f-624a-44bc-9bcd-fb212969910f]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap3f3e0335-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409372, 'tstamp': 409372}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234423, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3f3e0335-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409375, 'tstamp': 409375}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234423, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.167 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f3e0335-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.170 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f3e0335-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.170 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.171 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f3e0335-b0, col_values=(('external_ids', {'iface-id': '5043c1f9-031c-414d-b5e2-35f6436be4ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:55.171 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.233 187212 DEBUG nova.compute.manager [req-1e5ffc4a-22c6-4375-b2bb-bf11b3f16625 req-4b12dba8-4ddc-40ea-8e06-e9539599a6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-unplugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.233 187212 DEBUG oslo_concurrency.lockutils [req-1e5ffc4a-22c6-4375-b2bb-bf11b3f16625 req-4b12dba8-4ddc-40ea-8e06-e9539599a6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.234 187212 DEBUG oslo_concurrency.lockutils [req-1e5ffc4a-22c6-4375-b2bb-bf11b3f16625 req-4b12dba8-4ddc-40ea-8e06-e9539599a6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.234 187212 DEBUG oslo_concurrency.lockutils [req-1e5ffc4a-22c6-4375-b2bb-bf11b3f16625 req-4b12dba8-4ddc-40ea-8e06-e9539599a6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.234 187212 DEBUG nova.compute.manager [req-1e5ffc4a-22c6-4375-b2bb-bf11b3f16625 req-4b12dba8-4ddc-40ea-8e06-e9539599a6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] No waiting events found dispatching network-vif-unplugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.234 187212 DEBUG nova.compute.manager [req-1e5ffc4a-22c6-4375-b2bb-bf11b3f16625 req-4b12dba8-4ddc-40ea-8e06-e9539599a6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-unplugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.327 187212 DEBUG nova.compute.manager [req-64427373-b0fa-4597-aa18-6aec7ac64bed req-e7633247-cc74-461b-ba4e-ea7426c9a0aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.328 187212 DEBUG oslo_concurrency.lockutils [req-64427373-b0fa-4597-aa18-6aec7ac64bed req-e7633247-cc74-461b-ba4e-ea7426c9a0aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.328 187212 DEBUG oslo_concurrency.lockutils [req-64427373-b0fa-4597-aa18-6aec7ac64bed req-e7633247-cc74-461b-ba4e-ea7426c9a0aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.328 187212 DEBUG oslo_concurrency.lockutils [req-64427373-b0fa-4597-aa18-6aec7ac64bed req-e7633247-cc74-461b-ba4e-ea7426c9a0aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.328 187212 DEBUG nova.compute.manager [req-64427373-b0fa-4597-aa18-6aec7ac64bed req-e7633247-cc74-461b-ba4e-ea7426c9a0aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Processing event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.329 187212 DEBUG nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.335 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936715.3353214, f2a101e0-138f-404e-b6e0-e1359272f560 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.336 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] VM Resumed (Lifecycle Event)
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.338 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.343 187212 INFO nova.virt.libvirt.driver [-] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance spawned successfully.
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.344 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.373 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.380 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.383 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.384 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.384 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.385 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.385 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.386 187212 DEBUG nova.virt.libvirt.driver [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.421 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.457 187212 INFO nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Took 10.13 seconds to spawn the instance on the hypervisor.
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.457 187212 DEBUG nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.545 187212 INFO nova.compute.manager [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Took 11.10 seconds to build instance.
Dec 05 12:11:55 compute-0 nova_compute[187208]: 2025-12-05 12:11:55.584 187212 DEBUG oslo_concurrency.lockutils [None req-2144a341-f4cf-4501-909e-579854badee4 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:56 compute-0 nova_compute[187208]: 2025-12-05 12:11:56.912 187212 DEBUG nova.network.neutron [-] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:56 compute-0 nova_compute[187208]: 2025-12-05 12:11:56.934 187212 INFO nova.compute.manager [-] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Took 3.84 seconds to deallocate network for instance.
Dec 05 12:11:56 compute-0 nova_compute[187208]: 2025-12-05 12:11:56.986 187212 DEBUG oslo_concurrency.lockutils [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:56 compute-0 nova_compute[187208]: 2025-12-05 12:11:56.986 187212 DEBUG oslo_concurrency.lockutils [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.155 187212 DEBUG nova.network.neutron [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Updated VIF entry in instance network info cache for port 5f1f909d-4147-44de-9adf-829a12fc8bfa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.156 187212 DEBUG nova.network.neutron [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Updating instance_info_cache with network_info: [{"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.171 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.171 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Received event network-vif-plugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.172 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.172 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.172 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.172 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Processing event network-vif-plugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.173 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Received event network-vif-plugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.173 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.173 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.174 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.174 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] No waiting events found dispatching network-vif-plugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.174 187212 WARNING nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Received unexpected event network-vif-plugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f for instance with vm_state building and task_state spawning.
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.175 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.175 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.175 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.175 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.176 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] No waiting events found dispatching network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.176 187212 WARNING nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received unexpected event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b for instance with vm_state active and task_state None.
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.176 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.176 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.176 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.177 187212 DEBUG oslo_concurrency.lockutils [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.177 187212 DEBUG nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] No waiting events found dispatching network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.177 187212 WARNING nova.compute.manager [req-386a13dd-fca1-4d56-ad8f-070025a8c7ca req-d9a1956c-f0aa-4d66-8b31-e15cbf3ac4b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received unexpected event network-vif-plugged-7b183eee-c877-4387-a2f2-78923af9a88b for instance with vm_state active and task_state None.
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.177 187212 DEBUG oslo_concurrency.lockutils [req-47cc6d8f-1754-4df4-a0f3-756f7785293e req-06834155-a981-4e53-9a76-ee40f034e3bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.178 187212 DEBUG nova.network.neutron [req-47cc6d8f-1754-4df4-a0f3-756f7785293e req-06834155-a981-4e53-9a76-ee40f034e3bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Refreshing network info cache for port 04328ce4-34a7-41df-8b27-e1d5b7f3f280 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.179 187212 DEBUG nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Instance event wait completed in 13 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.182 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936717.1821947, c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.182 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] VM Resumed (Lifecycle Event)
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.185 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.190 187212 DEBUG nova.compute.provider_tree [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.192 187212 INFO nova.virt.libvirt.driver [-] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Instance spawned successfully.
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.193 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.225 187212 DEBUG nova.scheduler.client.report [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.232 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.236 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.237 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.237 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.238 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.238 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.238 187212 DEBUG nova.virt.libvirt.driver [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.242 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.272 187212 DEBUG oslo_concurrency.lockutils [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.275 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.305 187212 INFO nova.scheduler.client.report [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Deleted allocations for instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.319 187212 INFO nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Took 22.42 seconds to spawn the instance on the hypervisor.
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.319 187212 DEBUG nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.417 187212 DEBUG oslo_concurrency.lockutils [None req-0c8fb1b8-982f-486d-989b-0afb85d1f396 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.427 187212 INFO nova.compute.manager [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Took 24.51 seconds to build instance.
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.448 187212 DEBUG oslo_concurrency.lockutils [None req-25ea8f6b-c1bd-4ce7-886e-c459f67c945f 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.931 187212 INFO nova.compute.manager [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Rescuing
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.932 187212 DEBUG oslo_concurrency.lockutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "refresh_cache-f2a101e0-138f-404e-b6e0-e1359272f560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.932 187212 DEBUG oslo_concurrency.lockutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquired lock "refresh_cache-f2a101e0-138f-404e-b6e0-e1359272f560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.932 187212 DEBUG nova.network.neutron [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:11:57 compute-0 nova_compute[187208]: 2025-12-05 12:11:57.978 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.004 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.004 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.005 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.005 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.005 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] No waiting events found dispatching network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.005 187212 WARNING nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received unexpected event network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 for instance with vm_state deleted and task_state None.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.006 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-plugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.006 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.006 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.006 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.006 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Processing event network-vif-plugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.007 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-plugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.007 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.007 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.007 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.008 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] No event matching network-vif-plugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 in dict_keys([('network-vif-plugged', '5f1f909d-4147-44de-9adf-829a12fc8bfa'), ('network-vif-plugged', '67a9975d-5d22-4a6a-af5f-83ab6b080d9a')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.008 187212 WARNING nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received unexpected event network-vif-plugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 for instance with vm_state building and task_state spawning.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.008 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-plugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.008 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.009 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.009 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.009 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Processing event network-vif-plugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.010 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-plugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.010 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.010 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.010 187212 DEBUG oslo_concurrency.lockutils [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.010 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] No event matching network-vif-plugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a in dict_keys([('network-vif-plugged', '5f1f909d-4147-44de-9adf-829a12fc8bfa')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.011 187212 WARNING nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received unexpected event network-vif-plugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a for instance with vm_state building and task_state spawning.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.011 187212 DEBUG nova.compute.manager [req-fa5cc9d4-5df6-4303-939f-8db4a9fc94c7 req-b5f366e4-86c8-4ae2-aa41-1ae8deb71800 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-vif-deleted-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.141 187212 DEBUG nova.compute.manager [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.142 187212 DEBUG oslo_concurrency.lockutils [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.142 187212 DEBUG oslo_concurrency.lockutils [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.142 187212 DEBUG oslo_concurrency.lockutils [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.142 187212 DEBUG nova.compute.manager [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] No waiting events found dispatching network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.142 187212 WARNING nova.compute.manager [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received unexpected event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 for instance with vm_state active and task_state rescuing.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.143 187212 DEBUG nova.compute.manager [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-plugged-5f1f909d-4147-44de-9adf-829a12fc8bfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.143 187212 DEBUG oslo_concurrency.lockutils [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.143 187212 DEBUG oslo_concurrency.lockutils [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.143 187212 DEBUG oslo_concurrency.lockutils [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.143 187212 DEBUG nova.compute.manager [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Processing event network-vif-plugged-5f1f909d-4147-44de-9adf-829a12fc8bfa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.143 187212 DEBUG nova.compute.manager [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-plugged-5f1f909d-4147-44de-9adf-829a12fc8bfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.143 187212 DEBUG oslo_concurrency.lockutils [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.144 187212 DEBUG oslo_concurrency.lockutils [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.144 187212 DEBUG oslo_concurrency.lockutils [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.144 187212 DEBUG nova.compute.manager [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] No waiting events found dispatching network-vif-plugged-5f1f909d-4147-44de-9adf-829a12fc8bfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.144 187212 WARNING nova.compute.manager [req-4f6b56cd-9ecd-4a65-bb25-61d7a822322b req-c5c8125d-7156-4073-8d6a-43a6be64a536 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received unexpected event network-vif-plugged-5f1f909d-4147-44de-9adf-829a12fc8bfa for instance with vm_state building and task_state spawning.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.145 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Instance event wait completed in 4 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.150 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.155 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936718.1555722, ef8fcf55-e147-4baf-b506-1d99af05d330 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.156 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] VM Resumed (Lifecycle Event)
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.159 187212 INFO nova.virt.libvirt.driver [-] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Instance spawned successfully.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.159 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.199 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.200 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.200 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.201 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.201 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.202 187212 DEBUG nova.virt.libvirt.driver [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.205 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.209 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.229 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.246 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:11:58 compute-0 podman[234425]: 2025-12-05 12:11:58.260743107 +0000 UTC m=+0.108233857 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=edpm, managed_by=edpm_ansible)
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.283 187212 INFO nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Took 26.70 seconds to spawn the instance on the hypervisor.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.283 187212 DEBUG nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.358 187212 INFO nova.compute.manager [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Took 28.29 seconds to build instance.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.379 187212 DEBUG oslo_concurrency.lockutils [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "a70dccfb-2a89-4283-aba2-934af2667db3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.380 187212 DEBUG oslo_concurrency.lockutils [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.380 187212 DEBUG oslo_concurrency.lockutils [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.381 187212 DEBUG oslo_concurrency.lockutils [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.381 187212 DEBUG oslo_concurrency.lockutils [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.382 187212 INFO nova.compute.manager [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Terminating instance
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.383 187212 DEBUG nova.compute.manager [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.384 187212 DEBUG oslo_concurrency.lockutils [None req-d8fd258f-dc8a-43f6-b984-a6c272ef88ee 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 28.894s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 kernel: tap4870c9e1-85 (unregistering): left promiscuous mode
Dec 05 12:11:58 compute-0 NetworkManager[55691]: <info>  [1764936718.4227] device (tap4870c9e1-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.440 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 ovn_controller[95610]: 2025-12-05T12:11:58Z|00839|binding|INFO|Releasing lport 4870c9e1-8549-42a1-a77d-f9824cc38f59 from this chassis (sb_readonly=0)
Dec 05 12:11:58 compute-0 ovn_controller[95610]: 2025-12-05T12:11:58Z|00840|binding|INFO|Setting lport 4870c9e1-8549-42a1-a77d-f9824cc38f59 down in Southbound
Dec 05 12:11:58 compute-0 ovn_controller[95610]: 2025-12-05T12:11:58Z|00841|binding|INFO|Removing iface tap4870c9e1-85 ovn-installed in OVS
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.447 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.464 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d00000053.scope: Deactivated successfully.
Dec 05 12:11:58 compute-0 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d00000053.scope: Consumed 13.039s CPU time.
Dec 05 12:11:58 compute-0 systemd-machined[153543]: Machine qemu-94-instance-00000053 terminated.
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.494 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:5a:9c 10.100.0.13'], port_security=['fa:16:3e:f1:5a:9c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a70dccfb-2a89-4283-aba2-934af2667db3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4870c9e1-8549-42a1-a77d-f9824cc38f59) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.495 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4870c9e1-8549-42a1-a77d-f9824cc38f59 in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c unbound from our chassis
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.498 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.500 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a456af37-c96b-4bea-9caa-b127b8e5e74d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.501 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace which is not needed anymore
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.610 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.622 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.653 187212 INFO nova.virt.libvirt.driver [-] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Instance destroyed successfully.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.653 187212 DEBUG nova.objects.instance [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'resources' on Instance uuid a70dccfb-2a89-4283-aba2-934af2667db3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.670 187212 DEBUG nova.virt.libvirt.vif [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1890664302',display_name='tempest-ServerDiskConfigTestJSON-server-1890664302',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1890664302',id=83,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:11:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ijylq08c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:11:53Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=a70dccfb-2a89-4283-aba2-934af2667db3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.670 187212 DEBUG nova.network.os_vif_util [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "address": "fa:16:3e:f1:5a:9c", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4870c9e1-85", "ovs_interfaceid": "4870c9e1-8549-42a1-a77d-f9824cc38f59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.671 187212 DEBUG nova.network.os_vif_util [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:5a:9c,bridge_name='br-int',has_traffic_filtering=True,id=4870c9e1-8549-42a1-a77d-f9824cc38f59,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4870c9e1-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.671 187212 DEBUG os_vif [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:5a:9c,bridge_name='br-int',has_traffic_filtering=True,id=4870c9e1-8549-42a1-a77d-f9824cc38f59,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4870c9e1-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.674 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233805]: [NOTICE]   (233812) : haproxy version is 2.8.14-c23fe91
Dec 05 12:11:58 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233805]: [NOTICE]   (233812) : path to executable is /usr/sbin/haproxy
Dec 05 12:11:58 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233805]: [WARNING]  (233812) : Exiting Master process...
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.674 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4870c9e1-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:58 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233805]: [ALERT]    (233812) : Current worker (233816) exited with code 143 (Terminated)
Dec 05 12:11:58 compute-0 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[233805]: [WARNING]  (233812) : All workers exited. Exiting... (0)
Dec 05 12:11:58 compute-0 systemd[1]: libpod-995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c.scope: Deactivated successfully.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.680 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.683 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.686 187212 INFO os_vif [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:5a:9c,bridge_name='br-int',has_traffic_filtering=True,id=4870c9e1-8549-42a1-a77d-f9824cc38f59,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4870c9e1-85')
Dec 05 12:11:58 compute-0 podman[234473]: 2025-12-05 12:11:58.686807314 +0000 UTC m=+0.059302643 container died 995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.686 187212 INFO nova.virt.libvirt.driver [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Deleting instance files /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3_del
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.687 187212 INFO nova.virt.libvirt.driver [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Deletion of /var/lib/nova/instances/a70dccfb-2a89-4283-aba2-934af2667db3_del complete
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.704 187212 DEBUG oslo_concurrency.lockutils [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.705 187212 DEBUG oslo_concurrency.lockutils [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.705 187212 DEBUG oslo_concurrency.lockutils [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.705 187212 DEBUG oslo_concurrency.lockutils [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.706 187212 DEBUG oslo_concurrency.lockutils [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.709 187212 INFO nova.compute.manager [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Terminating instance
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.710 187212 DEBUG nova.compute.manager [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:11:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c-userdata-shm.mount: Deactivated successfully.
Dec 05 12:11:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-eea16742dc7dafaa5415894d6e2f2bbe9e168c04b0874ff39379e45f2a2874c6-merged.mount: Deactivated successfully.
Dec 05 12:11:58 compute-0 kernel: tapef99bad5-d0 (unregistering): left promiscuous mode
Dec 05 12:11:58 compute-0 NetworkManager[55691]: <info>  [1764936718.7616] device (tapef99bad5-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:11:58 compute-0 podman[234473]: 2025-12-05 12:11:58.768434387 +0000 UTC m=+0.140929716 container cleanup 995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 12:11:58 compute-0 ovn_controller[95610]: 2025-12-05T12:11:58Z|00842|binding|INFO|Releasing lport ef99bad5-d092-46f6-9b3a-8225cc233d1e from this chassis (sb_readonly=0)
Dec 05 12:11:58 compute-0 ovn_controller[95610]: 2025-12-05T12:11:58Z|00843|binding|INFO|Setting lport ef99bad5-d092-46f6-9b3a-8225cc233d1e down in Southbound
Dec 05 12:11:58 compute-0 ovn_controller[95610]: 2025-12-05T12:11:58Z|00844|binding|INFO|Removing iface tapef99bad5-d0 ovn-installed in OVS
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.777 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.778 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:e5:94 10.100.0.6'], port_security=['fa:16:3e:bd:e5:94 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cbdd2780-9e2b-4e10-8d0a-98de936cf6ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ef99bad5-d092-46f6-9b3a-8225cc233d1e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:11:58 compute-0 systemd[1]: libpod-conmon-995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c.scope: Deactivated successfully.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.786 187212 INFO nova.compute.manager [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.787 187212 DEBUG oslo.service.loopingcall [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.788 187212 DEBUG nova.compute.manager [-] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.788 187212 DEBUG nova.network.neutron [-] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Dec 05 12:11:58 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d0000004a.scope: Consumed 16.439s CPU time.
Dec 05 12:11:58 compute-0 systemd-machined[153543]: Machine qemu-83-instance-0000004a terminated.
Dec 05 12:11:58 compute-0 podman[234514]: 2025-12-05 12:11:58.874728736 +0000 UTC m=+0.066423907 container remove 995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.882 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ecccf49d-4535-41a1-bd8d-b34fccf186b7]: (4, ('Fri Dec  5 12:11:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c)\n995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c\nFri Dec  5 12:11:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c)\n995d7a65b417448b5a7ebfc2bdc53dca383c4e1194d5840cbd3ae12c3d3ae58c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.884 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1660178f-cec6-4b05-a1c6-f9a520d89e53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.886 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:58 compute-0 kernel: tap7be4540a-00: left promiscuous mode
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.888 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.906 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bff48458-0dab-426e-90cb-ad3f985ace9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.932 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dc0e7832-13a4-44d8-b8d4-350d595103ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.935 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f2454a35-69fb-4f77-b1d0-da68b1956ec3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.961 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[393a6d33-0b5a-418b-b478-624dab1f94b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 407161, 'reachable_time': 26581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234539, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.966 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.966 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[39629804-88e0-4693-bf86-f4b2a3e5c9ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.967 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ef99bad5-d092-46f6-9b3a-8225cc233d1e in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis
Dec 05 12:11:58 compute-0 systemd[1]: run-netns-ovnmeta\x2d7be4540a\x2d0e0d\x2d45dd\x2d9ed3\x2d2c2701ae3e2c.mount: Deactivated successfully.
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.970 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.972 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d81600ba-5a51-475d-af1e-ab6215418c84]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:58.974 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace which is not needed anymore
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.988 187212 INFO nova.virt.libvirt.driver [-] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Instance destroyed successfully.
Dec 05 12:11:58 compute-0 nova_compute[187208]: 2025-12-05 12:11:58.988 187212 DEBUG nova.objects.instance [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'resources' on Instance uuid 54d9605a-998b-4492-afc8-f7a5b0dd4e84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.082 187212 DEBUG nova.virt.libvirt.vif [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-569275018',display_name='tempest-tempest.common.compute-instance-569275018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-569275018',id=74,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-h1nkz7rb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=54d9605a-998b-4492-afc8-f7a5b0dd4e84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.083 187212 DEBUG nova.network.os_vif_util [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.083 187212 DEBUG nova.network.os_vif_util [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.084 187212 DEBUG os_vif [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.085 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.085 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef99bad5-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.088 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.092 187212 INFO os_vif [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0')
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.093 187212 INFO nova.virt.libvirt.driver [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Deleting instance files /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84_del
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.093 187212 INFO nova.virt.libvirt.driver [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Deletion of /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84_del complete
Dec 05 12:11:59 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[231559]: [NOTICE]   (231563) : haproxy version is 2.8.14-c23fe91
Dec 05 12:11:59 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[231559]: [NOTICE]   (231563) : path to executable is /usr/sbin/haproxy
Dec 05 12:11:59 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[231559]: [WARNING]  (231563) : Exiting Master process...
Dec 05 12:11:59 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[231559]: [ALERT]    (231563) : Current worker (231565) exited with code 143 (Terminated)
Dec 05 12:11:59 compute-0 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[231559]: [WARNING]  (231563) : All workers exited. Exiting... (0)
Dec 05 12:11:59 compute-0 systemd[1]: libpod-2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0.scope: Deactivated successfully.
Dec 05 12:11:59 compute-0 podman[234566]: 2025-12-05 12:11:59.148814872 +0000 UTC m=+0.068858667 container died 2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.160 187212 INFO nova.compute.manager [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Took 0.45 seconds to destroy the instance on the hypervisor.
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.161 187212 DEBUG oslo.service.loopingcall [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.161 187212 DEBUG nova.compute.manager [-] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.161 187212 DEBUG nova.network.neutron [-] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:11:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0-userdata-shm.mount: Deactivated successfully.
Dec 05 12:11:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-0abe6a70244aa25519de7b565800e0d75c41ac2b2899ca45664096e7b4998d70-merged.mount: Deactivated successfully.
Dec 05 12:11:59 compute-0 podman[234566]: 2025-12-05 12:11:59.205430436 +0000 UTC m=+0.125474231 container cleanup 2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:11:59 compute-0 systemd[1]: libpod-conmon-2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0.scope: Deactivated successfully.
Dec 05 12:11:59 compute-0 podman[234592]: 2025-12-05 12:11:59.283647081 +0000 UTC m=+0.056374899 container remove 2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:11:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:59.289 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[213093a5-0e52-48df-b1dd-5741c45106b8]: (4, ('Fri Dec  5 12:11:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0)\n2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0\nFri Dec  5 12:11:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0)\n2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:59.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[65b65a7b-339d-422a-ad06-4cb25f0544d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:59.292 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.294 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:59 compute-0 kernel: tapfbfed6fc-30: left promiscuous mode
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:11:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:59.320 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b6096ee3-ab58-433b-b15d-ef605ad9ac06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:59.331 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0b666854-4d6d-4308-878c-790a513042b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:59.333 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05197802-e08c-4e68-a357-7f82efbf6dd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:59.360 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[00aa5cb6-220d-415a-a691-66684622e770]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398305, 'reachable_time': 19595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234607, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:59.363 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:11:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:11:59.364 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[cbecc262-124b-40cc-9b17-4d5096889457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:11:59 compute-0 systemd[1]: run-netns-ovnmeta\x2dfbfed6fc\x2d3701\x2d4311\x2da4c2\x2d8c49c5b7584c.mount: Deactivated successfully.
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.762 187212 DEBUG nova.network.neutron [req-47cc6d8f-1754-4df4-a0f3-756f7785293e req-06834155-a981-4e53-9a76-ee40f034e3bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Updated VIF entry in instance network info cache for port 04328ce4-34a7-41df-8b27-e1d5b7f3f280. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.763 187212 DEBUG nova.network.neutron [req-47cc6d8f-1754-4df4-a0f3-756f7785293e req-06834155-a981-4e53-9a76-ee40f034e3bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Updating instance_info_cache with network_info: [{"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:11:59 compute-0 nova_compute[187208]: 2025-12-05 12:11:59.889 187212 DEBUG oslo_concurrency.lockutils [req-47cc6d8f-1754-4df4-a0f3-756f7785293e req-06834155-a981-4e53-9a76-ee40f034e3bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ef8fcf55-e147-4baf-b506-1d99af05d330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:00 compute-0 nova_compute[187208]: 2025-12-05 12:12:00.882 187212 DEBUG nova.network.neutron [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Updating instance_info_cache with network_info: [{"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:00 compute-0 nova_compute[187208]: 2025-12-05 12:12:00.912 187212 DEBUG oslo_concurrency.lockutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Releasing lock "refresh_cache-f2a101e0-138f-404e-b6e0-e1359272f560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.400 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.810 187212 DEBUG nova.network.neutron [-] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.820 187212 DEBUG nova.network.neutron [-] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.846 187212 INFO nova.compute.manager [-] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Took 2.68 seconds to deallocate network for instance.
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.886 187212 INFO nova.compute.manager [-] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Took 3.10 seconds to deallocate network for instance.
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.956 187212 DEBUG nova.compute.manager [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Received event network-vif-unplugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.956 187212 DEBUG oslo_concurrency.lockutils [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.956 187212 DEBUG oslo_concurrency.lockutils [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.956 187212 DEBUG oslo_concurrency.lockutils [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.956 187212 DEBUG nova.compute.manager [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] No waiting events found dispatching network-vif-unplugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.957 187212 DEBUG nova.compute.manager [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Received event network-vif-unplugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.957 187212 DEBUG nova.compute.manager [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Received event network-vif-plugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.957 187212 DEBUG oslo_concurrency.lockutils [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.957 187212 DEBUG oslo_concurrency.lockutils [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.957 187212 DEBUG oslo_concurrency.lockutils [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.957 187212 DEBUG nova.compute.manager [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] No waiting events found dispatching network-vif-plugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.957 187212 WARNING nova.compute.manager [req-4c31e4c0-3859-472d-915b-2859d21d91ad req-eead70b3-54c2-4b95-8e9d-b354f798ec2a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Received unexpected event network-vif-plugged-4870c9e1-8549-42a1-a77d-f9824cc38f59 for instance with vm_state active and task_state deleting.
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.958 187212 DEBUG oslo_concurrency.lockutils [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:01 compute-0 nova_compute[187208]: 2025-12-05 12:12:01.958 187212 DEBUG oslo_concurrency.lockutils [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.029 187212 DEBUG oslo_concurrency.lockutils [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.224 187212 DEBUG nova.compute.provider_tree [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.246 187212 DEBUG nova.scheduler.client.report [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.297 187212 DEBUG oslo_concurrency.lockutils [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.300 187212 DEBUG oslo_concurrency.lockutils [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.324 187212 INFO nova.scheduler.client.report [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Deleted allocations for instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.478 187212 DEBUG nova.compute.provider_tree [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.674 187212 DEBUG oslo_concurrency.lockutils [None req-d18530de-4bf5-4afa-9545-9242359df37e 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.687 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.686 187212 DEBUG nova.scheduler.client.report [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.707 187212 DEBUG oslo_concurrency.lockutils [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.708 187212 DEBUG oslo_concurrency.lockutils [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.709 187212 DEBUG oslo_concurrency.lockutils [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.710 187212 DEBUG oslo_concurrency.lockutils [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.710 187212 DEBUG oslo_concurrency.lockutils [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.711 187212 INFO nova.compute.manager [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Terminating instance
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.712 187212 DEBUG nova.compute.manager [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:12:02 compute-0 kernel: tap9e4c5b24-e3 (unregistering): left promiscuous mode
Dec 05 12:12:02 compute-0 NetworkManager[55691]: <info>  [1764936722.7408] device (tap9e4c5b24-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.745 187212 DEBUG oslo_concurrency.lockutils [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.446s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:02 compute-0 ovn_controller[95610]: 2025-12-05T12:12:02Z|00845|binding|INFO|Releasing lport 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f from this chassis (sb_readonly=0)
Dec 05 12:12:02 compute-0 ovn_controller[95610]: 2025-12-05T12:12:02Z|00846|binding|INFO|Setting lport 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f down in Southbound
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:02 compute-0 ovn_controller[95610]: 2025-12-05T12:12:02Z|00847|binding|INFO|Removing iface tap9e4c5b24-e3 ovn-installed in OVS
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.764 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:c3:62 10.100.0.14'], port_security=['fa:16:3e:6f:c3:62 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.765 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 unbound from our chassis
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.766 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.767 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.818 187212 INFO nova.scheduler.client.report [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Deleted allocations for instance a70dccfb-2a89-4283-aba2-934af2667db3
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.820 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.833 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a15a27-9b27-45b3-8e6f-d66125643092]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:02 compute-0 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d00000055.scope: Deactivated successfully.
Dec 05 12:12:02 compute-0 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d00000055.scope: Consumed 6.245s CPU time.
Dec 05 12:12:02 compute-0 systemd-machined[153543]: Machine qemu-95-instance-00000055 terminated.
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.873 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[370e8512-4f91-4f07-b2d3-a8f1f703bec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.877 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[43660c23-0260-45a0-9a76-8883f1191a39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.903 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[84f76ba0-b61c-4e9c-8c46-7ef994b47337]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.931 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e041ab-50f1-43b5-876a-eb8355866080]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234633, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.955 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e1828b-7738-4164-a674-58e2b1b24236]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234639, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234639, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.960 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.969 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.969 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.970 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:02.970 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.990 187212 INFO nova.virt.libvirt.driver [-] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Instance destroyed successfully.
Dec 05 12:12:02 compute-0 nova_compute[187208]: 2025-12-05 12:12:02.990 187212 DEBUG nova.objects.instance [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'resources' on Instance uuid c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.007 187212 DEBUG oslo_concurrency.lockutils [None req-aaf92778-65ce-4e52-9f03-ad898538f1d7 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "a70dccfb-2a89-4283-aba2-934af2667db3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.017 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.017 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.019 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.068 187212 DEBUG oslo_concurrency.lockutils [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.069 187212 DEBUG oslo_concurrency.lockutils [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.069 187212 DEBUG oslo_concurrency.lockutils [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.069 187212 DEBUG oslo_concurrency.lockutils [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.070 187212 DEBUG oslo_concurrency.lockutils [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.071 187212 INFO nova.compute.manager [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Terminating instance
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.071 187212 DEBUG nova.compute.manager [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.074 187212 DEBUG nova.virt.libvirt.vif [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:11:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1213220860',display_name='tempest-ServersTestJSON-server-1213220860',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1213220860',id=85,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBqXxjjZG3zRpoVdmm3gug/kuSI6oQzY8us2IsTu/9Zy7Ex4wlL/7VzaOtZ9WYCzOWxb9KMd4tJR1k9CVgqnqR3cmI+vR+6Obgtfj46DxP9OaK/oOG4PK0jIDIbrTs0g7A==',key_name='tempest-key-1731656875',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:11:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-iecebzas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:11:57Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.074 187212 DEBUG nova.network.os_vif_util [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "address": "fa:16:3e:6f:c3:62", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e4c5b24-e3", "ovs_interfaceid": "9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.075 187212 DEBUG nova.network.os_vif_util [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:c3:62,bridge_name='br-int',has_traffic_filtering=True,id=9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e4c5b24-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.075 187212 DEBUG os_vif [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:c3:62,bridge_name='br-int',has_traffic_filtering=True,id=9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e4c5b24-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.076 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.077 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e4c5b24-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.080 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.083 187212 INFO os_vif [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:c3:62,bridge_name='br-int',has_traffic_filtering=True,id=9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e4c5b24-e3')
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.083 187212 INFO nova.virt.libvirt.driver [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Deleting instance files /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7_del
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.084 187212 INFO nova.virt.libvirt.driver [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Deletion of /var/lib/nova/instances/c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7_del complete
Dec 05 12:12:03 compute-0 kernel: tap5f1f909d-41 (unregistering): left promiscuous mode
Dec 05 12:12:03 compute-0 NetworkManager[55691]: <info>  [1764936723.1033] device (tap5f1f909d-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:12:03 compute-0 ovn_controller[95610]: 2025-12-05T12:12:03Z|00848|binding|INFO|Releasing lport 5f1f909d-4147-44de-9adf-829a12fc8bfa from this chassis (sb_readonly=0)
Dec 05 12:12:03 compute-0 ovn_controller[95610]: 2025-12-05T12:12:03Z|00849|binding|INFO|Setting lport 5f1f909d-4147-44de-9adf-829a12fc8bfa down in Southbound
Dec 05 12:12:03 compute-0 ovn_controller[95610]: 2025-12-05T12:12:03Z|00850|binding|INFO|Removing iface tap5f1f909d-41 ovn-installed in OVS
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.111 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.122 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:ff:d8 10.100.0.12'], port_security=['fa:16:3e:3c:ff:d8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/24', 'neutron:device_id': 'ef8fcf55-e147-4baf-b506-1d99af05d330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1ff0073-21f4-4b01-9595-8418e7a5cc2c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5f1f909d-4147-44de-9adf-829a12fc8bfa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.123 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5f1f909d-4147-44de-9adf-829a12fc8bfa in datapath 3f3e0335-bf9c-44f0-9272-60c74c37b38b unbound from our chassis
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.125 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.126 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f3e0335-bf9c-44f0-9272-60c74c37b38b
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.129 187212 INFO nova.compute.manager [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Took 0.42 seconds to destroy the instance on the hypervisor.
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.129 187212 DEBUG oslo.service.loopingcall [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.130 187212 DEBUG nova.compute.manager [-] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.130 187212 DEBUG nova.network.neutron [-] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:12:03 compute-0 kernel: tap67a9975d-5d (unregistering): left promiscuous mode
Dec 05 12:12:03 compute-0 NetworkManager[55691]: <info>  [1764936723.1430] device (tap67a9975d-5d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.144 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7d503a8c-149d-4cc3-94ab-5ee607ebea57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.152 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_controller[95610]: 2025-12-05T12:12:03Z|00851|binding|INFO|Releasing lport 67a9975d-5d22-4a6a-af5f-83ab6b080d9a from this chassis (sb_readonly=0)
Dec 05 12:12:03 compute-0 ovn_controller[95610]: 2025-12-05T12:12:03Z|00852|binding|INFO|Setting lport 67a9975d-5d22-4a6a-af5f-83ab6b080d9a down in Southbound
Dec 05 12:12:03 compute-0 ovn_controller[95610]: 2025-12-05T12:12:03Z|00853|binding|INFO|Removing iface tap67a9975d-5d ovn-installed in OVS
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.155 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.161 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:92:66 10.100.1.97'], port_security=['fa:16:3e:c6:92:66 10.100.1.97'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.97/24', 'neutron:device_id': 'ef8fcf55-e147-4baf-b506-1d99af05d330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-951a1e05-7d1e-4c9c-9143-3e651fb29978', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4593a165-aa4c-4b2e-939b-76fea57e8290, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=67a9975d-5d22-4a6a-af5f-83ab6b080d9a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.177 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[92380455-ad22-4022-833d-b104f0444546]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.180 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[290cfab5-5b87-46e1-ae3c-702e803b0c3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 kernel: tap04328ce4-34 (unregistering): left promiscuous mode
Dec 05 12:12:03 compute-0 NetworkManager[55691]: <info>  [1764936723.1899] device (tap04328ce4-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.212 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_controller[95610]: 2025-12-05T12:12:03Z|00854|binding|INFO|Releasing lport 04328ce4-34a7-41df-8b27-e1d5b7f3f280 from this chassis (sb_readonly=0)
Dec 05 12:12:03 compute-0 ovn_controller[95610]: 2025-12-05T12:12:03Z|00855|binding|INFO|Setting lport 04328ce4-34a7-41df-8b27-e1d5b7f3f280 down in Southbound
Dec 05 12:12:03 compute-0 ovn_controller[95610]: 2025-12-05T12:12:03Z|00856|binding|INFO|Removing iface tap04328ce4-34 ovn-installed in OVS
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.215 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.215 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b31548-8e63-406a-883e-04aa91ea9cce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.224 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:e0:ee 10.100.0.205'], port_security=['fa:16:3e:1e:e0:ee 10.100.0.205'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.205/24', 'neutron:device_id': 'ef8fcf55-e147-4baf-b506-1d99af05d330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1ff0073-21f4-4b01-9595-8418e7a5cc2c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=04328ce4-34a7-41df-8b27-e1d5b7f3f280) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.227 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.231 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.242 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a86a2dac-e69b-4605-9958-d88bf62ce1a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f3e0335-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:40:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 532, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 532, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409360, 'reachable_time': 29730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234673, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d00000054.scope: Deactivated successfully.
Dec 05 12:12:03 compute-0 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d00000054.scope: Consumed 5.225s CPU time.
Dec 05 12:12:03 compute-0 systemd-machined[153543]: Machine qemu-97-instance-00000054 terminated.
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.256 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f48e90c7-63c6-4f7c-a6a8-9eb23d0582ca]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap3f3e0335-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409372, 'tstamp': 409372}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234674, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3f3e0335-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409375, 'tstamp': 409375}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234674, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.261 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f3e0335-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.262 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.273 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.274 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f3e0335-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.275 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.275 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f3e0335-b0, col_values=(('external_ids', {'iface-id': '5043c1f9-031c-414d-b5e2-35f6436be4ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.275 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.277 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 67a9975d-5d22-4a6a-af5f-83ab6b080d9a in datapath 951a1e05-7d1e-4c9c-9143-3e651fb29978 unbound from our chassis
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.278 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 951a1e05-7d1e-4c9c-9143-3e651fb29978, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.279 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[169db755-76aa-453f-8f8b-7d949e4d5fa7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.279 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978 namespace which is not needed anymore
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.379 187212 INFO nova.virt.libvirt.driver [-] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Instance destroyed successfully.
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.379 187212 DEBUG nova.objects.instance [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lazy-loading 'resources' on Instance uuid ef8fcf55-e147-4baf-b506-1d99af05d330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.397 187212 DEBUG nova.virt.libvirt.vif [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-614377642',display_name='tempest-ServersTestMultiNic-server-614377642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-614377642',id=84,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:11:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-3hddow00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:11:58Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=ef8fcf55-e147-4baf-b506-1d99af05d330,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.398 187212 DEBUG nova.network.os_vif_util [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "address": "fa:16:3e:3c:ff:d8", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f1f909d-41", "ovs_interfaceid": "5f1f909d-4147-44de-9adf-829a12fc8bfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.398 187212 DEBUG nova.network.os_vif_util [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:ff:d8,bridge_name='br-int',has_traffic_filtering=True,id=5f1f909d-4147-44de-9adf-829a12fc8bfa,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f1f909d-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.399 187212 DEBUG os_vif [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:ff:d8,bridge_name='br-int',has_traffic_filtering=True,id=5f1f909d-4147-44de-9adf-829a12fc8bfa,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f1f909d-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.400 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.400 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f1f909d-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.404 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.411 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.413 187212 INFO os_vif [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:ff:d8,bridge_name='br-int',has_traffic_filtering=True,id=5f1f909d-4147-44de-9adf-829a12fc8bfa,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f1f909d-41')
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.414 187212 DEBUG nova.virt.libvirt.vif [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-614377642',display_name='tempest-ServersTestMultiNic-server-614377642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-614377642',id=84,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:11:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-3hddow00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:11:58Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=ef8fcf55-e147-4baf-b506-1d99af05d330,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.414 187212 DEBUG nova.network.os_vif_util [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "address": "fa:16:3e:c6:92:66", "network": {"id": "951a1e05-7d1e-4c9c-9143-3e651fb29978", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1729247503", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67a9975d-5d", "ovs_interfaceid": "67a9975d-5d22-4a6a-af5f-83ab6b080d9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.415 187212 DEBUG nova.network.os_vif_util [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:92:66,bridge_name='br-int',has_traffic_filtering=True,id=67a9975d-5d22-4a6a-af5f-83ab6b080d9a,network=Network(951a1e05-7d1e-4c9c-9143-3e651fb29978),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67a9975d-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.415 187212 DEBUG os_vif [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:92:66,bridge_name='br-int',has_traffic_filtering=True,id=67a9975d-5d22-4a6a-af5f-83ab6b080d9a,network=Network(951a1e05-7d1e-4c9c-9143-3e651fb29978),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67a9975d-5d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.416 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.416 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67a9975d-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.418 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.419 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.424 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.427 187212 INFO os_vif [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:92:66,bridge_name='br-int',has_traffic_filtering=True,id=67a9975d-5d22-4a6a-af5f-83ab6b080d9a,network=Network(951a1e05-7d1e-4c9c-9143-3e651fb29978),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67a9975d-5d')
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.428 187212 DEBUG nova.virt.libvirt.vif [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-614377642',display_name='tempest-ServersTestMultiNic-server-614377642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-614377642',id=84,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:11:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-3hddow00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:11:58Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=ef8fcf55-e147-4baf-b506-1d99af05d330,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.428 187212 DEBUG nova.network.os_vif_util [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "address": "fa:16:3e:1e:e0:ee", "network": {"id": "3f3e0335-bf9c-44f0-9272-60c74c37b38b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67619951", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04328ce4-34", "ovs_interfaceid": "04328ce4-34a7-41df-8b27-e1d5b7f3f280", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.429 187212 DEBUG nova.network.os_vif_util [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:e0:ee,bridge_name='br-int',has_traffic_filtering=True,id=04328ce4-34a7-41df-8b27-e1d5b7f3f280,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04328ce4-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.429 187212 DEBUG os_vif [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:e0:ee,bridge_name='br-int',has_traffic_filtering=True,id=04328ce4-34a7-41df-8b27-e1d5b7f3f280,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04328ce4-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.431 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.431 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04328ce4-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.433 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.434 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.437 187212 INFO os_vif [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:e0:ee,bridge_name='br-int',has_traffic_filtering=True,id=04328ce4-34a7-41df-8b27-e1d5b7f3f280,network=Network(3f3e0335-bf9c-44f0-9272-60c74c37b38b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04328ce4-34')
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.438 187212 INFO nova.virt.libvirt.driver [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Deleting instance files /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330_del
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978[234402]: [NOTICE]   (234406) : haproxy version is 2.8.14-c23fe91
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978[234402]: [NOTICE]   (234406) : path to executable is /usr/sbin/haproxy
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978[234402]: [WARNING]  (234406) : Exiting Master process...
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.440 187212 INFO nova.virt.libvirt.driver [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Deletion of /var/lib/nova/instances/ef8fcf55-e147-4baf-b506-1d99af05d330_del complete
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978[234402]: [ALERT]    (234406) : Current worker (234408) exited with code 143 (Terminated)
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978[234402]: [WARNING]  (234406) : All workers exited. Exiting... (0)
Dec 05 12:12:03 compute-0 systemd[1]: libpod-5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e.scope: Deactivated successfully.
Dec 05 12:12:03 compute-0 podman[234731]: 2025-12-05 12:12:03.45193481 +0000 UTC m=+0.060531528 container died 5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 05 12:12:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e-userdata-shm.mount: Deactivated successfully.
Dec 05 12:12:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-d7965c7475ebcb3c9788b77fdf12f4ae89fbcfe344b768ac69adf31da8b83f1a-merged.mount: Deactivated successfully.
Dec 05 12:12:03 compute-0 podman[234731]: 2025-12-05 12:12:03.48714511 +0000 UTC m=+0.095741828 container cleanup 5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:12:03 compute-0 systemd[1]: libpod-conmon-5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e.scope: Deactivated successfully.
Dec 05 12:12:03 compute-0 podman[234760]: 2025-12-05 12:12:03.550341044 +0000 UTC m=+0.041634356 container remove 5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.557 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2de5fdf2-da68-4169-8c1f-b306b4320080]: (4, ('Fri Dec  5 12:12:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978 (5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e)\n5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e\nFri Dec  5 12:12:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978 (5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e)\n5e8a729119e0f52c5c30fa21dae5ab5e64cb49d1759feb23caea869d12ba584e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.563 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[06c10a13-5bf0-456f-9698-6860bc1789a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.564 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap951a1e05-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.564 187212 INFO nova.compute.manager [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Took 0.49 seconds to destroy the instance on the hypervisor.
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.565 187212 DEBUG oslo.service.loopingcall [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.565 187212 DEBUG nova.compute.manager [-] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.565 187212 DEBUG nova.network.neutron [-] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:12:03 compute-0 kernel: tap951a1e05-70: left promiscuous mode
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.569 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 nova_compute[187208]: 2025-12-05 12:12:03.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.582 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f2e9be76-dcf7-4ea7-a23c-1da12311abbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.604 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d26b89c6-207e-4d4b-8572-eb11d71b15ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.606 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9dac9ebe-9db8-48dd-9d7f-a203e90b567f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.627 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[48fb6229-d980-4b93-8e0d-a35c5ecaf006]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409469, 'reachable_time': 23511, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234775, 'error': None, 'target': 'ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d951a1e05\x2d7d1e\x2d4c9c\x2d9143\x2d3e651fb29978.mount: Deactivated successfully.
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.631 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-951a1e05-7d1e-4c9c-9143-3e651fb29978 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.631 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b64c19ef-b3e7-427b-9971-8338231787bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.632 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 04328ce4-34a7-41df-8b27-e1d5b7f3f280 in datapath 3f3e0335-bf9c-44f0-9272-60c74c37b38b unbound from our chassis
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.637 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f3e0335-bf9c-44f0-9272-60c74c37b38b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.638 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[45b106f7-fdfd-4950-8018-72e2ec1b1167]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:03.639 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b namespace which is not needed anymore
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b[234326]: [NOTICE]   (234330) : haproxy version is 2.8.14-c23fe91
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b[234326]: [NOTICE]   (234330) : path to executable is /usr/sbin/haproxy
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b[234326]: [WARNING]  (234330) : Exiting Master process...
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b[234326]: [WARNING]  (234330) : Exiting Master process...
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b[234326]: [ALERT]    (234330) : Current worker (234332) exited with code 143 (Terminated)
Dec 05 12:12:03 compute-0 neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b[234326]: [WARNING]  (234330) : All workers exited. Exiting... (0)
Dec 05 12:12:03 compute-0 systemd[1]: libpod-b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3.scope: Deactivated successfully.
Dec 05 12:12:03 compute-0 podman[234793]: 2025-12-05 12:12:03.774188628 +0000 UTC m=+0.050190212 container died b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 05 12:12:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3-userdata-shm.mount: Deactivated successfully.
Dec 05 12:12:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-f721315c1793784aa6632ad222f35fca4cfc77d10f44c6dbf10bf3342a133f25-merged.mount: Deactivated successfully.
Dec 05 12:12:03 compute-0 podman[234793]: 2025-12-05 12:12:03.819591351 +0000 UTC m=+0.095592935 container cleanup b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:12:03 compute-0 systemd[1]: libpod-conmon-b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3.scope: Deactivated successfully.
Dec 05 12:12:04 compute-0 podman[234824]: 2025-12-05 12:12:04.035831386 +0000 UTC m=+0.192991519 container remove b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:12:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:04.041 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2554c858-ef07-4e43-96c0-a89d1c7fee86]: (4, ('Fri Dec  5 12:12:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b (b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3)\nb69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3\nFri Dec  5 12:12:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b (b69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3)\nb69523ac7cbd26ca40f85b0328a8926e2b5e43f4f170209c2ca56b1aac93e5d3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:04.043 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[64b4c10c-ea24-4a9c-b9ad-c1af703ab0be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:04.044 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f3e0335-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:04 compute-0 nova_compute[187208]: 2025-12-05 12:12:04.046 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:04 compute-0 kernel: tap3f3e0335-b0: left promiscuous mode
Dec 05 12:12:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:04.053 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dbcb809c-749d-4ce9-9da7-a69d6f192ef3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:04 compute-0 nova_compute[187208]: 2025-12-05 12:12:04.061 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:04.069 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bae98dd8-9a49-4127-bfd1-02c817edd338]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:04.071 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[395787ef-b2bf-44c3-9cee-b05ed52016c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:04.087 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8a4e16d5-0cb6-4dd0-837a-8df73fe428a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409352, 'reachable_time': 43046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234839, 'error': None, 'target': 'ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:04.089 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3f3e0335-bf9c-44f0-9272-60c74c37b38b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:12:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:04.089 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[e21312d3-46d0-49fc-81b2-2cb170d9b507]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:04 compute-0 systemd[1]: run-netns-ovnmeta\x2d3f3e0335\x2dbf9c\x2d44f0\x2d9272\x2d60c74c37b38b.mount: Deactivated successfully.
Dec 05 12:12:04 compute-0 podman[234840]: 2025-12-05 12:12:04.478991714 +0000 UTC m=+0.069370752 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible)
Dec 05 12:12:04 compute-0 podman[234841]: 2025-12-05 12:12:04.495029154 +0000 UTC m=+0.087062539 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:12:04 compute-0 nova_compute[187208]: 2025-12-05 12:12:04.770 187212 DEBUG nova.network.neutron [-] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:04 compute-0 nova_compute[187208]: 2025-12-05 12:12:04.802 187212 INFO nova.compute.manager [-] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Took 1.67 seconds to deallocate network for instance.
Dec 05 12:12:04 compute-0 nova_compute[187208]: 2025-12-05 12:12:04.865 187212 DEBUG oslo_concurrency.lockutils [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:04 compute-0 nova_compute[187208]: 2025-12-05 12:12:04.866 187212 DEBUG oslo_concurrency.lockutils [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.018 187212 DEBUG nova.compute.provider_tree [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.043 187212 DEBUG nova.scheduler.client.report [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.081 187212 DEBUG oslo_concurrency.lockutils [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.116 187212 INFO nova.scheduler.client.report [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Deleted allocations for instance c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.196 187212 DEBUG oslo_concurrency.lockutils [None req-4ff92b0e-21c6-4cc4-9de5-45431b8d31c2 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.372 187212 DEBUG nova.compute.manager [req-beeaabe6-717c-4585-b0e0-2e6c7155e7e8 req-fbb3ed4d-1b7a-431e-9b38-4c24c0df685b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-unplugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.373 187212 DEBUG oslo_concurrency.lockutils [req-beeaabe6-717c-4585-b0e0-2e6c7155e7e8 req-fbb3ed4d-1b7a-431e-9b38-4c24c0df685b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.373 187212 DEBUG oslo_concurrency.lockutils [req-beeaabe6-717c-4585-b0e0-2e6c7155e7e8 req-fbb3ed4d-1b7a-431e-9b38-4c24c0df685b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.373 187212 DEBUG oslo_concurrency.lockutils [req-beeaabe6-717c-4585-b0e0-2e6c7155e7e8 req-fbb3ed4d-1b7a-431e-9b38-4c24c0df685b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.374 187212 DEBUG nova.compute.manager [req-beeaabe6-717c-4585-b0e0-2e6c7155e7e8 req-fbb3ed4d-1b7a-431e-9b38-4c24c0df685b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] No waiting events found dispatching network-vif-unplugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.374 187212 DEBUG nova.compute.manager [req-beeaabe6-717c-4585-b0e0-2e6c7155e7e8 req-fbb3ed4d-1b7a-431e-9b38-4c24c0df685b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-unplugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.703 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-unplugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.703 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.704 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.704 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.704 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] No waiting events found dispatching network-vif-unplugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.704 187212 WARNING nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received unexpected event network-vif-unplugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e for instance with vm_state deleted and task_state None.
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.705 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.705 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.705 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.705 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.706 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] No waiting events found dispatching network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.706 187212 WARNING nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received unexpected event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e for instance with vm_state deleted and task_state None.
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.706 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-deleted-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.706 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Received event network-vif-deleted-4870c9e1-8549-42a1-a77d-f9824cc38f59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.706 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Received event network-vif-unplugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.707 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.707 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.707 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.707 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] No waiting events found dispatching network-vif-unplugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.707 187212 WARNING nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Received unexpected event network-vif-unplugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f for instance with vm_state deleted and task_state None.
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.707 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Received event network-vif-plugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.708 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.708 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.708 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.708 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] No waiting events found dispatching network-vif-plugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.708 187212 WARNING nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Received unexpected event network-vif-plugged-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f for instance with vm_state deleted and task_state None.
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.709 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-unplugged-5f1f909d-4147-44de-9adf-829a12fc8bfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.709 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.709 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.709 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.709 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] No waiting events found dispatching network-vif-unplugged-5f1f909d-4147-44de-9adf-829a12fc8bfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.710 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-unplugged-5f1f909d-4147-44de-9adf-829a12fc8bfa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.710 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-plugged-5f1f909d-4147-44de-9adf-829a12fc8bfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.710 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.710 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.711 187212 DEBUG oslo_concurrency.lockutils [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.711 187212 DEBUG nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] No waiting events found dispatching network-vif-plugged-5f1f909d-4147-44de-9adf-829a12fc8bfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:05 compute-0 nova_compute[187208]: 2025-12-05 12:12:05.711 187212 WARNING nova.compute.manager [req-253e5add-b4e1-45a4-bc6a-4d62865ff438 req-9a473aa9-b6e8-4b67-9c0d-48cdd4806472 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received unexpected event network-vif-plugged-5f1f909d-4147-44de-9adf-829a12fc8bfa for instance with vm_state active and task_state deleting.
Dec 05 12:12:06 compute-0 nova_compute[187208]: 2025-12-05 12:12:06.501 187212 DEBUG nova.network.neutron [-] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:06 compute-0 nova_compute[187208]: 2025-12-05 12:12:06.532 187212 INFO nova.compute.manager [-] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Took 2.97 seconds to deallocate network for instance.
Dec 05 12:12:06 compute-0 nova_compute[187208]: 2025-12-05 12:12:06.574 187212 DEBUG oslo_concurrency.lockutils [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:06 compute-0 nova_compute[187208]: 2025-12-05 12:12:06.575 187212 DEBUG oslo_concurrency.lockutils [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:06 compute-0 nova_compute[187208]: 2025-12-05 12:12:06.741 187212 DEBUG nova.compute.provider_tree [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:06 compute-0 nova_compute[187208]: 2025-12-05 12:12:06.771 187212 DEBUG nova.scheduler.client.report [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:06 compute-0 nova_compute[187208]: 2025-12-05 12:12:06.822 187212 DEBUG oslo_concurrency.lockutils [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:06 compute-0 nova_compute[187208]: 2025-12-05 12:12:06.880 187212 INFO nova.scheduler.client.report [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Deleted allocations for instance ef8fcf55-e147-4baf-b506-1d99af05d330
Dec 05 12:12:06 compute-0 nova_compute[187208]: 2025-12-05 12:12:06.991 187212 DEBUG oslo_concurrency.lockutils [None req-efe0a40c-2dae-4fa1-9c40-9d384afc01c4 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:07 compute-0 nova_compute[187208]: 2025-12-05 12:12:07.935 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936712.9338875, ecc25cb4-5b3a-43f7-949d-ca9a1a19056a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:07 compute-0 nova_compute[187208]: 2025-12-05 12:12:07.935 187212 INFO nova.compute.manager [-] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] VM Stopped (Lifecycle Event)
Dec 05 12:12:07 compute-0 nova_compute[187208]: 2025-12-05 12:12:07.970 187212 DEBUG nova.compute.manager [None req-7edc47fd-8144-43e0-89a1-04407ce9cd95 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.277 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.433 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.609 187212 DEBUG nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-plugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.610 187212 DEBUG oslo_concurrency.lockutils [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.610 187212 DEBUG oslo_concurrency.lockutils [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.610 187212 DEBUG oslo_concurrency.lockutils [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.611 187212 DEBUG nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] No waiting events found dispatching network-vif-plugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.611 187212 WARNING nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received unexpected event network-vif-plugged-67a9975d-5d22-4a6a-af5f-83ab6b080d9a for instance with vm_state deleted and task_state None.
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.612 187212 DEBUG nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-unplugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.612 187212 DEBUG oslo_concurrency.lockutils [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.613 187212 DEBUG oslo_concurrency.lockutils [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.613 187212 DEBUG oslo_concurrency.lockutils [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.614 187212 DEBUG nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] No waiting events found dispatching network-vif-unplugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.614 187212 WARNING nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received unexpected event network-vif-unplugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 for instance with vm_state deleted and task_state None.
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.614 187212 DEBUG nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-plugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.614 187212 DEBUG oslo_concurrency.lockutils [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.615 187212 DEBUG oslo_concurrency.lockutils [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.615 187212 DEBUG oslo_concurrency.lockutils [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ef8fcf55-e147-4baf-b506-1d99af05d330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.615 187212 DEBUG nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] No waiting events found dispatching network-vif-plugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.615 187212 WARNING nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received unexpected event network-vif-plugged-04328ce4-34a7-41df-8b27-e1d5b7f3f280 for instance with vm_state deleted and task_state None.
Dec 05 12:12:08 compute-0 nova_compute[187208]: 2025-12-05 12:12:08.616 187212 DEBUG nova.compute.manager [req-ad23ab2b-c70d-4338-ab81-aa773517f4cc req-6c3660f7-4bda-4c90-8ebc-59b51bbfd2e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Received event network-vif-deleted-9e4c5b24-e3e0-4285-9ef8-924e0ab8e04f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:09 compute-0 nova_compute[187208]: 2025-12-05 12:12:09.864 187212 DEBUG nova.compute.manager [req-3269a108-5ca0-4979-b796-4beaa95bca1b req-ac4e07db-1025-49a4-a473-dbed7eede410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-deleted-04328ce4-34a7-41df-8b27-e1d5b7f3f280 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:09 compute-0 nova_compute[187208]: 2025-12-05 12:12:09.865 187212 DEBUG nova.compute.manager [req-3269a108-5ca0-4979-b796-4beaa95bca1b req-ac4e07db-1025-49a4-a473-dbed7eede410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-deleted-5f1f909d-4147-44de-9adf-829a12fc8bfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:09 compute-0 nova_compute[187208]: 2025-12-05 12:12:09.865 187212 DEBUG nova.compute.manager [req-3269a108-5ca0-4979-b796-4beaa95bca1b req-ac4e07db-1025-49a4-a473-dbed7eede410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Received event network-vif-deleted-67a9975d-5d22-4a6a-af5f-83ab6b080d9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:11 compute-0 podman[234901]: 2025-12-05 12:12:11.192177934 +0000 UTC m=+0.049767310 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:12:11 compute-0 podman[234902]: 2025-12-05 12:12:11.237038851 +0000 UTC m=+0.090043205 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:12:11 compute-0 ovn_controller[95610]: 2025-12-05T12:12:11Z|00857|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:12:11 compute-0 ovn_controller[95610]: 2025-12-05T12:12:11Z|00858|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.449 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.541 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "f3769524-43d7-4c3b-be59-18bf7af73e18" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.541 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.559 187212 DEBUG nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:12:11 compute-0 ovn_controller[95610]: 2025-12-05T12:12:11Z|00859|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:12:11 compute-0 ovn_controller[95610]: 2025-12-05T12:12:11Z|00860|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.643 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.700 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.700 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.708 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.708 187212 INFO nova.compute.claims [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.888 187212 DEBUG nova.compute.provider_tree [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.919 187212 DEBUG nova.scheduler.client.report [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.957 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:11 compute-0 nova_compute[187208]: 2025-12-05 12:12:11.958 187212 DEBUG nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.112 187212 DEBUG nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.113 187212 DEBUG nova.network.neutron [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.143 187212 INFO nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.347 187212 DEBUG nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.466 187212 DEBUG nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.468 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.468 187212 INFO nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Creating image(s)
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.469 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "/var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.469 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.470 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.486 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.547 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.548 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.549 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.563 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.622 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.623 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.649 187212 DEBUG nova.policy [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.667 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.668 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.668 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.727 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.729 187212 DEBUG nova.virt.disk.api [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Checking if we can resize image /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.729 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.790 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.791 187212 DEBUG nova.virt.disk.api [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Cannot resize image /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.792 187212 DEBUG nova.objects.instance [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'migration_context' on Instance uuid f3769524-43d7-4c3b-be59-18bf7af73e18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.810 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.811 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Ensure instance console log exists: /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.811 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.812 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:12 compute-0 nova_compute[187208]: 2025-12-05 12:12:12.812 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:13 compute-0 podman[234967]: 2025-12-05 12:12:13.201925488 +0000 UTC m=+0.059178359 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.279 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.435 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:13 compute-0 kernel: tap0bab1586-b0 (unregistering): left promiscuous mode
Dec 05 12:12:13 compute-0 NetworkManager[55691]: <info>  [1764936733.6253] device (tap0bab1586-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:12:13 compute-0 ovn_controller[95610]: 2025-12-05T12:12:13Z|00861|binding|INFO|Releasing lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 from this chassis (sb_readonly=0)
Dec 05 12:12:13 compute-0 ovn_controller[95610]: 2025-12-05T12:12:13Z|00862|binding|INFO|Setting lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 down in Southbound
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.632 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:13 compute-0 ovn_controller[95610]: 2025-12-05T12:12:13Z|00863|binding|INFO|Removing iface tap0bab1586-b0 ovn-installed in OVS
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.635 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:13.640 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a0:97 10.100.0.11'], port_security=['fa:16:3e:d5:a0:97 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:13.641 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 unbound from our chassis
Dec 05 12:12:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:13.643 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:12:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:13.643 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ff0839-39ae-4ae4-81e3-d49ad5fa2d9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.645 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.652 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936718.6513307, a70dccfb-2a89-4283-aba2-934af2667db3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.652 187212 INFO nova.compute.manager [-] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] VM Stopped (Lifecycle Event)
Dec 05 12:12:13 compute-0 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d00000056.scope: Deactivated successfully.
Dec 05 12:12:13 compute-0 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d00000056.scope: Consumed 12.515s CPU time.
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.676 187212 DEBUG nova.compute.manager [None req-f950e0b4-b230-48ae-b53a-ef0cd619163d - - - - - -] [instance: a70dccfb-2a89-4283-aba2-934af2667db3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:13 compute-0 systemd-machined[153543]: Machine qemu-96-instance-00000056 terminated.
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.854 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.979 187212 DEBUG nova.network.neutron [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Successfully created port: ecb20c89-e04b-4dcb-9b67-08705004bcf8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.986 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936718.9849665, 54d9605a-998b-4492-afc8-f7a5b0dd4e84 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:13 compute-0 nova_compute[187208]: 2025-12-05 12:12:13.986 187212 INFO nova.compute.manager [-] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] VM Stopped (Lifecycle Event)
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.008 187212 DEBUG nova.compute.manager [None req-72a29722-d915-45fd-8e68-dd843e315213 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.271 187212 DEBUG nova.compute.manager [req-37c2b121-0531-422b-9e0f-58494df1e9bb req-567f9583-4a0a-4a57-843e-46dd2e8eabc5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received event network-vif-unplugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.271 187212 DEBUG oslo_concurrency.lockutils [req-37c2b121-0531-422b-9e0f-58494df1e9bb req-567f9583-4a0a-4a57-843e-46dd2e8eabc5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.271 187212 DEBUG oslo_concurrency.lockutils [req-37c2b121-0531-422b-9e0f-58494df1e9bb req-567f9583-4a0a-4a57-843e-46dd2e8eabc5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.272 187212 DEBUG oslo_concurrency.lockutils [req-37c2b121-0531-422b-9e0f-58494df1e9bb req-567f9583-4a0a-4a57-843e-46dd2e8eabc5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.272 187212 DEBUG nova.compute.manager [req-37c2b121-0531-422b-9e0f-58494df1e9bb req-567f9583-4a0a-4a57-843e-46dd2e8eabc5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] No waiting events found dispatching network-vif-unplugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.272 187212 WARNING nova.compute.manager [req-37c2b121-0531-422b-9e0f-58494df1e9bb req-567f9583-4a0a-4a57-843e-46dd2e8eabc5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received unexpected event network-vif-unplugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 for instance with vm_state active and task_state rescuing.
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.464 187212 INFO nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance shutdown successfully after 13 seconds.
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.469 187212 INFO nova.virt.libvirt.driver [-] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance destroyed successfully.
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.470 187212 DEBUG nova.objects.instance [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'numa_topology' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.496 187212 INFO nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Attempting rescue
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.497 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.500 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.501 187212 INFO nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Creating image(s)
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.501 187212 DEBUG oslo_concurrency.lockutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.502 187212 DEBUG oslo_concurrency.lockutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.502 187212 DEBUG oslo_concurrency.lockutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.503 187212 DEBUG nova.objects.instance [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.529 187212 DEBUG oslo_concurrency.lockutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.530 187212 DEBUG oslo_concurrency.lockutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.541 187212 DEBUG oslo_concurrency.processutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.608 187212 DEBUG oslo_concurrency.processutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.610 187212 DEBUG oslo_concurrency.processutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.746 187212 DEBUG oslo_concurrency.processutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.rescue" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.748 187212 DEBUG oslo_concurrency.lockutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.749 187212 DEBUG nova.objects.instance [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'migration_context' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.765 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.766 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Start _get_guest_xml network_info=[{"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-689964296-network", "vif_mac": "fa:16:3e:d5:a0:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a6987852-063f-405d-a848-6b382694811e', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.767 187212 DEBUG nova.objects.instance [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'resources' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.792 187212 WARNING nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.797 187212 DEBUG nova.virt.libvirt.host [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.798 187212 DEBUG nova.virt.libvirt.host [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.802 187212 DEBUG nova.virt.libvirt.host [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.803 187212 DEBUG nova.virt.libvirt.host [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.803 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.803 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.804 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.804 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.804 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.805 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.805 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.805 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.805 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.805 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.806 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.806 187212 DEBUG nova.virt.hardware [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.806 187212 DEBUG nova.objects.instance [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.825 187212 DEBUG nova.virt.libvirt.vif [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:11:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-234668707',display_name='tempest-ServerRescueTestJSON-server-234668707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-234668707',id=86,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:11:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f73626a62534c97a06b6ec98d749111',ramdisk_id='',reservation_id='r-p6hppypz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-122605385',owner_user_name='tempest-ServerRescueTestJSON-122605385-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:11:55Z,user_data=None,user_id='d12bb49c0ca84e8dad933b49753c7b24',uuid=f2a101e0-138f-404e-b6e0-e1359272f560,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-689964296-network", "vif_mac": "fa:16:3e:d5:a0:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.826 187212 DEBUG nova.network.os_vif_util [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converting VIF {"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-689964296-network", "vif_mac": "fa:16:3e:d5:a0:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.827 187212 DEBUG nova.network.os_vif_util [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a0:97,bridge_name='br-int',has_traffic_filtering=True,id=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bab1586-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.827 187212 DEBUG nova.objects.instance [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.849 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <uuid>f2a101e0-138f-404e-b6e0-e1359272f560</uuid>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <name>instance-00000056</name>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerRescueTestJSON-server-234668707</nova:name>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:12:14</nova:creationTime>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:12:14 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:12:14 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:12:14 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:12:14 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:12:14 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:12:14 compute-0 nova_compute[187208]:         <nova:user uuid="d12bb49c0ca84e8dad933b49753c7b24">tempest-ServerRescueTestJSON-122605385-project-member</nova:user>
Dec 05 12:12:14 compute-0 nova_compute[187208]:         <nova:project uuid="8f73626a62534c97a06b6ec98d749111">tempest-ServerRescueTestJSON-122605385</nova:project>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:12:14 compute-0 nova_compute[187208]:         <nova:port uuid="0bab1586-b06a-4ae9-a0f9-9fbea816c5c2">
Dec 05 12:12:14 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <system>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <entry name="serial">f2a101e0-138f-404e-b6e0-e1359272f560</entry>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <entry name="uuid">f2a101e0-138f-404e-b6e0-e1359272f560</entry>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </system>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <os>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   </os>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <features>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   </features>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.rescue"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <target dev="vdb" bus="virtio"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.config.rescue"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:d5:a0:97"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <target dev="tap0bab1586-b0"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/console.log" append="off"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <video>
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </video>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:12:14 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:12:14 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:12:14 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:12:14 compute-0 nova_compute[187208]: </domain>
Dec 05 12:12:14 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.858 187212 INFO nova.virt.libvirt.driver [-] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance destroyed successfully.
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.920 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.920 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.921 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.921 187212 DEBUG nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No VIF found with MAC fa:16:3e:d5:a0:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.921 187212 INFO nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Using config drive
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.935 187212 DEBUG nova.objects.instance [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:14 compute-0 nova_compute[187208]: 2025-12-05 12:12:14.962 187212 DEBUG nova.objects.instance [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'keypairs' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.073 187212 INFO nova.virt.libvirt.driver [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Creating config drive at /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.config.rescue
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.077 187212 DEBUG oslo_concurrency.processutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6jy0vr99 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.204 187212 DEBUG oslo_concurrency.processutils [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6jy0vr99" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:16 compute-0 kernel: tap0bab1586-b0: entered promiscuous mode
Dec 05 12:12:16 compute-0 NetworkManager[55691]: <info>  [1764936736.2741] manager: (tap0bab1586-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/332)
Dec 05 12:12:16 compute-0 systemd-udevd[234991]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:12:16 compute-0 ovn_controller[95610]: 2025-12-05T12:12:16Z|00864|binding|INFO|Claiming lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 for this chassis.
Dec 05 12:12:16 compute-0 ovn_controller[95610]: 2025-12-05T12:12:16Z|00865|binding|INFO|0bab1586-b06a-4ae9-a0f9-9fbea816c5c2: Claiming fa:16:3e:d5:a0:97 10.100.0.11
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.276 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:16.283 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a0:97 10.100.0.11'], port_security=['fa:16:3e:d5:a0:97 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '5', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:16.284 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 bound to our chassis
Dec 05 12:12:16 compute-0 NetworkManager[55691]: <info>  [1764936736.2852] device (tap0bab1586-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:12:16 compute-0 NetworkManager[55691]: <info>  [1764936736.2860] device (tap0bab1586-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:12:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:16.286 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:12:16 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:16.286 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[04e922ba-a11b-41f6-8dc8-58748eb4f2d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:16 compute-0 ovn_controller[95610]: 2025-12-05T12:12:16Z|00866|binding|INFO|Setting lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 up in Southbound
Dec 05 12:12:16 compute-0 ovn_controller[95610]: 2025-12-05T12:12:16Z|00867|binding|INFO|Setting lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 ovn-installed in OVS
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.289 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.291 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:16 compute-0 systemd-machined[153543]: New machine qemu-98-instance-00000056.
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.321 187212 DEBUG nova.network.neutron [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Successfully updated port: ecb20c89-e04b-4dcb-9b67-08705004bcf8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:12:16 compute-0 systemd[1]: Started Virtual Machine qemu-98-instance-00000056.
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.345 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "refresh_cache-f3769524-43d7-4c3b-be59-18bf7af73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.345 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquired lock "refresh_cache-f3769524-43d7-4c3b-be59-18bf7af73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.346 187212 DEBUG nova.network.neutron [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.445 187212 DEBUG nova.compute.manager [req-010334a9-cce1-4116-9bf5-8b7c7a244f1d req-8a5ca67d-4561-415a-acd6-d3206711d541 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.445 187212 DEBUG oslo_concurrency.lockutils [req-010334a9-cce1-4116-9bf5-8b7c7a244f1d req-8a5ca67d-4561-415a-acd6-d3206711d541 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.445 187212 DEBUG oslo_concurrency.lockutils [req-010334a9-cce1-4116-9bf5-8b7c7a244f1d req-8a5ca67d-4561-415a-acd6-d3206711d541 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.445 187212 DEBUG oslo_concurrency.lockutils [req-010334a9-cce1-4116-9bf5-8b7c7a244f1d req-8a5ca67d-4561-415a-acd6-d3206711d541 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.446 187212 DEBUG nova.compute.manager [req-010334a9-cce1-4116-9bf5-8b7c7a244f1d req-8a5ca67d-4561-415a-acd6-d3206711d541 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] No waiting events found dispatching network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.446 187212 WARNING nova.compute.manager [req-010334a9-cce1-4116-9bf5-8b7c7a244f1d req-8a5ca67d-4561-415a-acd6-d3206711d541 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received unexpected event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 for instance with vm_state active and task_state rescuing.
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.542 187212 DEBUG nova.compute.manager [req-07036655-baa4-45f1-b1d5-fc37fd667ec5 req-09d91896-19c0-4705-9437-2bb143e4cbed 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Received event network-changed-ecb20c89-e04b-4dcb-9b67-08705004bcf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.542 187212 DEBUG nova.compute.manager [req-07036655-baa4-45f1-b1d5-fc37fd667ec5 req-09d91896-19c0-4705-9437-2bb143e4cbed 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Refreshing instance network info cache due to event network-changed-ecb20c89-e04b-4dcb-9b67-08705004bcf8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.542 187212 DEBUG oslo_concurrency.lockutils [req-07036655-baa4-45f1-b1d5-fc37fd667ec5 req-09d91896-19c0-4705-9437-2bb143e4cbed 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f3769524-43d7-4c3b-be59-18bf7af73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.581 187212 DEBUG nova.network.neutron [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.653 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for f2a101e0-138f-404e-b6e0-e1359272f560 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.654 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936736.6524525, f2a101e0-138f-404e-b6e0-e1359272f560 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.654 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] VM Resumed (Lifecycle Event)
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.659 187212 DEBUG nova.compute.manager [None req-df86a76c-0dac-418d-a4c3-5ad3a26aa2ef d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.689 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.693 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.718 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] During sync_power_state the instance has a pending task (rescuing). Skip.
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.719 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936736.6527867, f2a101e0-138f-404e-b6e0-e1359272f560 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.719 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] VM Started (Lifecycle Event)
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.740 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:16 compute-0 nova_compute[187208]: 2025-12-05 12:12:16.745 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.670 187212 DEBUG nova.network.neutron [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Updating instance_info_cache with network_info: [{"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.699 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Releasing lock "refresh_cache-f3769524-43d7-4c3b-be59-18bf7af73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.699 187212 DEBUG nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Instance network_info: |[{"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.699 187212 DEBUG oslo_concurrency.lockutils [req-07036655-baa4-45f1-b1d5-fc37fd667ec5 req-09d91896-19c0-4705-9437-2bb143e4cbed 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f3769524-43d7-4c3b-be59-18bf7af73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.700 187212 DEBUG nova.network.neutron [req-07036655-baa4-45f1-b1d5-fc37fd667ec5 req-09d91896-19c0-4705-9437-2bb143e4cbed 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Refreshing network info cache for port ecb20c89-e04b-4dcb-9b67-08705004bcf8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.703 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Start _get_guest_xml network_info=[{"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.709 187212 WARNING nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.713 187212 DEBUG nova.virt.libvirt.host [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.714 187212 DEBUG nova.virt.libvirt.host [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.719 187212 DEBUG nova.virt.libvirt.host [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.719 187212 DEBUG nova.virt.libvirt.host [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.719 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.720 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.720 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.720 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.721 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.721 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.721 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.721 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.721 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.721 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.722 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.722 187212 DEBUG nova.virt.hardware [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.725 187212 DEBUG nova.virt.libvirt.vif [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1949687168',display_name='tempest-ServersTestJSON-server-1949687168',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1949687168',id=87,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-bluhmwh4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:12Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=f3769524-43d7-4c3b-be59-18bf7af73e18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.726 187212 DEBUG nova.network.os_vif_util [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.726 187212 DEBUG nova.network.os_vif_util [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:25:c4,bridge_name='br-int',has_traffic_filtering=True,id=ecb20c89-e04b-4dcb-9b67-08705004bcf8,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecb20c89-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.728 187212 DEBUG nova.objects.instance [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'pci_devices' on Instance uuid f3769524-43d7-4c3b-be59-18bf7af73e18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.755 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <uuid>f3769524-43d7-4c3b-be59-18bf7af73e18</uuid>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <name>instance-00000057</name>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestJSON-server-1949687168</nova:name>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:12:17</nova:creationTime>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:12:17 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:12:17 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:12:17 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:12:17 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:12:17 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:12:17 compute-0 nova_compute[187208]:         <nova:user uuid="62153b585ecc4e6fa2ad567851d49081">tempest-ServersTestJSON-1492365581-project-member</nova:user>
Dec 05 12:12:17 compute-0 nova_compute[187208]:         <nova:project uuid="0c982a61e3fc4c8da9248076bb0361ac">tempest-ServersTestJSON-1492365581</nova:project>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:12:17 compute-0 nova_compute[187208]:         <nova:port uuid="ecb20c89-e04b-4dcb-9b67-08705004bcf8">
Dec 05 12:12:17 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <system>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <entry name="serial">f3769524-43d7-4c3b-be59-18bf7af73e18</entry>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <entry name="uuid">f3769524-43d7-4c3b-be59-18bf7af73e18</entry>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     </system>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <os>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   </os>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <features>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   </features>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk.config"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:a8:25:c4"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <target dev="tapecb20c89-e0"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/console.log" append="off"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <video>
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     </video>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:12:17 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:12:17 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:12:17 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:12:17 compute-0 nova_compute[187208]: </domain>
Dec 05 12:12:17 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.755 187212 DEBUG nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Preparing to wait for external event network-vif-plugged-ecb20c89-e04b-4dcb-9b67-08705004bcf8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.756 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.756 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.756 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.757 187212 DEBUG nova.virt.libvirt.vif [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1949687168',display_name='tempest-ServersTestJSON-server-1949687168',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1949687168',id=87,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-bluhmwh4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:12Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=f3769524-43d7-4c3b-be59-18bf7af73e18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.757 187212 DEBUG nova.network.os_vif_util [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.758 187212 DEBUG nova.network.os_vif_util [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:25:c4,bridge_name='br-int',has_traffic_filtering=True,id=ecb20c89-e04b-4dcb-9b67-08705004bcf8,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecb20c89-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.758 187212 DEBUG os_vif [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:25:c4,bridge_name='br-int',has_traffic_filtering=True,id=ecb20c89-e04b-4dcb-9b67-08705004bcf8,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecb20c89-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.759 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.759 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.764 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapecb20c89-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapecb20c89-e0, col_values=(('external_ids', {'iface-id': 'ecb20c89-e04b-4dcb-9b67-08705004bcf8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:25:c4', 'vm-uuid': 'f3769524-43d7-4c3b-be59-18bf7af73e18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:17 compute-0 NetworkManager[55691]: <info>  [1764936737.7687] manager: (tapecb20c89-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/333)
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.773 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.774 187212 INFO os_vif [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:25:c4,bridge_name='br-int',has_traffic_filtering=True,id=ecb20c89-e04b-4dcb-9b67-08705004bcf8,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecb20c89-e0')
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.917 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.918 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.918 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No VIF found with MAC fa:16:3e:a8:25:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.918 187212 INFO nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Using config drive
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.988 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936722.9870348, c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:17 compute-0 nova_compute[187208]: 2025-12-05 12:12:17.988 187212 INFO nova.compute.manager [-] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] VM Stopped (Lifecycle Event)
Dec 05 12:12:18 compute-0 nova_compute[187208]: 2025-12-05 12:12:18.011 187212 DEBUG nova.compute.manager [None req-4fa914f5-0f59-416b-a3aa-0d0628cd6971 - - - - - -] [instance: c1d8824a-0fa0-48aa-8939-fa96fd7fa1a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:18 compute-0 nova_compute[187208]: 2025-12-05 12:12:18.281 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:18 compute-0 nova_compute[187208]: 2025-12-05 12:12:18.377 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936723.376913, ef8fcf55-e147-4baf-b506-1d99af05d330 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:18 compute-0 nova_compute[187208]: 2025-12-05 12:12:18.378 187212 INFO nova.compute.manager [-] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] VM Stopped (Lifecycle Event)
Dec 05 12:12:18 compute-0 nova_compute[187208]: 2025-12-05 12:12:18.396 187212 DEBUG nova.compute.manager [None req-dcd2067d-f06b-4808-9b38-6eba7afb2cbd - - - - - -] [instance: ef8fcf55-e147-4baf-b506-1d99af05d330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.522 187212 DEBUG nova.compute.manager [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.522 187212 DEBUG oslo_concurrency.lockutils [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.523 187212 DEBUG oslo_concurrency.lockutils [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.523 187212 DEBUG oslo_concurrency.lockutils [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.524 187212 DEBUG nova.compute.manager [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] No waiting events found dispatching network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.524 187212 WARNING nova.compute.manager [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received unexpected event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 for instance with vm_state rescued and task_state None.
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.524 187212 DEBUG nova.compute.manager [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.525 187212 DEBUG oslo_concurrency.lockutils [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.525 187212 DEBUG oslo_concurrency.lockutils [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.526 187212 DEBUG oslo_concurrency.lockutils [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.526 187212 DEBUG nova.compute.manager [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] No waiting events found dispatching network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.527 187212 WARNING nova.compute.manager [req-02d7a692-4692-43f0-91aa-d4b983e91a97 req-8455c616-b8f3-4b10-b6e3-a3bfd8cd58d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received unexpected event network-vif-plugged-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 for instance with vm_state rescued and task_state None.
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.703 187212 INFO nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Creating config drive at /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk.config
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.709 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeccgzc30 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.837 187212 DEBUG oslo_concurrency.processutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeccgzc30" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:19 compute-0 kernel: tapecb20c89-e0: entered promiscuous mode
Dec 05 12:12:19 compute-0 NetworkManager[55691]: <info>  [1764936739.9022] manager: (tapecb20c89-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/334)
Dec 05 12:12:19 compute-0 ovn_controller[95610]: 2025-12-05T12:12:19Z|00868|binding|INFO|Claiming lport ecb20c89-e04b-4dcb-9b67-08705004bcf8 for this chassis.
Dec 05 12:12:19 compute-0 ovn_controller[95610]: 2025-12-05T12:12:19Z|00869|binding|INFO|ecb20c89-e04b-4dcb-9b67-08705004bcf8: Claiming fa:16:3e:a8:25:c4 10.100.0.7
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.905 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:19 compute-0 ovn_controller[95610]: 2025-12-05T12:12:19Z|00870|binding|INFO|Setting lport ecb20c89-e04b-4dcb-9b67-08705004bcf8 ovn-installed in OVS
Dec 05 12:12:19 compute-0 ovn_controller[95610]: 2025-12-05T12:12:19Z|00871|binding|INFO|Setting lport ecb20c89-e04b-4dcb-9b67-08705004bcf8 up in Southbound
Dec 05 12:12:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:19.915 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:25:c4 10.100.0.7'], port_security=['fa:16:3e:a8:25:c4 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ecb20c89-e04b-4dcb-9b67-08705004bcf8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:19.917 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ecb20c89-e04b-4dcb-9b67-08705004bcf8 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 bound to our chassis
Dec 05 12:12:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:19.919 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:19 compute-0 nova_compute[187208]: 2025-12-05 12:12:19.924 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:19.935 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3838de44-9f2e-434f-ab2d-f2a1aa1ec8be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:19 compute-0 systemd-udevd[235072]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:12:19 compute-0 systemd-machined[153543]: New machine qemu-99-instance-00000057.
Dec 05 12:12:19 compute-0 NetworkManager[55691]: <info>  [1764936739.9566] device (tapecb20c89-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:12:19 compute-0 NetworkManager[55691]: <info>  [1764936739.9577] device (tapecb20c89-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:12:19 compute-0 systemd[1]: Started Virtual Machine qemu-99-instance-00000057.
Dec 05 12:12:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:19.969 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e995b40b-cf6f-47bd-96ac-cabd497d4cf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:19.973 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[57751dbd-52c7-4099-83d0-c4efb78905ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:20.002 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[059723e6-84fb-42d7-ba16-3614ea32c5d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:20.018 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[647ada65-816e-4562-b2e3-47507f49f068]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 13, 'rx_bytes': 616, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 13, 'rx_bytes': 616, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235083, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:20.032 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[63b0e8d1-de4a-45dc-a0f3-9e0860e88fd6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235086, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235086, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:20.034 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.035 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.036 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:20.036 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:20.037 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:20.037 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:20 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:20.037 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.397 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936740.396692, f3769524-43d7-4c3b-be59-18bf7af73e18 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.398 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] VM Started (Lifecycle Event)
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.424 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.432 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936740.3969622, f3769524-43d7-4c3b-be59-18bf7af73e18 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.432 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] VM Paused (Lifecycle Event)
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.452 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.465 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.501 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.716 187212 DEBUG nova.network.neutron [req-07036655-baa4-45f1-b1d5-fc37fd667ec5 req-09d91896-19c0-4705-9437-2bb143e4cbed 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Updated VIF entry in instance network info cache for port ecb20c89-e04b-4dcb-9b67-08705004bcf8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.716 187212 DEBUG nova.network.neutron [req-07036655-baa4-45f1-b1d5-fc37fd667ec5 req-09d91896-19c0-4705-9437-2bb143e4cbed 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Updating instance_info_cache with network_info: [{"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:20 compute-0 nova_compute[187208]: 2025-12-05 12:12:20.738 187212 DEBUG oslo_concurrency.lockutils [req-07036655-baa4-45f1-b1d5-fc37fd667ec5 req-09d91896-19c0-4705-9437-2bb143e4cbed 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f3769524-43d7-4c3b-be59-18bf7af73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.123 187212 DEBUG nova.compute.manager [req-ba34d4bd-f7f8-481c-80c7-1ecde60627c3 req-b942c859-d099-461a-9df1-3c6cd9837531 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Received event network-vif-plugged-ecb20c89-e04b-4dcb-9b67-08705004bcf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.124 187212 DEBUG oslo_concurrency.lockutils [req-ba34d4bd-f7f8-481c-80c7-1ecde60627c3 req-b942c859-d099-461a-9df1-3c6cd9837531 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.124 187212 DEBUG oslo_concurrency.lockutils [req-ba34d4bd-f7f8-481c-80c7-1ecde60627c3 req-b942c859-d099-461a-9df1-3c6cd9837531 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.124 187212 DEBUG oslo_concurrency.lockutils [req-ba34d4bd-f7f8-481c-80c7-1ecde60627c3 req-b942c859-d099-461a-9df1-3c6cd9837531 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.124 187212 DEBUG nova.compute.manager [req-ba34d4bd-f7f8-481c-80c7-1ecde60627c3 req-b942c859-d099-461a-9df1-3c6cd9837531 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Processing event network-vif-plugged-ecb20c89-e04b-4dcb-9b67-08705004bcf8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.125 187212 DEBUG nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.127 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936742.1274996, f3769524-43d7-4c3b-be59-18bf7af73e18 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.127 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] VM Resumed (Lifecycle Event)
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.143 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.152 187212 INFO nova.virt.libvirt.driver [-] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Instance spawned successfully.
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.152 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.157 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.162 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.174 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.174 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.175 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.175 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.175 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.176 187212 DEBUG nova.virt.libvirt.driver [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.179 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:12:22 compute-0 podman[235095]: 2025-12-05 12:12:22.207643906 +0000 UTC m=+0.058805629 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.238 187212 INFO nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Took 9.77 seconds to spawn the instance on the hypervisor.
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.238 187212 DEBUG nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.310 187212 INFO nova.compute.manager [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Took 10.71 seconds to build instance.
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.326 187212 DEBUG oslo_concurrency.lockutils [None req-c6268c10-bb9f-4fa7-87f4-de89b4cbf07e 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.736 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.737 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.759 187212 DEBUG nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.887 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.888 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.896 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:12:22 compute-0 nova_compute[187208]: 2025-12-05 12:12:22.896 187212 INFO nova.compute.claims [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.098 187212 DEBUG nova.compute.provider_tree [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.113 187212 DEBUG nova.scheduler.client.report [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.143 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.144 187212 DEBUG nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.193 187212 DEBUG nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.194 187212 DEBUG nova.network.neutron [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.217 187212 INFO nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.238 187212 DEBUG nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.282 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.343 187212 DEBUG nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.344 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.344 187212 INFO nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Creating image(s)
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.345 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.345 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.346 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.359 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.427 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.428 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.429 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.444 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.507 187212 DEBUG nova.policy [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.510 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.510 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.544 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.545 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.546 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.610 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.611 187212 DEBUG nova.virt.disk.api [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Checking if we can resize image /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.611 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.756 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk --force-share --output=json" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.757 187212 DEBUG nova.virt.disk.api [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Cannot resize image /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.758 187212 DEBUG nova.objects.instance [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'migration_context' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.777 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.778 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Ensure instance console log exists: /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.778 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.779 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:23 compute-0 nova_compute[187208]: 2025-12-05 12:12:23.779 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:24 compute-0 nova_compute[187208]: 2025-12-05 12:12:24.684 187212 DEBUG nova.network.neutron [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Successfully created port: c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:12:25 compute-0 nova_compute[187208]: 2025-12-05 12:12:25.996 187212 DEBUG nova.network.neutron [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Successfully updated port: c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.010 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.010 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquired lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.011 187212 DEBUG nova.network.neutron [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.264 187212 DEBUG nova.network.neutron [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.643 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "7bf2559a-9191-4322-9e46-19761de59dc9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.643 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.667 187212 DEBUG nova.compute.manager [req-f3d7b591-2e9a-4b0d-9551-b0eb95ca1239 req-5c3d9336-5524-4abb-93ae-9a88e5dabef4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Received event network-vif-plugged-ecb20c89-e04b-4dcb-9b67-08705004bcf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.667 187212 DEBUG oslo_concurrency.lockutils [req-f3d7b591-2e9a-4b0d-9551-b0eb95ca1239 req-5c3d9336-5524-4abb-93ae-9a88e5dabef4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.667 187212 DEBUG oslo_concurrency.lockutils [req-f3d7b591-2e9a-4b0d-9551-b0eb95ca1239 req-5c3d9336-5524-4abb-93ae-9a88e5dabef4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.668 187212 DEBUG oslo_concurrency.lockutils [req-f3d7b591-2e9a-4b0d-9551-b0eb95ca1239 req-5c3d9336-5524-4abb-93ae-9a88e5dabef4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.668 187212 DEBUG nova.compute.manager [req-f3d7b591-2e9a-4b0d-9551-b0eb95ca1239 req-5c3d9336-5524-4abb-93ae-9a88e5dabef4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] No waiting events found dispatching network-vif-plugged-ecb20c89-e04b-4dcb-9b67-08705004bcf8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.668 187212 WARNING nova.compute.manager [req-f3d7b591-2e9a-4b0d-9551-b0eb95ca1239 req-5c3d9336-5524-4abb-93ae-9a88e5dabef4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Received unexpected event network-vif-plugged-ecb20c89-e04b-4dcb-9b67-08705004bcf8 for instance with vm_state active and task_state None.
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.669 187212 DEBUG nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.764 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.764 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.771 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:12:26 compute-0 nova_compute[187208]: 2025-12-05 12:12:26.772 187212 INFO nova.compute.claims [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.011 187212 DEBUG nova.compute.provider_tree [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.030 187212 DEBUG nova.scheduler.client.report [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.056 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.057 187212 DEBUG nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.117 187212 DEBUG nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.117 187212 DEBUG nova.network.neutron [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.129 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.129 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.158 187212 INFO nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.174 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.181 187212 DEBUG nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.195 187212 DEBUG nova.compute.manager [req-30612702-9f5c-4cdf-ba45-05a20e889f3c req-8517269e-a6dd-471c-a7f3-a6ecac39644e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-changed-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.195 187212 DEBUG nova.compute.manager [req-30612702-9f5c-4cdf-ba45-05a20e889f3c req-8517269e-a6dd-471c-a7f3-a6ecac39644e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Refreshing instance network info cache due to event network-changed-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.196 187212 DEBUG oslo_concurrency.lockutils [req-30612702-9f5c-4cdf-ba45-05a20e889f3c req-8517269e-a6dd-471c-a7f3-a6ecac39644e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.287 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.288 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.294 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.294 187212 INFO nova.compute.claims [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.309 187212 DEBUG nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.311 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.311 187212 INFO nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Creating image(s)
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.312 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "/var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.312 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.313 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.327 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.389 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.390 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.391 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.404 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.471 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.472 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.587 187212 DEBUG nova.compute.provider_tree [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.611 187212 DEBUG nova.scheduler.client.report [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.636 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.636 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.642 187212 DEBUG nova.policy [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.680 187212 DEBUG nova.network.neutron [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Updating instance_info_cache with network_info: [{"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.710 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.710 187212 DEBUG nova.network.neutron [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.714 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Releasing lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.714 187212 DEBUG nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance network_info: |[{"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.715 187212 DEBUG oslo_concurrency.lockutils [req-30612702-9f5c-4cdf-ba45-05a20e889f3c req-8517269e-a6dd-471c-a7f3-a6ecac39644e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.715 187212 DEBUG nova.network.neutron [req-30612702-9f5c-4cdf-ba45-05a20e889f3c req-8517269e-a6dd-471c-a7f3-a6ecac39644e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Refreshing network info cache for port c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.717 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Start _get_guest_xml network_info=[{"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.722 187212 WARNING nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.724 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk 1073741824" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.724 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.725 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.746 187212 INFO nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.759 187212 DEBUG nova.virt.libvirt.host [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.760 187212 DEBUG nova.virt.libvirt.host [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.768 187212 DEBUG nova.virt.libvirt.host [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.769 187212 DEBUG nova.virt.libvirt.host [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.769 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.769 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.770 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.770 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.770 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.771 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.771 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.771 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.771 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.772 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.772 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.772 187212 DEBUG nova.virt.hardware [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.777 187212 DEBUG nova.virt.libvirt.vif [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-604643112',display_name='tempest-ServerRescueTestJSON-server-604643112',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-604643112',id=88,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f73626a62534c97a06b6ec98d749111',ramdisk_id='',reservation_id='r-ri0fnqjs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-122605385',owner_user_name='tempest-ServerRescueTestJSON-122605385
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:23Z,user_data=None,user_id='d12bb49c0ca84e8dad933b49753c7b24',uuid=b661b497-acb9-4b26-8e26-7d0802bca8bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.777 187212 DEBUG nova.network.os_vif_util [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converting VIF {"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.778 187212 DEBUG nova.network.os_vif_util [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:e5:61,bridge_name='br-int',has_traffic_filtering=True,id=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7d6a93d-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.779 187212 DEBUG nova.objects.instance [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'pci_devices' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.781 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.784 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.790 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.791 187212 DEBUG nova.virt.disk.api [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Checking if we can resize image /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.791 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.813 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <uuid>b661b497-acb9-4b26-8e26-7d0802bca8bf</uuid>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <name>instance-00000058</name>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerRescueTestJSON-server-604643112</nova:name>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:12:27</nova:creationTime>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:12:27 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:12:27 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:12:27 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:12:27 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:12:27 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:12:27 compute-0 nova_compute[187208]:         <nova:user uuid="d12bb49c0ca84e8dad933b49753c7b24">tempest-ServerRescueTestJSON-122605385-project-member</nova:user>
Dec 05 12:12:27 compute-0 nova_compute[187208]:         <nova:project uuid="8f73626a62534c97a06b6ec98d749111">tempest-ServerRescueTestJSON-122605385</nova:project>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:12:27 compute-0 nova_compute[187208]:         <nova:port uuid="c7d6a93d-8775-4c4e-9a60-bc8e87e5b310">
Dec 05 12:12:27 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <system>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <entry name="serial">b661b497-acb9-4b26-8e26-7d0802bca8bf</entry>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <entry name="uuid">b661b497-acb9-4b26-8e26-7d0802bca8bf</entry>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     </system>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <os>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   </os>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <features>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   </features>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.config"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:17:e5:61"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <target dev="tapc7d6a93d-87"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/console.log" append="off"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <video>
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     </video>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:12:27 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:12:27 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:12:27 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:12:27 compute-0 nova_compute[187208]: </domain>
Dec 05 12:12:27 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.814 187212 DEBUG nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Preparing to wait for external event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.814 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.814 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.814 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.815 187212 DEBUG nova.virt.libvirt.vif [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-604643112',display_name='tempest-ServerRescueTestJSON-server-604643112',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-604643112',id=88,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f73626a62534c97a06b6ec98d749111',ramdisk_id='',reservation_id='r-ri0fnqjs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-122605385',owner_user_name='tempest-ServerRescueTestJSON
-122605385-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:23Z,user_data=None,user_id='d12bb49c0ca84e8dad933b49753c7b24',uuid=b661b497-acb9-4b26-8e26-7d0802bca8bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.816 187212 DEBUG nova.network.os_vif_util [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converting VIF {"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.816 187212 DEBUG nova.network.os_vif_util [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:e5:61,bridge_name='br-int',has_traffic_filtering=True,id=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7d6a93d-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.817 187212 DEBUG os_vif [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:e5:61,bridge_name='br-int',has_traffic_filtering=True,id=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7d6a93d-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.817 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.818 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.818 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.821 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.821 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc7d6a93d-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.822 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc7d6a93d-87, col_values=(('external_ids', {'iface-id': 'c7d6a93d-8775-4c4e-9a60-bc8e87e5b310', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:e5:61', 'vm-uuid': 'b661b497-acb9-4b26-8e26-7d0802bca8bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:27 compute-0 NetworkManager[55691]: <info>  [1764936747.8247] manager: (tapc7d6a93d-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.831 187212 INFO os_vif [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:e5:61,bridge_name='br-int',has_traffic_filtering=True,id=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7d6a93d-87')
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.859 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.860 187212 DEBUG nova.virt.disk.api [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Cannot resize image /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.860 187212 DEBUG nova.objects.instance [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'migration_context' on Instance uuid 7bf2559a-9191-4322-9e46-19761de59dc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.884 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.884 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Ensure instance console log exists: /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.885 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.885 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.885 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.910 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.912 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.912 187212 INFO nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Creating image(s)
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.912 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "/var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.913 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "/var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.914 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "/var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.926 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.987 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.988 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.989 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No VIF found with MAC fa:16:3e:17:e5:61, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:12:27 compute-0 nova_compute[187208]: 2025-12-05 12:12:27.989 187212 INFO nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Using config drive
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.019 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.019 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.020 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.039 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.100 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.101 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.287 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.448 187212 DEBUG nova.policy [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '430719002c284cd28237859ea6061eef', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.644 187212 INFO nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Creating config drive at /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.config
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.650 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfvuzfvsd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.778 187212 DEBUG oslo_concurrency.processutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfvuzfvsd" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.785 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk 1073741824" returned: 0 in 0.684s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.786 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.787 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.862 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.863 187212 DEBUG nova.virt.disk.api [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Checking if we can resize image /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.864 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:28 compute-0 NetworkManager[55691]: <info>  [1764936748.8679] manager: (tapc7d6a93d-87): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Dec 05 12:12:28 compute-0 kernel: tapc7d6a93d-87: entered promiscuous mode
Dec 05 12:12:28 compute-0 ovn_controller[95610]: 2025-12-05T12:12:28Z|00872|binding|INFO|Claiming lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for this chassis.
Dec 05 12:12:28 compute-0 ovn_controller[95610]: 2025-12-05T12:12:28Z|00873|binding|INFO|c7d6a93d-8775-4c4e-9a60-bc8e87e5b310: Claiming fa:16:3e:17:e5:61 10.100.0.8
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.906 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:28 compute-0 ovn_controller[95610]: 2025-12-05T12:12:28Z|00874|binding|INFO|Setting lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 ovn-installed in OVS
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.920 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:28 compute-0 ovn_controller[95610]: 2025-12-05T12:12:28Z|00875|binding|INFO|Setting lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 up in Southbound
Dec 05 12:12:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:28.923 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:e5:61 10.100.0.8'], port_security=['fa:16:3e:17:e5:61 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '2', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.925 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:28.927 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 bound to our chassis
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.934 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:28 compute-0 systemd-udevd[235216]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.935 187212 DEBUG nova.virt.disk.api [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Cannot resize image /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.935 187212 DEBUG nova.objects.instance [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lazy-loading 'migration_context' on Instance uuid 846c0e55-1620-4c7a-9792-d4f5f0d728d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:28.929 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:12:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:28.943 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35503741-0a7f-43a6-966d-e62afca18cba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:28 compute-0 NetworkManager[55691]: <info>  [1764936748.9500] device (tapc7d6a93d-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:12:28 compute-0 NetworkManager[55691]: <info>  [1764936748.9507] device (tapc7d6a93d-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.950 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.951 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Ensure instance console log exists: /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.951 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.952 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:28 compute-0 nova_compute[187208]: 2025-12-05 12:12:28.952 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:28 compute-0 systemd-machined[153543]: New machine qemu-100-instance-00000058.
Dec 05 12:12:28 compute-0 podman[235187]: 2025-12-05 12:12:28.969213654 +0000 UTC m=+0.109630517 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:12:28 compute-0 systemd[1]: Started Virtual Machine qemu-100-instance-00000058.
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.336 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936749.3358464, b661b497-acb9-4b26-8e26-7d0802bca8bf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.338 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] VM Started (Lifecycle Event)
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.363 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.368 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936749.335928, b661b497-acb9-4b26-8e26-7d0802bca8bf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.369 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] VM Paused (Lifecycle Event)
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.388 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.392 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.413 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.753 187212 DEBUG nova.network.neutron [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Successfully created port: 8f30bb1b-124e-4840-966f-7fbd73ceba98 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.956 187212 DEBUG nova.compute.manager [req-d34d3f3c-f3fe-45b2-aee4-fdac643e3a08 req-86647f58-49e0-4d06-ad53-7b0ae3e81cf6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.956 187212 DEBUG oslo_concurrency.lockutils [req-d34d3f3c-f3fe-45b2-aee4-fdac643e3a08 req-86647f58-49e0-4d06-ad53-7b0ae3e81cf6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.956 187212 DEBUG oslo_concurrency.lockutils [req-d34d3f3c-f3fe-45b2-aee4-fdac643e3a08 req-86647f58-49e0-4d06-ad53-7b0ae3e81cf6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.957 187212 DEBUG oslo_concurrency.lockutils [req-d34d3f3c-f3fe-45b2-aee4-fdac643e3a08 req-86647f58-49e0-4d06-ad53-7b0ae3e81cf6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.957 187212 DEBUG nova.compute.manager [req-d34d3f3c-f3fe-45b2-aee4-fdac643e3a08 req-86647f58-49e0-4d06-ad53-7b0ae3e81cf6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Processing event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.958 187212 DEBUG nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.974 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936749.9607217, b661b497-acb9-4b26-8e26-7d0802bca8bf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.974 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] VM Resumed (Lifecycle Event)
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.979 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.983 187212 INFO nova.virt.libvirt.driver [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance spawned successfully.
Dec 05 12:12:29 compute-0 nova_compute[187208]: 2025-12-05 12:12:29.984 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.001 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.007 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.013 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.014 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.014 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.014 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.015 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.016 187212 DEBUG nova.virt.libvirt.driver [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.029 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.112 187212 INFO nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Took 6.77 seconds to spawn the instance on the hypervisor.
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.113 187212 DEBUG nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.179 187212 INFO nova.compute.manager [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Took 7.36 seconds to build instance.
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.195 187212 DEBUG oslo_concurrency.lockutils [None req-84d15703-2adf-4733-b0e2-6fcf8d190b64 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.641 187212 DEBUG nova.network.neutron [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Successfully created port: 0e7829e3-325a-430d-898f-510a4c544ffa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.741 187212 DEBUG nova.network.neutron [req-30612702-9f5c-4cdf-ba45-05a20e889f3c req-8517269e-a6dd-471c-a7f3-a6ecac39644e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Updated VIF entry in instance network info cache for port c7d6a93d-8775-4c4e-9a60-bc8e87e5b310. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.742 187212 DEBUG nova.network.neutron [req-30612702-9f5c-4cdf-ba45-05a20e889f3c req-8517269e-a6dd-471c-a7f3-a6ecac39644e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Updating instance_info_cache with network_info: [{"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:30 compute-0 nova_compute[187208]: 2025-12-05 12:12:30.754 187212 DEBUG oslo_concurrency.lockutils [req-30612702-9f5c-4cdf-ba45-05a20e889f3c req-8517269e-a6dd-471c-a7f3-a6ecac39644e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:31 compute-0 nova_compute[187208]: 2025-12-05 12:12:31.642 187212 DEBUG nova.network.neutron [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Successfully created port: 961ee213-955e-471f-8cb4-ad0d3a82285c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.079 187212 DEBUG nova.compute.manager [req-962ac69c-1066-4764-9a1a-176bfd404700 req-993b46ac-6c7f-45fd-8925-8d41054a88e3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.079 187212 DEBUG oslo_concurrency.lockutils [req-962ac69c-1066-4764-9a1a-176bfd404700 req-993b46ac-6c7f-45fd-8925-8d41054a88e3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.080 187212 DEBUG oslo_concurrency.lockutils [req-962ac69c-1066-4764-9a1a-176bfd404700 req-993b46ac-6c7f-45fd-8925-8d41054a88e3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.080 187212 DEBUG oslo_concurrency.lockutils [req-962ac69c-1066-4764-9a1a-176bfd404700 req-993b46ac-6c7f-45fd-8925-8d41054a88e3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.080 187212 DEBUG nova.compute.manager [req-962ac69c-1066-4764-9a1a-176bfd404700 req-993b46ac-6c7f-45fd-8925-8d41054a88e3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] No waiting events found dispatching network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.080 187212 WARNING nova.compute.manager [req-962ac69c-1066-4764-9a1a-176bfd404700 req-993b46ac-6c7f-45fd-8925-8d41054a88e3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received unexpected event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for instance with vm_state active and task_state None.
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.552 187212 INFO nova.compute.manager [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Rescuing
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.553 187212 DEBUG oslo_concurrency.lockutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.553 187212 DEBUG oslo_concurrency.lockutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquired lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.553 187212 DEBUG nova.network.neutron [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.605 187212 DEBUG nova.network.neutron [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Successfully updated port: 8f30bb1b-124e-4840-966f-7fbd73ceba98 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.623 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "refresh_cache-7bf2559a-9191-4322-9e46-19761de59dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.623 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquired lock "refresh_cache-7bf2559a-9191-4322-9e46-19761de59dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.624 187212 DEBUG nova.network.neutron [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:32 compute-0 nova_compute[187208]: 2025-12-05 12:12:32.896 187212 DEBUG nova.network.neutron [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:12:33 compute-0 nova_compute[187208]: 2025-12-05 12:12:33.288 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:33 compute-0 nova_compute[187208]: 2025-12-05 12:12:33.314 187212 DEBUG nova.network.neutron [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Successfully updated port: 0e7829e3-325a-430d-898f-510a4c544ffa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:12:33 compute-0 nova_compute[187208]: 2025-12-05 12:12:33.513 187212 DEBUG nova.compute.manager [req-19d91f1e-614c-44d1-acf2-95537b255d2f req-a00ddc32-7899-4975-a1c4-6fcff74f4bc6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-changed-0e7829e3-325a-430d-898f-510a4c544ffa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:33 compute-0 nova_compute[187208]: 2025-12-05 12:12:33.514 187212 DEBUG nova.compute.manager [req-19d91f1e-614c-44d1-acf2-95537b255d2f req-a00ddc32-7899-4975-a1c4-6fcff74f4bc6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Refreshing instance network info cache due to event network-changed-0e7829e3-325a-430d-898f-510a4c544ffa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:12:33 compute-0 nova_compute[187208]: 2025-12-05 12:12:33.514 187212 DEBUG oslo_concurrency.lockutils [req-19d91f1e-614c-44d1-acf2-95537b255d2f req-a00ddc32-7899-4975-a1c4-6fcff74f4bc6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-846c0e55-1620-4c7a-9792-d4f5f0d728d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:33 compute-0 nova_compute[187208]: 2025-12-05 12:12:33.514 187212 DEBUG oslo_concurrency.lockutils [req-19d91f1e-614c-44d1-acf2-95537b255d2f req-a00ddc32-7899-4975-a1c4-6fcff74f4bc6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-846c0e55-1620-4c7a-9792-d4f5f0d728d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:33 compute-0 nova_compute[187208]: 2025-12-05 12:12:33.514 187212 DEBUG nova.network.neutron [req-19d91f1e-614c-44d1-acf2-95537b255d2f req-a00ddc32-7899-4975-a1c4-6fcff74f4bc6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Refreshing network info cache for port 0e7829e3-325a-430d-898f-510a4c544ffa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:12:34 compute-0 ovn_controller[95610]: 2025-12-05T12:12:34Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a8:25:c4 10.100.0.7
Dec 05 12:12:34 compute-0 ovn_controller[95610]: 2025-12-05T12:12:34Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:25:c4 10.100.0.7
Dec 05 12:12:34 compute-0 nova_compute[187208]: 2025-12-05 12:12:34.616 187212 DEBUG nova.network.neutron [req-19d91f1e-614c-44d1-acf2-95537b255d2f req-a00ddc32-7899-4975-a1c4-6fcff74f4bc6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.134 187212 DEBUG nova.network.neutron [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Updating instance_info_cache with network_info: [{"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.154 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Releasing lock "refresh_cache-7bf2559a-9191-4322-9e46-19761de59dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.155 187212 DEBUG nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Instance network_info: |[{"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.158 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Start _get_guest_xml network_info=[{"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.168 187212 DEBUG nova.compute.manager [req-69872d56-0121-4550-a3a3-d2b6620436c6 req-7b8cf333-6911-4aaa-93e9-60eea87d9a4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Received event network-changed-8f30bb1b-124e-4840-966f-7fbd73ceba98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.169 187212 DEBUG nova.compute.manager [req-69872d56-0121-4550-a3a3-d2b6620436c6 req-7b8cf333-6911-4aaa-93e9-60eea87d9a4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Refreshing instance network info cache due to event network-changed-8f30bb1b-124e-4840-966f-7fbd73ceba98. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.170 187212 DEBUG oslo_concurrency.lockutils [req-69872d56-0121-4550-a3a3-d2b6620436c6 req-7b8cf333-6911-4aaa-93e9-60eea87d9a4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-7bf2559a-9191-4322-9e46-19761de59dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.170 187212 DEBUG oslo_concurrency.lockutils [req-69872d56-0121-4550-a3a3-d2b6620436c6 req-7b8cf333-6911-4aaa-93e9-60eea87d9a4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-7bf2559a-9191-4322-9e46-19761de59dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.170 187212 DEBUG nova.network.neutron [req-69872d56-0121-4550-a3a3-d2b6620436c6 req-7b8cf333-6911-4aaa-93e9-60eea87d9a4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Refreshing network info cache for port 8f30bb1b-124e-4840-966f-7fbd73ceba98 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.176 187212 WARNING nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.186 187212 DEBUG nova.virt.libvirt.host [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.188 187212 DEBUG nova.virt.libvirt.host [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.196 187212 DEBUG nova.virt.libvirt.host [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.197 187212 DEBUG nova.virt.libvirt.host [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.197 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.197 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.198 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.198 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.198 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.198 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.199 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.199 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.199 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.199 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.200 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.202 187212 DEBUG nova.virt.hardware [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.207 187212 DEBUG nova.virt.libvirt.vif [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1949687168',display_name='tempest-ServersTestJSON-server-1949687168',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1949687168',id=89,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-mvq04v0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:27Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=7bf2559a-9191-4322-9e46-19761de59dc9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.207 187212 DEBUG nova.network.os_vif_util [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.208 187212 DEBUG nova.network.os_vif_util [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:43:ef,bridge_name='br-int',has_traffic_filtering=True,id=8f30bb1b-124e-4840-966f-7fbd73ceba98,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f30bb1b-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.209 187212 DEBUG nova.objects.instance [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 7bf2559a-9191-4322-9e46-19761de59dc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.231 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <uuid>7bf2559a-9191-4322-9e46-19761de59dc9</uuid>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <name>instance-00000059</name>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestJSON-server-1949687168</nova:name>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:12:35</nova:creationTime>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:12:35 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:12:35 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:12:35 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:12:35 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:12:35 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:12:35 compute-0 nova_compute[187208]:         <nova:user uuid="62153b585ecc4e6fa2ad567851d49081">tempest-ServersTestJSON-1492365581-project-member</nova:user>
Dec 05 12:12:35 compute-0 nova_compute[187208]:         <nova:project uuid="0c982a61e3fc4c8da9248076bb0361ac">tempest-ServersTestJSON-1492365581</nova:project>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:12:35 compute-0 nova_compute[187208]:         <nova:port uuid="8f30bb1b-124e-4840-966f-7fbd73ceba98">
Dec 05 12:12:35 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <system>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <entry name="serial">7bf2559a-9191-4322-9e46-19761de59dc9</entry>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <entry name="uuid">7bf2559a-9191-4322-9e46-19761de59dc9</entry>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     </system>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <os>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   </os>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <features>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   </features>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk.config"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:e4:43:ef"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <target dev="tap8f30bb1b-12"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/console.log" append="off"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <video>
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     </video>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:12:35 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:12:35 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:12:35 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:12:35 compute-0 nova_compute[187208]: </domain>
Dec 05 12:12:35 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.231 187212 DEBUG nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Preparing to wait for external event network-vif-plugged-8f30bb1b-124e-4840-966f-7fbd73ceba98 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.231 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.232 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.232 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.233 187212 DEBUG nova.virt.libvirt.vif [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1949687168',display_name='tempest-ServersTestJSON-server-1949687168',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1949687168',id=89,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-mvq04v0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-m
ember'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:27Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=7bf2559a-9191-4322-9e46-19761de59dc9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.233 187212 DEBUG nova.network.os_vif_util [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.233 187212 DEBUG nova.network.os_vif_util [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:43:ef,bridge_name='br-int',has_traffic_filtering=True,id=8f30bb1b-124e-4840-966f-7fbd73ceba98,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f30bb1b-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.234 187212 DEBUG os_vif [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:43:ef,bridge_name='br-int',has_traffic_filtering=True,id=8f30bb1b-124e-4840-966f-7fbd73ceba98,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f30bb1b-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.234 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.235 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.235 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.240 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f30bb1b-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:35 compute-0 podman[235248]: 2025-12-05 12:12:35.240588196 +0000 UTC m=+0.083001873 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, version=9.6, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.241 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8f30bb1b-12, col_values=(('external_ids', {'iface-id': '8f30bb1b-124e-4840-966f-7fbd73ceba98', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:43:ef', 'vm-uuid': '7bf2559a-9191-4322-9e46-19761de59dc9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:35 compute-0 podman[235249]: 2025-12-05 12:12:35.283282711 +0000 UTC m=+0.121958621 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.285 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:35 compute-0 NetworkManager[55691]: <info>  [1764936755.2862] manager: (tap8f30bb1b-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/337)
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.289 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.300 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.302 187212 INFO os_vif [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:43:ef,bridge_name='br-int',has_traffic_filtering=True,id=8f30bb1b-124e-4840-966f-7fbd73ceba98,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f30bb1b-12')
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.365 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'name': 'tempest-ServerRescueTestJSON-server-604643112', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000058', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8f73626a62534c97a06b6ec98d749111', 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'hostId': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.366 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.367 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.367 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No VIF found with MAC fa:16:3e:e4:43:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.367 187212 INFO nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Using config drive
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.369 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7bf2559a-9191-4322-9e46-19761de59dc9', 'name': 'tempest-ServersTestJSON-server-1949687168', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000059', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '0c982a61e3fc4c8da9248076bb0361ac', 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'hostId': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.371 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'name': 'tempest-₡-1444488967', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0c982a61e3fc4c8da9248076bb0361ac', 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'hostId': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.373 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'name': 'tempest-ServerRescueTestJSON-server-234668707', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000056', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8f73626a62534c97a06b6ec98d749111', 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'hostId': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.374 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'name': 'tempest-ServersNegativeTestJSON-server-826937421', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000050', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c5b34686513f4abc8165113eb8c6831e', 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'hostId': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.376 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'name': 'tempest-ServersTestJSON-server-1949687168', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000057', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0c982a61e3fc4c8da9248076bb0361ac', 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'hostId': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.377 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.379 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b661b497-acb9-4b26-8e26-7d0802bca8bf / tapc7d6a93d-87 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.380 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.382 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.385 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 30cb83d4-3a34-4420-bc83-099b266da48c / tap96dab709-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.386 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.395 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f2a101e0-138f-404e-b6e0-e1359272f560 / tap0bab1586-b0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.396 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.400 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 28e48516-8665-4d98-a92d-c84b7da9a284 / tape30774db-d3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.400 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.403 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f3769524-43d7-4c3b-be59-18bf7af73e18 / tapecb20c89-e0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.404 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b01bdfe-afbe-4570-8a2b-8df9cc8fd60b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.377523', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af0f4c44-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': '342ac660b3d134af8029974e9d870fe5b593a789a399e8beaef6259a55105f36'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.377523', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af102cc2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': 'd5543c745fc4f3ce1020c71872abbca90b06ae60dc917734bf55180d6236963a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.377523', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af11e42c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': '6c4c26250693d5d05b7821dec233e2357fad5974e3ab4526eb96c9f749e7e732'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.377523', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af126ece-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': '37ead9dcee2e29b912a8e83e2887354373171430a16fe04a3f3283e7a7420c02'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.377523', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af12eebc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': 'f8a664f153bc56bece3b4f6e9f94d36ba2ab404e6bdd01a50cd2d398ce797173'}]}, 'timestamp': '2025-12-05 12:12:35.404549', '_unique_id': 'ab8cddc19cd04f5e928a35cc708052af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.407 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.409 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.410 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.411 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.411 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.411 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.412 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e9c8623-7817-4fe8-a67c-6f1907d0d76a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.409515', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af13c896-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': '71243e8967b6117f91801fbcb549ce8175f6f1ac45faea07ae97dfeefee99aed'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.409515', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af13fc58-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': '6cf05306183b4b1fb20be8c3ce5e83f72ce7f08d1bcfd6c09abc3c0a2be1e6bd'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.409515', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af140662-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': '6e5b914a986c813ece83744c568f99a5d2ca555613e6cf5980d6ab495902877e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.409515', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af14118e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': '7e131abb1d73ab09a390e3664ad12ef6cbf2f1c8301d18c9760d606f4ac7064d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.409515', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af14217e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': '8549f0d3833dec796f8812a17db52ace468170c358936d397b98b7388de79bf6'}]}, 'timestamp': '2025-12-05 12:12:35.412329', '_unique_id': '5e74572e04454cd99bcdeabc8a764798'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.413 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.446 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.447 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.448 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.480 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.481 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.540 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.read.requests volume: 960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.540 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.read.requests volume: 453 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.541 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.566 187212 DEBUG nova.network.neutron [req-19d91f1e-614c-44d1-acf2-95537b255d2f req-a00ddc32-7899-4975-a1c4-6fcff74f4bc6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.577 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.578 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.594 187212 DEBUG oslo_concurrency.lockutils [req-19d91f1e-614c-44d1-acf2-95537b255d2f req-a00ddc32-7899-4975-a1c4-6fcff74f4bc6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-846c0e55-1620-4c7a-9792-d4f5f0d728d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.635 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.read.requests volume: 1053 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.635 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b389ed98-3ccc-4aa4-9538-c54a77ce280c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-vda', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af197552-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': '2becbb2b3b4ec88d436403f5486ff2a34445f1088ae2752c042b5218e467d272'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 
'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-sda', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af198664-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': '76e2ec1b559a44ad5a50b0e3df026d190c6c1673c6de2c81b71298f31b8c714b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-vda', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af1eb4e0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': 'fcbf2034ff97c2b1296c74b276c70a77ea1d92ea9e02217191b3f0eddca4588b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-sda', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af1ec584-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': 'a81b93d1b10d481edcfc1680c13fd99a9333d36f3c72ea0240c4bdbd38f969e6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 960, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': 
'8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vda', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af27be14-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': 'efeacce0d1d7a7ae18625612e047dafeba9497abfbe69ef257d340fc39925204'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 453, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vdb', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, '
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: disk_name': 'vdb'}, 'message_id': 'af27ce86-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '92f585fc02acbacbfa7d527eacd3032f7f26129ccb91e7ec943620271fca3672'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-sda', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af27d8ea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': 'bef73e8bcca52ffde13bc5e98358c62cb46b408e19c828668b63f1cee52e4407'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': 
'2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af2d7340-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': 'c9acf133f825c4175733b2e9f6cdf00b12e929bf96770064af6a829ed3ea1169'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af2d8132-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': '02a6067681f255360eac736c9b9d53aa116f0b6a34ca3f49fa40b883710fcc5d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1053, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-vda', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af363840-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '1813e3cd5ad14f06cce390284653ccfe2151183711f93cc797e2d351ab404da4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': 
'0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-sda', 'timestamp': '2025-12-05T12:12:35.414210', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af364556-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '575ea8eff0f50fa129f9ada12e6174c6e40bd54573ad8c8178e9699ccd02e1ec'}]}, 'timestamp': '2025-12-05 12:12:35.636093', '_unique_id': 'fd99447d61b641f3a8f1bd37df70f03a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.637 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.639 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.660 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/cpu volume: 5460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.663 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.690 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/cpu volume: 11400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.709 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/cpu volume: 10640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.726 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/cpu volume: 11400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.746 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/cpu volume: 11410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84ed07bd-42f0-438f-af29-332d69890a8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5460000000, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'timestamp': '2025-12-05T12:12:35.639547', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'af3a349a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.28166406, 'message_signature': '1dad4eb23b754f7873f5ed8e0305439f73fea33de88c50c1fcfd1674d4319343'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11400000000, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 
'30cb83d4-3a34-4420-bc83-099b266da48c', 'timestamp': '2025-12-05T12:12:35.639547', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'af3ea962-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.311499796, 'message_signature': '13c9ccd8b0e57d4791e368e5c934d5096ef932fae3538f0a4b14cc432278d460'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10640000000, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'timestamp': '2025-12-05T12:12:35.639547', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'af419bc2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.330665086, 'message_signature': '72a15ccbdbc65c6322eeae0b3e9f5d20980d3ab22336d22123f87b29937d8c47'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11400000000, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'timestamp': '2025-12-05T12:12:35.639547', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'af442306-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.347358286, 'message_signature': '0329fa5920dea83e7c7c9c045e396bfd436656197b0286218668ac8c5eedb001'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11410000000, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 
'f3769524-43d7-4c3b-be59-18bf7af73e18', 'timestamp': '2025-12-05T12:12:35.639547', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'af473064-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.367299558, 'message_signature': '49d4bfbefdfd85fd7c8998ac15414892cfda045cb2b04949031ec3fc895aa4bd'}]}, 'timestamp': '2025-12-05 12:12:35.747062', '_unique_id': '20b7ac657f3a4bc5a12e4e044a21847a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.750 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.751 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.766 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.766 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.768 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.779 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.779 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.800 187212 DEBUG nova.network.neutron [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Updating instance_info_cache with network_info: [{"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.801 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.usage volume: 196616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.802 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.802 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.815 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.816 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 nova_compute[187208]: 2025-12-05 12:12:35.819 187212 DEBUG oslo_concurrency.lockutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Releasing lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.828 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.829 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39d65176-fad9-4f39-bf49-c89f8a6cc01a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-vda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af4a3b88-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.373051023, 'message_signature': '0484083260fb1ecea6325cd6964a20ebee96c78a551ee1df542e3f6e424846a8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 
'b661b497-acb9-4b26-8e26-7d0802bca8bf-sda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af4a4a2e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.373051023, 'message_signature': '52130bb6de5cc6285658f44b03aec37e708a5fe260d3ad8152587d38ea676370'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-vda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af4c30be-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.389348111, 'message_signature': 'bb350d892ad2c5d68fc464727fb39408534667d6355ea70128bc3427da94c60f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-sda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af4c4496-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.389348111, 'message_signature': '5cb5dd5b3798fe34840287ae2be1bf29a0c95320e1852e1dff371c66b4dad653'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196616, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 
'f2a101e0-138f-404e-b6e0-e1359272f560-vda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af4f9e84-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.401290563, 'message_signature': 'd780908182d3ae4c8da51a84bc817811caa8ec70770e710fd08f0899d3641b58'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vdb', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'af4fb3ba-d1d3-11f0-8572-fa163e006c52', 'monotonic_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: ': 4136.401290563, 'message_signature': 'a3d66ebe7c1a3dabc7eaae48f4c8c14ac0e1926beebf29cf002eef70e13d7c2c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-sda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af4fbe00-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.401290563, 'message_signature': 'fc35dd870dd6b10151f66996b31b12d9d7408d0f17082ef56dde90bfebdde302'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 
'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af51b804-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.424223051, 'message_signature': '564148ff16562b3bf053e9232e80322236df83a4a06018ad9c807fb5cd4a6b3e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 
'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af51cab0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.424223051, 'message_signature': '43cf76f751be6710706262a31de00f3302de71d96b2a3aad1c422b85e8a180df'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-vda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af53b8a2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.437626916, 'message_signature': '0903ca685b212d4893ac1d546832a41b219ea34d1aef78e15585a403c47b2b81'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-sda', 'timestamp': '2025-12-05T12:12:35.751897', 'resource_metadata': {'display_name': 
'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af53c374-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.437626916, 'message_signature': '6849006c76d36e054aa17aec2a26c82eaf5d08ac8bb63d01a604e5a8c585f837'}]}, 'timestamp': '2025-12-05 12:12:35.829397', '_unique_id': 'f889d7cf4d824cf5b949a6eff6653bd8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.830 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.831 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.832 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.833 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.833 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.833 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.833 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.capacity volume: 117440512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.834 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.834 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.834 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.834 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.835 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.835 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc045ba7-d8b6-48b5-911e-ad3a177b47a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-vda', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af542e04-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.373051023, 'message_signature': '83129428cceb7b1ff73ae3e71900b0a56e01ee091c4841773d9ec183acfead7d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 
'b661b497-acb9-4b26-8e26-7d0802bca8bf-sda', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af5438fe-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.373051023, 'message_signature': '62879eab20daeea90ba951155750b5f12d9584b976472844831ee7ed89c93bd9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-vda', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af5470f8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.389348111, 'message_signature': 'cd132ba0f01cbe3ba74a526ac61306645ce63f17486b877b5b2ea0ba2bdbc4ac'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-sda', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af5478aa-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.389348111, 'message_signature': 'dfebc5b659d6c75386a6a9a3cc22592ed5ae48446afca4d1147d85ec2fdf4b3f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 117440512, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 
'f2a101e0-138f-404e-b6e0-e1359272f560-vda', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af548368-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.401290563, 'message_signature': 'be3971fcebe1f71d364d77d2321cabe401fe73c337cfee0e97e689ffae08d7d6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vdb', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'af548ad4-d1d3-11f0-8572-f
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: a163e006c52', 'monotonic_time': 4136.401290563, 'message_signature': '92df804298570afc8fc461c6d6b66a28956ba483841d72d8102c86ccc6801e99'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-sda', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af5495a6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.401290563, 'message_signature': 'c2e9fe44101cdde5cc04d7d1a21f4b912344e384cbd3329e2b69d37e1f8e14d2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 
'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af549cf4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.424223051, 'message_signature': 'f66ccb2093b0b607c19229589907b596c8e02319ad3469cec646356beb6f4878'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af54a424-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.424223051, 'message_signature': '99bb1b53a23c0d20a56218a40c2ca83804a2d06c3cc4907bd95bc96f0bbb9d1c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-vda', 'timestamp': '2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af54adfc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.437626916, 'message_signature': 'd69518b84c70f2e953a22bff19d4723ce74d7204a6b34dd58219bffdd0f280cc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-sda', 'timestamp': 
'2025-12-05T12:12:35.831815', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af54b536-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.437626916, 'message_signature': 'c2754dc3c821caaa74d6ff21306e021014514341aa666f333f1ecefc812292e1'}]}, 'timestamp': '2025-12-05 12:12:35.835600', '_unique_id': 'fde4f0f6ed9a4c49b189ffac1531c6a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.836 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.837 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.837 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.837 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.838 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.838 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.write.requests volume: 295 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.838 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.839 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.839 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.write.requests volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.839 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.839 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.requests volume: 297 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.839 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.840 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.write.requests volume: 300 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.840 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aedf006e-54e5-4b19-ac4d-def7b983ea44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-vda', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af55091e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': '4cfc6b41679789f30650f3dbf3945390cdf9a70fa8a9f7303fed1bdc1a56a7dc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 
'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-sda', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af5512d8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': '19d2e344a690b2d50c9a5545ff808c58590aecc28ae370a3fc36e30b07b266c2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 295, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-vda', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af553704-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': '802aa3e371107fa02e9c8384baaceab5b92096392c645d6ceffd650d4c6b728c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-sda', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af553e52-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': '14accf13c909e34d1e8bdeb84d587e7be1a74ae23c5148d48d588a505a423bda'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': 
'8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vda', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af5547b2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': 'a996a69bbbc698733b2b8e19c8eae05302d188c13346d9a6cd4931039093d9fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 28, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vdb', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'di
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: sk_name': 'vdb'}, 'message_id': 'af5550c2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '9de23aeddf4a2b2fc5b0e6042eb85252da029b45d560f354b88f7a31ab7b92e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-sda', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af555ad6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '9b0ff0f89d4da2549d02c94cff475a0ac243b1f3378a71aa7d6c7c7b6d7f76a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 297, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': 
'2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af556260-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': '46b406121d59f4a11927d0dc67c5b5c8f42220d6b7e2549cbdda5b9568b28a0c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af556ae4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': 'a957d50c612572aafe9a1cc240f60b938270323de0f281c1ff3f7fd9f523a46b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 300, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-vda', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af55764c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '25897bfff65dc85951b96def3eba1e96f23ec578e75589e6a8ae89a438aa7b63'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': 
'0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-sda', 'timestamp': '2025-12-05T12:12:35.837409', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af558010-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '6a623a20393e53cea229f63ac695f89df4aff5d9f6905879555fb37b8bc1d92e'}]}, 'timestamp': '2025-12-05 12:12:35.840757', '_unique_id': 'e57606923a284d17a0d4e99205cf43a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.841 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.842 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.842 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.843 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.843 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.incoming.packets volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.844 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.incoming.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.844 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.844 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e0cbaf4-066f-466e-9e37-4c0e81f9959d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.842555', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af55d51a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': 'f383f2ed3c750b7afa59b308f36e27046b584554de8b4de5b7da3c14474415ef'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 19, 'user_id': 
'62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.842555', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af560580-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': 'd62ed3053a3972183c7d5294d34fe8e605c2c97d748dbe59627daf29c8b4a64a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.842555', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af560df0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': '627a95cc3edfb1c2ba4818dbd9f1f9e2ddbf0d0b6179a56a8d1ff1aa140b2669'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.842555', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af561a66-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': 'f1c2556e2ce8097a6094c2ba047dde1a1118718e6fc00efd1a4ad21070d74499'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.842555', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af5625f6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': '78e30c8b876a0cd269b08154f3630a4103c6fbbfae31c022173812a9b62668ac'}]}, 'timestamp': '2025-12-05 12:12:35.844976', '_unique_id': '47310bf7d2e0415f975c5273170396ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.845 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.846 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.846 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.846 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-604643112>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>, <NovaLikeServer: tempest-₡-1444488967>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-234668707>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-826937421>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-604643112>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>, <NovaLikeServer: tempest-₡-1444488967>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-234668707>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-826937421>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>]
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.847 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.847 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.847 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.848 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.848 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.write.latency volume: 4523219050 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.849 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.849 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.849 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.write.latency volume: 77836222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.850 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.850 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.latency volume: 2775678444 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.850 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.850 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.write.latency volume: 6122422353 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.851 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb960794-786c-4994-9bbd-da061d10d40b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-vda', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af568bfe-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': '7aa4ff1a76d8c9d59d7c4df607ab638e850e716cc9b8c6d930955dfcf27ae535'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 
'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-sda', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af569478-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': 'b3cc7f2b17380c46a69b9d6ae599b05e34ddd2d2750b477be72c42664ff901dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4523219050, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-vda', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af56c556-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': '1467597176342e048cc257d868618477e1f76f5ffef3fae4e731aebde21db9e9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-sda', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af56d726-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': 'a06c0e0b6869ea88b51eb655f167fa05daa56e4279c9e60cb5e9f8f90e475f1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': 
'8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vda', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af56e14e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': 'a1c33fecaf51d90a7ec4443e06641e16cdb1b608ee285f07be6b8115aa3a74b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 77836222, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vdb', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'mess
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: age_id': 'af56f116-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': 'b38d723821fa3c37effeb24a7696a31e66b0026e86ca97644e59844cfe0d29f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-sda', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af56f9fe-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '49083ba78c638ac538eb8059bd46cc75cc6e5a68c7658b085321fb1bdf0bc129'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2775678444, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': 
{'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af570156-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': 'b7e037aab543fcc708773689427657ba8537741ad630371f2fc5b5cf3f101ce3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af570b38-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': 'c10c33ad22232ca3f95963eab2297545af52d16c15bb37104a6b6f40af71b818'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6122422353, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-vda', 'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af5712ea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '08de413877cd15d79852f142cf9665a4aaef2733993e6ee8a6fd7827ed422a7b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-sda', 
'timestamp': '2025-12-05T12:12:35.847281', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af571e3e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '4e1f6d2594afe096258a472d2ac8621b15cfd6625d66c82e8787c9aa85ff0662'}]}, 'timestamp': '2025-12-05 12:12:35.851350', '_unique_id': 'cba3258bf634434980c665b737d537de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.852 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.853 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.853 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.853 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.854 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.854 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.854 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.855 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.read.bytes volume: 28287488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.855 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.read.bytes volume: 8282112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.855 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.855 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.856 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.856 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.read.bytes volume: 29231616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.856 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37260fb9-0eb7-40ff-a5fb-706175cfbcd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-vda', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af577456-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': 'bb11c8e1321d105384228d886f2b2d0a5be131378815381c119d369358b0be9f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 
'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-sda', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af577d3e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': '3b9187c7c34ec4e0c04cb34324b61dc1a6a6de9c709792974b31963331cc976c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31005184, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-vda', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af57a944-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': '1bae0f1cfbd46f3fb637a0c24a79d91769e7d9a6da12293d5c96c8f39ce863ea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-sda', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af57b420-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': '3a59b448fa18bbdade58a45963eac3682f6a727ab2c6a6eb990373a969b14bb5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28287488, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': 
'8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vda', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af57c028-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': 'e3346adbf29d70588fecce37b2bf74a044241884598afb48fbc2a267055cfbf1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8282112, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vdb', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'af57c780-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '2a5fdeefcbbbfdbbd96d5d6c1f36a07beea9ca1cae50d161abb91995ceadb6a5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-sda', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af57cea6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': 'b36cfcd5db31cf8ce7525c1a21eebbc48523646c2f58102f6fad0398cfa9e3c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31005184, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': 
{'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af57dcd4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': 'a723ae738e16df51c6ad8f1bb16305713f852a29b229aefc63d9841949a2937a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af57eaf8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': 'e8431bd46fd56a90271f31cb97c8bd034fd0a1e800f8ab17538ebcd9fca0747b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29231616, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-vda', 'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af57f30e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': 'f3342397d651a3b264f1ca6aada28d48cbadd59d622a561257f09748f00ced2c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-sda', 
'timestamp': '2025-12-05T12:12:35.853250', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af57fd5e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '57e8932a6c75236bdf8c2fbc5c88047b8bc2049d002a0b7d8543ce030a1f24a2'}]}, 'timestamp': '2025-12-05 12:12:35.857069', '_unique_id': '8cc18d1610a94ddb9925e479226c2d3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.858 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.859 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.859 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.859 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b661b497-acb9-4b26-8e26-7d0802bca8bf: ceilometer.compute.pollsters.NoVolumeException
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.861 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.862 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/memory.usage volume: 42.70703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.862 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/memory.usage volume: 40.48828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.863 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/memory.usage volume: 42.87109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.863 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/memory.usage volume: 40.44140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9d34210-0a7f-4f6d-bcf1-2f779e963382', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.70703125, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'timestamp': '2025-12-05T12:12:35.859387', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'af58dbf2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.311499796, 'message_signature': '1c0fd64b3a148af42ca20a97db1b84cc2f96199a02f882b8cb92dede3a6b989e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.48828125, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'timestamp': 
'2025-12-05T12:12:35.859387', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'af58ec00-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.330665086, 'message_signature': '5e25a10b01d5133ae1c75226ce68fa7ff4cd8b9930b13ae30d846b9d8dda0d00'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.87109375, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'timestamp': '2025-12-05T12:12:35.859387', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'af58f696-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.347358286, 'message_signature': '27eee62727352cac61aafa7c5de0bab91abb9e297d894bc333ac1eb975924639'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.44140625, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'timestamp': '2025-12-05T12:12:35.859387', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'af59055a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.367299558, 'message_signature': 'ab1baa2cbbbb83c0b9ea9643bac86b1ea088a4fc7e306264d3be287c43f6f517'}]}, 'timestamp': '2025-12-05 12:12:35.863850', '_unique_id': '2570188ca9784a4cb22028969ec1575d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.865 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.866 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.866 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.868 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.868 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.868 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.869 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.869 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecfaf847-52ae-48e5-ab84-78d8cd57e678', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.866862', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af59885e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': '933d219b38f5b37c90441e020f312def42b5888dded58018795472aa3b1c7294'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.866862', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af59cfc6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': '0c06e788ffd1ccc045ab850761a3d601ba4181c3735389efc68b518d7372b432'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.866862', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af59d93a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': '51bcbb49f3677401a5345c68d70ab9446fb1929121934b3f192ab3831b0dc1e4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.866862', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 
'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af59e68c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': 'c0518d6cb35b9ef2f17e8a624ac2a293f530cf23b0fed10c19ca617783ccfd01'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.866862', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af59f5e6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': 'a006bab3888a814fd1439086ff2c66cd57f74d0182346355cac67523820d93d6'}]}, 'timestamp': '2025-12-05 12:12:35.870005', '_unique_id': 'a66a072299304931a961f147d5828ee7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.871 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.872 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.873 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.873 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-604643112>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>, <NovaLikeServer: tempest-₡-1444488967>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-234668707>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-826937421>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-604643112>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>, <NovaLikeServer: tempest-₡-1444488967>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-234668707>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-826937421>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>]
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.873 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.873 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.874 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.875 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.875 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.875 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.875 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7902bb2-4bba-46ae-9e58-e9e24a427bb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.873448', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af5a8b0a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': 'e7eafe6c130eec0779ab163fa8b82be6a45096a8c85576fab79b5a1e9433833e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.873448', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af5ac62e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': '2f1da3f8892ba6c06d22316f94bdd289dd58dd67209a175abca562d983dcf9b9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.873448', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af5ad06a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': '34952f9bb2dbefc9bdfa5126f83afe32afd38beeabc5a774749ac5085287da8c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.873448', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 
'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af5ada9c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': 'd705243b9bff0d1b67fc6700b16c5335e57f7718397d1f959b7bd25f576dfe7f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.873448', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af5ae79e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': 'e7ed973b92612b530ea520f473a094bb597f49362da5f0d0cb2514082218d128'}]}, 'timestamp': '2025-12-05 12:12:35.876150', '_unique_id': '10b01f967d0a425ba942120926c6218d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.877 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.878 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.878 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.880 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.880 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.880 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.outgoing.bytes volume: 1312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.880 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.881 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.outgoing.bytes volume: 992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c0939cf-9b2b-4d33-a48c-2a4842a7a02d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.878413', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af5b4d60-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': '043170274972b6b55d64aad9c1f7e23f843e3bfdba18ad75e7f3ffe7327947c0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.878413', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af5b9220-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': '5ef5f94b6c98b71678a9899fcf1526c94c838af0bba840bf3484d6b138d10698'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1312, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.878413', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af5b9b80-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': '0f8e19d33a0f1457031f335b740ce14ca95cb51453355b406955193c4569950b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.878413', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af5ba5d0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': 'c78cdadd99ed3e53d3ade8083e83c8d4692be27c763aa48f8604a1e3a99f54b5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 992, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.878413', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af5bb336-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': '4554ad2882391f4978522f80561f2158adda4b6fc8ffdecc47d2aae9dd357f63'}]}, 'timestamp': '2025-12-05 12:12:35.881363', '_unique_id': '7dc885acc01e4d1b8455ddb2cab149d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.882 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.884 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.884 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.read.latency volume: 203354232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.885 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.read.latency volume: 474433 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.886 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.886 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.read.latency volume: 246182338 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.887 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.read.latency volume: 85276382 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.887 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.read.latency volume: 182874053 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.887 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.read.latency volume: 137397134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.888 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.read.latency volume: 28062466 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.888 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.latency volume: 304173973 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.889 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.latency volume: 24574106 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.889 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.read.latency volume: 269513535 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.889 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.read.latency volume: 31883570 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4440219-07be-4707-a9ac-6c5a996ef973', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 203354232, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-vda', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af5c516a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': '3c826bca943e4e449166fca8fcfd0932031f2a5668d1e5912e09315774a68d41'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 474433, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 
'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-sda', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af5c5fe8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': '7df4266f712a25a289a01a3f512de67df6acf09fd5eba01b8fb54b3c1592d2c0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 246182338, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-vda', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af5c99f4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': '2346e3fe3181ce23736dd0ecec53cc1666d35c6962354030a137e4f7f75fba70'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 85276382, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-sda', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af5ca8f4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': 'a5a6d9cc46fca3d716dba02fe9d77ee04aef94f4b552e975e0c6f6a6e0e681ba'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 182874053, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': 
'8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vda', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af5cb39e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '7de866d79edcc81e02fb46263a6461aacf0354ae70a0122d599d12d6cb6bb6cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 137397134, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vdb', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'dis
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: k_name': 'vdb'}, 'message_id': 'af5cc226-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '3cacf3126553e8adee17636dfbd768b3ae0efcabcb1d040cbb199bd943d6d9d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28062466, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-sda', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af5cd306-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '0db331fae8da0532181f675e884fc332b7d665748e4ec223b905410786ad5b25'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 304173973, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:12:35.884838', 
'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af5cdedc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': '7f187e1b646bfcd1b318a5d0fa8fdccb3d1d99544615c30b15a00e23920b70a0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24574106, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af5cebfc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': '64312d08932f4831fffb29ef646873ee31cfa3225a36a46dc6fda38d50253730'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 269513535, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-vda', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af5cfa7a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '172795eedd93bba371a3e98e92d397e5159191ae8bf49db7560bd94e3232c633'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31883570, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 
'f3769524-43d7-4c3b-be59-18bf7af73e18-sda', 'timestamp': '2025-12-05T12:12:35.884838', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af5d0470-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': 'b0bdef3f74b0bb626625fb0e770f7e2face925693fcf0622e8aa773022ba29a9'}]}, 'timestamp': '2025-12-05 12:12:35.890049', '_unique_id': '370181bba75e4a10bb7a4aa53603e9cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.891 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.893 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.894 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.894 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.895 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.895 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.895 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e0bf09e-782c-4749-b26e-d318b632db31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.893395', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af5d9a84-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': '6ab896d32e518d47038b63a7dea41cafadadc9d1ef703edbc7ee8a051acb7d14'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.893395', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af5dd10c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': 'a662e909af51c6f9c5bf1a64aceea87893eabb5368094a7005a95f1f0e406cbc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.893395', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af5ddcf6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': '6e431618adc64f57813a5aee711699c844094cf5487fd7a29ba5f2c1bc5f1ec0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.893395', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 
'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af5de8a4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': '21912b47818926c3fae1e0a14a1c43a684f841de2f6131df50f99be8966f9c23'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.893395', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af5df146-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': '3278ab67ef538b97ff9b0d33a4947f44bea89ad2ba52973ebdf0ef2499fe1da1'}]}, 'timestamp': '2025-12-05 12:12:35.896117', '_unique_id': 'a0cadbf7af644dfaa3d935fa085700be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.897 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.898 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.898 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.900 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.900 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.incoming.bytes volume: 1772 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.900 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.incoming.bytes volume: 532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.901 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.bytes volume: 1520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.901 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.incoming.bytes volume: 1346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b51a871e-082b-4b08-9944-b11733bb832f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.898454', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af5e5b7c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': '827f9c226850316867ce199c7705a8b2fec84d893a7d926333b74dfb7cd59773'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1772, 'user_id': 
'62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.898454', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af5ea050-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': '90229c3d27eecf50a46d589ea5470e2f830de081d7ddeddd90255f28752996ca'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 532, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.898454', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af5eb496-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': 'cbd91ae79bfa8d386225a33fb5b14dbc25151c4ae6602ff0b8ee9647639880c8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1520, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.898454', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af5ec170-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': '8e137cc0a8f77364d8793e8687474afe3b2902c52d7eabd3b7c3f63cb3fd63bb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1346, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.898454', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af5ecf80-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': 'cfe5ab8b3a146db9795f95f8c6a5a88445b103f18ea718f3a174bbe98fa90548'}]}, 'timestamp': '2025-12-05 12:12:35.901896', '_unique_id': '4a06c8fd9a354b918f301a7cd9e54c9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.903 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.904 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.904 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.904 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-604643112>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>, <NovaLikeServer: tempest-₡-1444488967>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-234668707>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-826937421>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-604643112>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>, <NovaLikeServer: tempest-₡-1444488967>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-234668707>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-826937421>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>]
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.904 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.904 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.907 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.907 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.907 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.908 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.908 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c39c1b57-1583-464f-ad94-f7f4dd98d3c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.904703', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af5f531a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': 'e46dd51e7cd786343373af5214d75a516cf09da8436bb5cba5b6caa5e5eae021'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.904703', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af5fbc6a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': '272c5f02d876ba3292fc227daf354bf405b06cf0501d43de18a6eb9ef02c12e6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.904703', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af5fcdc2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': 'c96b20de461037cb8f0dc08a1e0c587367d1f658405a7b388d4a8ddc4f049a99'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.904703', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 
'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af5fd862-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': 'ea5db73eccfb24931e372c05df31a5b336cdec539516d2a84700022bc3ae56ff'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.904703', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af5fe2bc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': '4a64b1be7d5a03363e26c1f4bbc20c54d109d3f5c739d450bda06d36a9bedb80'}]}, 'timestamp': '2025-12-05 12:12:35.908919', '_unique_id': '59da59c75f7f4e5bbf06d6ba6ba5690b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.910 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.912 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.912 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.913 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.913 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.913 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.913 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.914 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.914 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.914 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.914 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.915 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.915 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe44e042-a797-4ab3-a2ce-5f0d94428e28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-vda', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af606fac-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.373051023, 'message_signature': 'a87dfd5b66fa3ceacfa93ed15b518c923cbc3ed6ab1964f67c0606d91403e637'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 
'b661b497-acb9-4b26-8e26-7d0802bca8bf-sda', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af607826-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.373051023, 'message_signature': '1e1f80c26b7598d63f8422d42c84cfdcb8725885cbce03643c81e35187e1b787'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-vda', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af60a094-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.389348111, 'message_signature': 'f8ca17b267ae826b6dddf768a64c32fd64a33afef355c07a21b4fd95ddc03ac5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-sda', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af60a8d2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.389348111, 'message_signature': '0e15aa836290ad3d2eebb5eba10d1d6fb7e84083aadf3f8726c837b06d2e2bc7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 
'f2a101e0-138f-404e-b6e0-e1359272f560-vda', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af60b8e0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.401290563, 'message_signature': 'b2fd8f91425dd0b5836c89d22ab7fcb1f52efb14f63b07dac6a469eda6532dd1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vdb', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'af60c394-d1d3-11f0-8572-
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: fa163e006c52', 'monotonic_time': 4136.401290563, 'message_signature': 'e22447b138328e361c74e20bce07d3ec98fd10a1ff91fbac77be97a8dcaaa854'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-sda', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af60cb00-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.401290563, 'message_signature': 'e25ef4b6b73cb2f3d202852c189ee856ad9d890f5d3017e58c7de1ad2b69b4a8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 
'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af60d2bc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.424223051, 'message_signature': '262b3ace1c9c47f5adb5ff1fa1feaafe92ba4db0f0f41965a9d78714aa6b2462'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af60dc4e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.424223051, 'message_signature': 'da449f15a0694d1decd75e6aa5f284f6674c66e6e64ff5d9affac7973d1984d6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-vda', 'timestamp': '2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af60e7ca-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.437626916, 'message_signature': '86a6c9c19a48d1d6e7889af8a6b0385370d62bd43189ef5a27be27a76fe61db0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-sda', 'timestamp': 
'2025-12-05T12:12:35.912027', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af60f062-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.437626916, 'message_signature': '1d6ce6df64646967a51924eda7e6168bd455180082f6f6cb7236965656fbf14f'}]}, 'timestamp': '2025-12-05 12:12:35.915694', '_unique_id': '2752a90056294478ae94445376ef2faf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.916 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.917 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.918 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-604643112>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>, <NovaLikeServer: tempest-₡-1444488967>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-234668707>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-826937421>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-604643112>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>, <NovaLikeServer: tempest-₡-1444488967>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-234668707>, <NovaLikeServer: tempest-ServersNegativeTestJSON-server-826937421>, <NovaLikeServer: tempest-ServersTestJSON-server-1949687168>]
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.918 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.919 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.919 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.920 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.920 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.920 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/network.outgoing.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86a60b77-5693-4a2f-b35e-395514138ac2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000058-b661b497-acb9-4b26-8e26-7d0802bca8bf-tapc7d6a93d-87', 'timestamp': '2025-12-05T12:12:35.918508', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'tapc7d6a93d-87', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:e5:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7d6a93d-87'}, 'message_id': 'af6168d0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4135.998594897, 'message_signature': 'f98caf181ea30e3e84a77cd2c9ce317f77711b291e67d8f729dd062eac7e8df1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 
'62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-0000004f-30cb83d4-3a34-4420-bc83-099b266da48c-tap96dab709-f4', 'timestamp': '2025-12-05T12:12:35.918508', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'tap96dab709-f4', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:a5:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96dab709-f4'}, 'message_id': 'af619e36-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.003339993, 'message_signature': '9d31b0f0e457eeda8a4dfc064c921056a1712cf08aebf85f28aa937371eb163b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'instance-00000056-f2a101e0-138f-404e-b6e0-e1359272f560-tap0bab1586-b0', 'timestamp': '2025-12-05T12:12:35.918508', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'tap0bab1586-b0', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a0:97', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bab1586-b0'}, 'message_id': 'af61b060-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.007542104, 'message_signature': '7383466c367692d6b095e00fb6ef5d22d964b175e95b5407beb4652a0b2f2939'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:12:35.918508', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'af61b8b2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.01890527, 'message_signature': '8d2cd003538862024fc5bca82a30029089b6c9c07ce856d05cefbae40d78adb0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'instance-00000057-f3769524-43d7-4c3b-be59-18bf7af73e18-tapecb20c89-e0', 'timestamp': '2025-12-05T12:12:35.918508', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'tapecb20c89-e0', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a8:25:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapecb20c89-e0'}, 'message_id': 'af61c276-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.022294247, 'message_signature': '96cefcba3ee582af433a8bc25f07f8b24bd3d81a1fbcbae62e3ff402032816e9'}]}, 'timestamp': '2025-12-05 12:12:35.921124', '_unique_id': '14b7bef287bf4703bd9a822677b07e32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.922 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.923 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.923 12 DEBUG ceilometer.compute.pollsters [-] b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.925 12 DEBUG ceilometer.compute.pollsters [-] Instance 7bf2559a-9191-4322-9e46-19761de59dc9 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000059, id=7bf2559a-9191-4322-9e46-19761de59dc9>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.925 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.925 12 DEBUG ceilometer.compute.pollsters [-] 30cb83d4-3a34-4420-bc83-099b266da48c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.926 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.926 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.write.bytes volume: 249856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.926 12 DEBUG ceilometer.compute.pollsters [-] f2a101e0-138f-404e-b6e0-e1359272f560/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.926 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.927 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.927 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.927 12 DEBUG ceilometer.compute.pollsters [-] f3769524-43d7-4c3b-be59-18bf7af73e18/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b071365c-84eb-4fb1-8fb7-f3c2c61a185b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf-vda', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af623008-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': '145857009a2539f96671fc7a43f34cec322e200e1f5de25ea43110e3e68e5b41'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 
'b661b497-acb9-4b26-8e26-7d0802bca8bf-sda', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-604643112', 'name': 'instance-00000058', 'instance_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af623b16-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.035315371, 'message_signature': 'ac9515d43186bb3fa0663853d2ed879e0fd8099fd3d3dafd46510316f2023a05'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-vda', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af62818e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': '96108d9fbccdd9de71c7b15da043b41207011a9404a9d066f978dc0d1a2b00c0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': '30cb83d4-3a34-4420-bc83-099b266da48c-sda', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-₡-1444488967', 'name': 'instance-0000004f', 'instance_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af628bb6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.06979771, 'message_signature': 'bd5152eb0cbc32819fbe0a54024a42e358072b118a96b1beb551affd37850fca'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 
'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vda', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af629372-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '42cb3d2abbdd54b1a680e17f6d569f816e43ff9693129956a2269ecefb08088e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 249856, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-vdb', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'af62a1e6-d1d
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': 'deb10120954f7559cef9dea6a0ce80a71383c633255e34077e89755b288053a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd12bb49c0ca84e8dad933b49753c7b24', 'user_name': None, 'project_id': '8f73626a62534c97a06b6ec98d749111', 'project_name': None, 'resource_id': 'f2a101e0-138f-404e-b6e0-e1359272f560-sda', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-234668707', 'name': 'instance-00000056', 'instance_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'instance_type': 'm1.nano', 'host': 'c65eb737e2223406e8de1d0e5e0600cab8255afc3510ac79251166aa', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af62a93e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.103327002, 'message_signature': '5229b11eb1b1889dedd9b6ef08d60d73d094a02fd3759ec2345b8b2910013770'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 
'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af62b24e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': '53a0d4df0a4a601d47d4f1eb9591e4cf78b3346e61e97092dbf1b96d3302707c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af62b9b0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.162598153, 'message_signature': '76f71550fd9ff761dc185eb384e2f6e8c4cc99a7c8e4063e2dae8e211b479014'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-vda', 'timestamp': '2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af62c856-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '67874c09df58574844e470896994eb282357bb51bb8d0d434757771aa7c03bf7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_name': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_name': None, 'resource_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18-sda', 'timestamp': 
'2025-12-05T12:12:35.923600', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1949687168', 'name': 'instance-00000057', 'instance_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'instance_type': 'm1.nano', 'host': 'bdea2be93beecc969817beb34f3b6985f690969bc12b3a0d39d9d7f8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af62d01c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4136.199684428, 'message_signature': '29c0dbbf95cb2438d277ec1d49f645bab1a4d3a8fcbd483745b3c86822f5a67f'}]}, 'timestamp': '2025-12-05 12:12:35.928004', '_unique_id': '30b726f2d5654dee925a7cdeb54844e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:12:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:12:35.929 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.100 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.473 187212 DEBUG nova.network.neutron [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Successfully updated port: 961ee213-955e-471f-8cb4-ad0d3a82285c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.484 187212 INFO nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Creating config drive at /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk.config
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.488 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzf_dapp3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.515 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "refresh_cache-846c0e55-1620-4c7a-9792-d4f5f0d728d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.516 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquired lock "refresh_cache-846c0e55-1620-4c7a-9792-d4f5f0d728d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.516 187212 DEBUG nova.network.neutron [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.566 187212 DEBUG nova.compute.manager [req-5caebb8b-10e1-4e70-b8c3-95a5f62adaa7 req-36a24d4b-c4ca-4110-9d9f-dff89f7c9ed4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-changed-961ee213-955e-471f-8cb4-ad0d3a82285c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.566 187212 DEBUG nova.compute.manager [req-5caebb8b-10e1-4e70-b8c3-95a5f62adaa7 req-36a24d4b-c4ca-4110-9d9f-dff89f7c9ed4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Refreshing instance network info cache due to event network-changed-961ee213-955e-471f-8cb4-ad0d3a82285c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.566 187212 DEBUG oslo_concurrency.lockutils [req-5caebb8b-10e1-4e70-b8c3-95a5f62adaa7 req-36a24d4b-c4ca-4110-9d9f-dff89f7c9ed4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-846c0e55-1620-4c7a-9792-d4f5f0d728d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.623 187212 DEBUG oslo_concurrency.processutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzf_dapp3" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:36 compute-0 kernel: tap8f30bb1b-12: entered promiscuous mode
Dec 05 12:12:36 compute-0 NetworkManager[55691]: <info>  [1764936756.6983] manager: (tap8f30bb1b-12): new Tun device (/org/freedesktop/NetworkManager/Devices/338)
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.702 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:36 compute-0 ovn_controller[95610]: 2025-12-05T12:12:36Z|00876|binding|INFO|Claiming lport 8f30bb1b-124e-4840-966f-7fbd73ceba98 for this chassis.
Dec 05 12:12:36 compute-0 ovn_controller[95610]: 2025-12-05T12:12:36Z|00877|binding|INFO|8f30bb1b-124e-4840-966f-7fbd73ceba98: Claiming fa:16:3e:e4:43:ef 10.100.0.5
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.713 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:43:ef 10.100.0.5'], port_security=['fa:16:3e:e4:43:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7bf2559a-9191-4322-9e46-19761de59dc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8f30bb1b-124e-4840-966f-7fbd73ceba98) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.715 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8f30bb1b-124e-4840-966f-7fbd73ceba98 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 bound to our chassis
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.717 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:12:36 compute-0 ovn_controller[95610]: 2025-12-05T12:12:36Z|00878|binding|INFO|Setting lport 8f30bb1b-124e-4840-966f-7fbd73ceba98 ovn-installed in OVS
Dec 05 12:12:36 compute-0 ovn_controller[95610]: 2025-12-05T12:12:36Z|00879|binding|INFO|Setting lport 8f30bb1b-124e-4840-966f-7fbd73ceba98 up in Southbound
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.718 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.721 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.740 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f351ebb1-3646-4879-b1cd-2442db838ff7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:36 compute-0 systemd-udevd[235306]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:12:36 compute-0 systemd-machined[153543]: New machine qemu-101-instance-00000059.
Dec 05 12:12:36 compute-0 systemd[1]: Started Virtual Machine qemu-101-instance-00000059.
Dec 05 12:12:36 compute-0 NetworkManager[55691]: <info>  [1764936756.7673] device (tap8f30bb1b-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:12:36 compute-0 NetworkManager[55691]: <info>  [1764936756.7695] device (tap8f30bb1b-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.787 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b2846e85-8bc9-4a86-a2cc-fd6f07a0a10c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.792 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a93a3ccb-5c8f-4118-9a5e-b1cc836db2d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.795 187212 DEBUG nova.network.neutron [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.831 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6e7314-157c-42fb-b979-68c5beff8169]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.851 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6ca96f50-4db9-46de-b90d-d8f35df31eb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 15, 'rx_bytes': 616, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 15, 'rx_bytes': 616, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235318, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.875 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[367c4297-5b1b-4d39-a72b-a3e79cf6c76f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235320, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235320, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.878 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.880 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:36 compute-0 nova_compute[187208]: 2025-12-05 12:12:36.881 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.881 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.881 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.882 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:36 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:36.882 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:37 compute-0 nova_compute[187208]: 2025-12-05 12:12:37.033 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936757.0323665, 7bf2559a-9191-4322-9e46-19761de59dc9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:37 compute-0 nova_compute[187208]: 2025-12-05 12:12:37.033 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] VM Started (Lifecycle Event)
Dec 05 12:12:37 compute-0 nova_compute[187208]: 2025-12-05 12:12:37.055 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:37 compute-0 nova_compute[187208]: 2025-12-05 12:12:37.059 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936757.0327256, 7bf2559a-9191-4322-9e46-19761de59dc9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:37 compute-0 nova_compute[187208]: 2025-12-05 12:12:37.060 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] VM Paused (Lifecycle Event)
Dec 05 12:12:37 compute-0 nova_compute[187208]: 2025-12-05 12:12:37.076 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:37 compute-0 nova_compute[187208]: 2025-12-05 12:12:37.080 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:37 compute-0 nova_compute[187208]: 2025-12-05 12:12:37.101 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:12:38 compute-0 nova_compute[187208]: 2025-12-05 12:12:38.291 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.205 187212 DEBUG nova.network.neutron [req-69872d56-0121-4550-a3a3-d2b6620436c6 req-7b8cf333-6911-4aaa-93e9-60eea87d9a4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Updated VIF entry in instance network info cache for port 8f30bb1b-124e-4840-966f-7fbd73ceba98. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.206 187212 DEBUG nova.network.neutron [req-69872d56-0121-4550-a3a3-d2b6620436c6 req-7b8cf333-6911-4aaa-93e9-60eea87d9a4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Updating instance_info_cache with network_info: [{"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.220 187212 DEBUG oslo_concurrency.lockutils [req-69872d56-0121-4550-a3a3-d2b6620436c6 req-7b8cf333-6911-4aaa-93e9-60eea87d9a4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-7bf2559a-9191-4322-9e46-19761de59dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.357 187212 DEBUG nova.compute.manager [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Received event network-vif-plugged-8f30bb1b-124e-4840-966f-7fbd73ceba98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.357 187212 DEBUG oslo_concurrency.lockutils [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.358 187212 DEBUG oslo_concurrency.lockutils [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.358 187212 DEBUG oslo_concurrency.lockutils [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.358 187212 DEBUG nova.compute.manager [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Processing event network-vif-plugged-8f30bb1b-124e-4840-966f-7fbd73ceba98 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.359 187212 DEBUG nova.compute.manager [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Received event network-vif-plugged-8f30bb1b-124e-4840-966f-7fbd73ceba98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.359 187212 DEBUG oslo_concurrency.lockutils [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.360 187212 DEBUG oslo_concurrency.lockutils [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.361 187212 DEBUG oslo_concurrency.lockutils [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.361 187212 DEBUG nova.compute.manager [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] No waiting events found dispatching network-vif-plugged-8f30bb1b-124e-4840-966f-7fbd73ceba98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.361 187212 WARNING nova.compute.manager [req-63ec6458-dd3a-49ba-84ca-bb35dd3d98b0 req-fd8f8835-f2ba-4a0e-9a9e-2117afc96aea 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Received unexpected event network-vif-plugged-8f30bb1b-124e-4840-966f-7fbd73ceba98 for instance with vm_state building and task_state spawning.
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.362 187212 DEBUG nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.367 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936759.366468, 7bf2559a-9191-4322-9e46-19761de59dc9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.367 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] VM Resumed (Lifecycle Event)
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.369 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.373 187212 INFO nova.virt.libvirt.driver [-] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Instance spawned successfully.
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.374 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.394 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.401 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.404 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.405 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.405 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.406 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.406 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.407 187212 DEBUG nova.virt.libvirt.driver [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.430 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.462 187212 INFO nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Took 12.15 seconds to spawn the instance on the hypervisor.
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.462 187212 DEBUG nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.543 187212 INFO nova.compute.manager [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Took 12.82 seconds to build instance.
Dec 05 12:12:39 compute-0 nova_compute[187208]: 2025-12-05 12:12:39.611 187212 DEBUG oslo_concurrency.lockutils [None req-40ecbc10-987d-4fbd-80eb-9cbd49503159 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:40 compute-0 nova_compute[187208]: 2025-12-05 12:12:40.286 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.146 187212 DEBUG nova.network.neutron [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Updating instance_info_cache with network_info: [{"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:42 compute-0 podman[235346]: 2025-12-05 12:12:42.214816097 +0000 UTC m=+0.069088073 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:12:42 compute-0 podman[235347]: 2025-12-05 12:12:42.250237314 +0000 UTC m=+0.100255568 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.290 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Releasing lock "refresh_cache-846c0e55-1620-4c7a-9792-d4f5f0d728d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.290 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Instance network_info: |[{"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.290 187212 DEBUG oslo_concurrency.lockutils [req-5caebb8b-10e1-4e70-b8c3-95a5f62adaa7 req-36a24d4b-c4ca-4110-9d9f-dff89f7c9ed4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-846c0e55-1620-4c7a-9792-d4f5f0d728d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.291 187212 DEBUG nova.network.neutron [req-5caebb8b-10e1-4e70-b8c3-95a5f62adaa7 req-36a24d4b-c4ca-4110-9d9f-dff89f7c9ed4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Refreshing network info cache for port 961ee213-955e-471f-8cb4-ad0d3a82285c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.294 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Start _get_guest_xml network_info=[{"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.300 187212 WARNING nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.312 187212 DEBUG nova.virt.libvirt.host [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.314 187212 DEBUG nova.virt.libvirt.host [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.319 187212 DEBUG nova.virt.libvirt.host [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.319 187212 DEBUG nova.virt.libvirt.host [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.319 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.320 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.320 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.321 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.321 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.321 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.321 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.322 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.322 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.322 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.323 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.323 187212 DEBUG nova.virt.hardware [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.328 187212 DEBUG nova.virt.libvirt.vif [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-729713724',display_name='tempest-ServersTestMultiNic-server-729713724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-729713724',id=90,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-zskavgn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:27Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=846c0e55-1620-4c7a-9792-d4f5f0d728d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.328 187212 DEBUG nova.network.os_vif_util [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.330 187212 DEBUG nova.network.os_vif_util [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:97:61,bridge_name='br-int',has_traffic_filtering=True,id=0e7829e3-325a-430d-898f-510a4c544ffa,network=Network(29118b32-7f75-46f6-9be9-f833be642f99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e7829e3-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.331 187212 DEBUG nova.virt.libvirt.vif [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-729713724',display_name='tempest-ServersTestMultiNic-server-729713724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-729713724',id=90,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-zskavgn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:27Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=846c0e55-1620-4c7a-9792-d4f5f0d728d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.332 187212 DEBUG nova.network.os_vif_util [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.333 187212 DEBUG nova.network.os_vif_util [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:9b:af,bridge_name='br-int',has_traffic_filtering=True,id=961ee213-955e-471f-8cb4-ad0d3a82285c,network=Network(bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap961ee213-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
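Each `Converting VIF ...` entry carries the full Neutron VIF as a JSON object, and the following `Converted object VIFOpenVSwitch(...)` line is the os-vif translation of it. When auditing these logs it can help to pull the JSON out and check the fields os-vif keeps. A minimal stdlib sketch — the literal below is a trimmed copy of the vif from the entry above (subnets, `meta`, and other keys omitted for brevity):

```python
import json

# Trimmed copy of the vif JSON logged above; only the keys that the
# os-vif conversion surfaces in VIFOpenVSwitch are kept here.
vif_json = '''
{"id": "961ee213-955e-471f-8cb4-ad0d3a82285c",
 "address": "fa:16:3e:31:9b:af",
 "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int"},
 "details": {"port_filter": true, "bridge_name": "br-int"},
 "devname": "tap961ee213-95",
 "active": false}
'''

vif = json.loads(vif_json)

# These values reappear verbatim in the converted object:
#   address            -> VIFOpenVSwitch.address
#   details.bridge_name-> bridge_name
#   details.port_filter-> has_traffic_filtering
#   devname            -> vif_name
print(vif["address"])                  # fa:16:3e:31:9b:af
print(vif["details"]["bridge_name"])   # br-int
print(vif["devname"])                  # tap961ee213-95
print(vif["details"]["port_filter"])   # True
```

The mapping shown in the comments is read off the two adjacent log lines, not from the os-vif source.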
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.335 187212 DEBUG nova.objects.instance [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lazy-loading 'pci_devices' on Instance uuid 846c0e55-1620-4c7a-9792-d4f5f0d728d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.939 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <uuid>846c0e55-1620-4c7a-9792-d4f5f0d728d8</uuid>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <name>instance-0000005a</name>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestMultiNic-server-729713724</nova:name>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:12:42</nova:creationTime>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:12:42 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         <nova:user uuid="430719002c284cd28237859ea6061eef">tempest-ServersTestMultiNic-1621990639-project-member</nova:user>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         <nova:project uuid="f7aededcaee54c4bbb7cba6007565f65">tempest-ServersTestMultiNic-1621990639</nova:project>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         <nova:port uuid="0e7829e3-325a-430d-898f-510a4c544ffa">
Dec 05 12:12:42 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.232" ipVersion="4"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         <nova:port uuid="961ee213-955e-471f-8cb4-ad0d3a82285c">
Dec 05 12:12:42 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.1.248" ipVersion="4"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <system>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <entry name="serial">846c0e55-1620-4c7a-9792-d4f5f0d728d8</entry>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <entry name="uuid">846c0e55-1620-4c7a-9792-d4f5f0d728d8</entry>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </system>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <os>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   </os>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <features>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   </features>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk.config"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:87:97:61"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <target dev="tap0e7829e3-32"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:31:9b:af"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <target dev="tap961ee213-95"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/console.log" append="off"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <video>
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </video>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:12:42 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:12:42 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:12:42 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:12:42 compute-0 nova_compute[187208]: </domain>
Dec 05 12:12:42 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
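The `_get_guest_xml` dump above is plain libvirt domain XML, so it can be inspected mechanically once the journald prefixes are stripped. A minimal sketch with the stdlib `ElementTree`, run here against a trimmed copy of the `<devices>` section above (metadata and most devices omitted), listing each NIC's MAC and tap target:

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the guest XML logged above (only the parts used below).
domain_xml = """
<domain type="kvm">
  <name>instance-0000005a</name>
  <devices>
    <interface type="ethernet">
      <mac address="fa:16:3e:87:97:61"/>
      <target dev="tap0e7829e3-32"/>
    </interface>
    <interface type="ethernet">
      <mac address="fa:16:3e:31:9b:af"/>
      <target dev="tap961ee213-95"/>
    </interface>
    <disk type="file" device="disk">
      <target dev="vda" bus="virtio"/>
    </disk>
  </devices>
</domain>
"""

root = ET.fromstring(domain_xml)
# One (mac, tap-device) pair per <interface>, in document order.
nics = [(i.find("mac").get("address"), i.find("target").get("dev"))
        for i in root.iter("interface")]
print(nics)
# [('fa:16:3e:87:97:61', 'tap0e7829e3-32'), ('fa:16:3e:31:9b:af', 'tap961ee213-95')]
```

The two tap devices listed are exactly the ports the subsequent `network-vif-plugged-*` events and OVSDB transactions refer to.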
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.939 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Preparing to wait for external event network-vif-plugged-0e7829e3-325a-430d-898f-510a4c544ffa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.940 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.940 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.940 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.940 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Preparing to wait for external event network-vif-plugged-961ee213-955e-471f-8cb4-ad0d3a82285c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.940 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.941 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.941 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.942 187212 DEBUG nova.virt.libvirt.vif [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-729713724',display_name='tempest-ServersTestMultiNic-server-729713724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-729713724',id=90,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-zskavgn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-162
1990639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:27Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=846c0e55-1620-4c7a-9792-d4f5f0d728d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.942 187212 DEBUG nova.network.os_vif_util [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.943 187212 DEBUG nova.network.os_vif_util [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:97:61,bridge_name='br-int',has_traffic_filtering=True,id=0e7829e3-325a-430d-898f-510a4c544ffa,network=Network(29118b32-7f75-46f6-9be9-f833be642f99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e7829e3-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.943 187212 DEBUG os_vif [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:97:61,bridge_name='br-int',has_traffic_filtering=True,id=0e7829e3-325a-430d-898f-510a4c544ffa,network=Network(29118b32-7f75-46f6-9be9-f833be642f99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e7829e3-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.944 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.944 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.944 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.949 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e7829e3-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.951 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e7829e3-32, col_values=(('external_ids', {'iface-id': '0e7829e3-325a-430d-898f-510a4c544ffa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:97:61', 'vm-uuid': '846c0e55-1620-4c7a-9792-d4f5f0d728d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.954 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:42 compute-0 NetworkManager[55691]: <info>  [1764936762.9558] manager: (tap0e7829e3-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.963 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.967 187212 INFO os_vif [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:97:61,bridge_name='br-int',has_traffic_filtering=True,id=0e7829e3-325a-430d-898f-510a4c544ffa,network=Network(29118b32-7f75-46f6-9be9-f833be642f99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e7829e3-32')
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.968 187212 DEBUG nova.virt.libvirt.vif [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-729713724',display_name='tempest-ServersTestMultiNic-server-729713724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-729713724',id=90,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-zskavgn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-162
1990639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:27Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=846c0e55-1620-4c7a-9792-d4f5f0d728d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.969 187212 DEBUG nova.network.os_vif_util [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.970 187212 DEBUG nova.network.os_vif_util [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:9b:af,bridge_name='br-int',has_traffic_filtering=True,id=961ee213-955e-471f-8cb4-ad0d3a82285c,network=Network(bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap961ee213-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.970 187212 DEBUG os_vif [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9b:af,bridge_name='br-int',has_traffic_filtering=True,id=961ee213-955e-471f-8cb4-ad0d3a82285c,network=Network(bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap961ee213-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.970 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.971 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.971 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.973 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.973 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap961ee213-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.974 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap961ee213-95, col_values=(('external_ids', {'iface-id': '961ee213-955e-471f-8cb4-ad0d3a82285c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:9b:af', 'vm-uuid': '846c0e55-1620-4c7a-9792-d4f5f0d728d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.975 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:42 compute-0 NetworkManager[55691]: <info>  [1764936762.9764] manager: (tap961ee213-95): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.978 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:42 compute-0 nova_compute[187208]: 2025-12-05 12:12:42.982 187212 INFO os_vif [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9b:af,bridge_name='br-int',has_traffic_filtering=True,id=961ee213-955e-471f-8cb4-ad0d3a82285c,network=Network(bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap961ee213-95')
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.292 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.368 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.368 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.369 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] No VIF found with MAC fa:16:3e:87:97:61, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.369 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] No VIF found with MAC fa:16:3e:31:9b:af, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.369 187212 INFO nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Using config drive
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.699 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-30cb83d4-3a34-4420-bc83-099b266da48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.699 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-30cb83d4-3a34-4420-bc83-099b266da48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:43 compute-0 nova_compute[187208]: 2025-12-05 12:12:43.699 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:12:44 compute-0 podman[235400]: 2025-12-05 12:12:44.224889761 +0000 UTC m=+0.069742402 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:12:44 compute-0 nova_compute[187208]: 2025-12-05 12:12:44.790 187212 INFO nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Creating config drive at /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk.config
Dec 05 12:12:44 compute-0 nova_compute[187208]: 2025-12-05 12:12:44.795 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj26nw1t7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:44 compute-0 nova_compute[187208]: 2025-12-05 12:12:44.921 187212 DEBUG oslo_concurrency.processutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj26nw1t7" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:44 compute-0 NetworkManager[55691]: <info>  [1764936764.9941] manager: (tap0e7829e3-32): new Tun device (/org/freedesktop/NetworkManager/Devices/341)
Dec 05 12:12:44 compute-0 kernel: tap0e7829e3-32: entered promiscuous mode
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:45 compute-0 ovn_controller[95610]: 2025-12-05T12:12:45Z|00880|binding|INFO|Claiming lport 0e7829e3-325a-430d-898f-510a4c544ffa for this chassis.
Dec 05 12:12:45 compute-0 ovn_controller[95610]: 2025-12-05T12:12:45Z|00881|binding|INFO|0e7829e3-325a-430d-898f-510a4c544ffa: Claiming fa:16:3e:87:97:61 10.100.0.232
Dec 05 12:12:45 compute-0 NetworkManager[55691]: <info>  [1764936765.0149] manager: (tap961ee213-95): new Tun device (/org/freedesktop/NetworkManager/Devices/342)
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.021 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:97:61 10.100.0.232'], port_security=['fa:16:3e:87:97:61 10.100.0.232'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.232/24', 'neutron:device_id': '846c0e55-1620-4c7a-9792-d4f5f0d728d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29118b32-7f75-46f6-9be9-f833be642f99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f523b7fe-86fa-434f-ad90-764f0845cc18, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0e7829e3-325a-430d-898f-510a4c544ffa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.022 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0e7829e3-325a-430d-898f-510a4c544ffa in datapath 29118b32-7f75-46f6-9be9-f833be642f99 bound to our chassis
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.025 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 29118b32-7f75-46f6-9be9-f833be642f99
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.036 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9ca776-d261-4dee-8b4b-df0599a7f42d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.037 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap29118b32-71 in ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.038 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap29118b32-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.038 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[22498f91-df16-4d7c-91f7-24503643e7a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.039 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[94adbc10-172a-41a0-bc14-662eafeaa73c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 systemd-udevd[235442]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:12:45 compute-0 systemd-udevd[235443]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:12:45 compute-0 kernel: tap961ee213-95: entered promiscuous mode
Dec 05 12:12:45 compute-0 ovn_controller[95610]: 2025-12-05T12:12:45Z|00882|binding|INFO|Claiming lport 961ee213-955e-471f-8cb4-ad0d3a82285c for this chassis.
Dec 05 12:12:45 compute-0 ovn_controller[95610]: 2025-12-05T12:12:45Z|00883|binding|INFO|961ee213-955e-471f-8cb4-ad0d3a82285c: Claiming fa:16:3e:31:9b:af 10.100.1.248
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.052 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[31d8e9f9-ae54-49a0-ab87-0f1c42af9f16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.054 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:45 compute-0 NetworkManager[55691]: <info>  [1764936765.0623] device (tap961ee213-95): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:12:45 compute-0 NetworkManager[55691]: <info>  [1764936765.0634] device (tap961ee213-95): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:12:45 compute-0 ovn_controller[95610]: 2025-12-05T12:12:45Z|00884|binding|INFO|Setting lport 0e7829e3-325a-430d-898f-510a4c544ffa ovn-installed in OVS
Dec 05 12:12:45 compute-0 NetworkManager[55691]: <info>  [1764936765.0652] device (tap0e7829e3-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:12:45 compute-0 NetworkManager[55691]: <info>  [1764936765.0660] device (tap0e7829e3-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:12:45 compute-0 ovn_controller[95610]: 2025-12-05T12:12:45Z|00885|binding|INFO|Setting lport 0e7829e3-325a-430d-898f-510a4c544ffa up in Southbound
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.067 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:9b:af 10.100.1.248'], port_security=['fa:16:3e:31:9b:af 10.100.1.248'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.248/24', 'neutron:device_id': '846c0e55-1620-4c7a-9792-d4f5f0d728d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f1562571-290f-4ea3-be5a-e33bc5391f1b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=961ee213-955e-471f-8cb4-ad0d3a82285c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.069 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:45 compute-0 systemd-machined[153543]: New machine qemu-102-instance-0000005a.
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.083 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c79d1cf7-b36d-46ac-8e77-be5b39869da3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_controller[95610]: 2025-12-05T12:12:45Z|00886|binding|INFO|Setting lport 961ee213-955e-471f-8cb4-ad0d3a82285c ovn-installed in OVS
Dec 05 12:12:45 compute-0 ovn_controller[95610]: 2025-12-05T12:12:45Z|00887|binding|INFO|Setting lport 961ee213-955e-471f-8cb4-ad0d3a82285c up in Southbound
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.094 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:45 compute-0 systemd[1]: Started Virtual Machine qemu-102-instance-0000005a.
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.121 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0e0b3c66-ad46-40e3-9727-071c733ae232]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.131 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[41d98db7-b17d-4226-bc6b-f0febda84d1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 NetworkManager[55691]: <info>  [1764936765.1331] manager: (tap29118b32-70): new Veth device (/org/freedesktop/NetworkManager/Devices/343)
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.161 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[50591322-dfed-4abe-bbe8-73f3d218832b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.163 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[657adc85-9a1a-4d80-b344-ea4d78412b3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 NetworkManager[55691]: <info>  [1764936765.1859] device (tap29118b32-70): carrier: link connected
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.191 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[805a284f-de39-47cc-9e75-6c8166b67416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.208 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad57149-a788-40bd-b034-d71fa67bf8ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29118b32-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:8c:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414574, 'reachable_time': 41773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235478, 'error': None, 'target': 'ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.223 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84fa0243-d731-4245-aab2-dc70890a744e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea3:8cf4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414574, 'tstamp': 414574}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235479, 'error': None, 'target': 'ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.244 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3ea502-c949-4a02-9c1f-c7d4bc917d6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29118b32-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:8c:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414574, 'reachable_time': 41773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235480, 'error': None, 'target': 'ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.281 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[34a5f75a-0a94-4a95-97fa-0bca8dead051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.340 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d652bef6-48b1-4341-8be4-55d4b9c30208]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.342 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29118b32-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.342 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.343 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29118b32-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:45 compute-0 NetworkManager[55691]: <info>  [1764936765.3457] manager: (tap29118b32-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Dec 05 12:12:45 compute-0 kernel: tap29118b32-70: entered promiscuous mode
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.348 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.354 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap29118b32-70, col_values=(('external_ids', {'iface-id': '1baba86e-543d-4445-849a-95e506cea7a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.355 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:45 compute-0 ovn_controller[95610]: 2025-12-05T12:12:45Z|00888|binding|INFO|Releasing lport 1baba86e-543d-4445-849a-95e506cea7a2 from this chassis (sb_readonly=0)
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.368 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.368 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/29118b32-7f75-46f6-9be9-f833be642f99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/29118b32-7f75-46f6-9be9-f833be642f99.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.370 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce2ef8fb-3a80-4825-9951-bb169dd7c08e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.371 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-29118b32-7f75-46f6-9be9-f833be642f99
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/29118b32-7f75-46f6-9be9-f833be642f99.pid.haproxy
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 29118b32-7f75-46f6-9be9-f833be642f99
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:12:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:45.371 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99', 'env', 'PROCESS_TAG=haproxy-29118b32-7f75-46f6-9be9-f833be642f99', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/29118b32-7f75-46f6-9be9-f833be642f99.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.438 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936765.438282, 846c0e55-1620-4c7a-9792-d4f5f0d728d8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.439 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] VM Started (Lifecycle Event)
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.534 187212 DEBUG nova.network.neutron [req-5caebb8b-10e1-4e70-b8c3-95a5f62adaa7 req-36a24d4b-c4ca-4110-9d9f-dff89f7c9ed4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Updated VIF entry in instance network info cache for port 961ee213-955e-471f-8cb4-ad0d3a82285c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.534 187212 DEBUG nova.network.neutron [req-5caebb8b-10e1-4e70-b8c3-95a5f62adaa7 req-36a24d4b-c4ca-4110-9d9f-dff89f7c9ed4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Updating instance_info_cache with network_info: [{"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:45 compute-0 podman[235520]: 2025-12-05 12:12:45.791056695 +0000 UTC m=+0.090927710 container create 4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 05 12:12:45 compute-0 podman[235520]: 2025-12-05 12:12:45.725843864 +0000 UTC m=+0.025714899 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.877 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.878 187212 DEBUG oslo_concurrency.lockutils [req-5caebb8b-10e1-4e70-b8c3-95a5f62adaa7 req-36a24d4b-c4ca-4110-9d9f-dff89f7c9ed4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-846c0e55-1620-4c7a-9792-d4f5f0d728d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.883 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936765.441227, 846c0e55-1620-4c7a-9792-d4f5f0d728d8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.883 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] VM Paused (Lifecycle Event)
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.912 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.916 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.958 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:12:45 compute-0 nova_compute[187208]: 2025-12-05 12:12:45.991 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Updating instance_info_cache with network_info: [{"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.007 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-30cb83d4-3a34-4420-bc83-099b266da48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.007 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.008 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.008 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.008 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.009 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:12:46 compute-0 systemd[1]: Started libpod-conmon-4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6.scope.
Dec 05 12:12:46 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:12:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fad4c702a0e082abb762dbdee375854cb8e067772014c4935c6133443e58ea4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.160 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:12:46 compute-0 podman[235520]: 2025-12-05 12:12:46.222231689 +0000 UTC m=+0.522102714 container init 4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 05 12:12:46 compute-0 podman[235520]: 2025-12-05 12:12:46.227460099 +0000 UTC m=+0.527331114 container start 4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:12:46 compute-0 neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99[235535]: [NOTICE]   (235539) : New worker (235541) forked
Dec 05 12:12:46 compute-0 neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99[235535]: [NOTICE]   (235539) : Loading success.
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.307 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 961ee213-955e-471f-8cb4-ad0d3a82285c in datapath bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb unbound from our chassis
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.311 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.321 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4b6c8356-f05e-4fda-9c55-b40f0f9c3bca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.322 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbfee25ff-e1 in ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.324 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbfee25ff-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.324 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd7d02c-8870-408c-99a0-b0295f2c18f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.325 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9ff4f48a-46ca-4922-9a6b-b54e4cad59a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.339 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8242318c-69c3-4e71-874c-65d1d2ce2ad6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.360 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5312cb2c-09d2-4e17-85a7-dfee1f6829b8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.392 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[90620c4e-bfff-4040-a8c8-a097c63f3c4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.399 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9d5cc3b6-4a19-4588-a82e-95044bc9a9a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 NetworkManager[55691]: <info>  [1764936766.4008] manager: (tapbfee25ff-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/345)
Dec 05 12:12:46 compute-0 systemd-udevd[235462]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.440 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad3b0bc-3fd8-4a08-8fe8-e11962b9e2be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.444 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[41413426-4f1a-42ed-85c8-20683b227e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 NetworkManager[55691]: <info>  [1764936766.4699] device (tapbfee25ff-e0): carrier: link connected
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.475 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f59a851e-2007-45c2-9c58-97ce9e3c16b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.494 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbe32b1-cfed-4716-ade1-992685d36427]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbfee25ff-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:7e:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414703, 'reachable_time': 23168, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235560, 'error': None, 'target': 'ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.517 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[145a0d58-8734-4b72-9955-11cb4efd98bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:7ee8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414703, 'tstamp': 414703}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235561, 'error': None, 'target': 'ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.537 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9542da4f-a896-4c76-a0c6-2233579bd3b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbfee25ff-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:7e:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414703, 'reachable_time': 23168, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235562, 'error': None, 'target': 'ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.577 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05c22445-460d-44ce-9089-f0f0621b01a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.614 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.614 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.649 187212 DEBUG nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.663 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[faafd323-7065-431a-bf1c-2ceacc6d83d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.664 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfee25ff-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.665 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.665 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfee25ff-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.666 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:46 compute-0 NetworkManager[55691]: <info>  [1764936766.6677] manager: (tapbfee25ff-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Dec 05 12:12:46 compute-0 kernel: tapbfee25ff-e0: entered promiscuous mode
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.670 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.671 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbfee25ff-e0, col_values=(('external_ids', {'iface-id': '91c94d38-18a4-4504-aaab-75dfc5aa01bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.672 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:46 compute-0 ovn_controller[95610]: 2025-12-05T12:12:46Z|00889|binding|INFO|Releasing lport 91c94d38-18a4-4504-aaab-75dfc5aa01bc from this chassis (sb_readonly=0)
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.687 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.688 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db0484be-ad06-46c8-b224-398fe5bd88e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.688 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb.pid.haproxy
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:12:46 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:46.689 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb', 'env', 'PROCESS_TAG=haproxy-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.726 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.727 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.737 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.737 187212 INFO nova.compute.claims [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.982 187212 DEBUG oslo_concurrency.lockutils [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "7bf2559a-9191-4322-9e46-19761de59dc9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.983 187212 DEBUG oslo_concurrency.lockutils [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.983 187212 DEBUG oslo_concurrency.lockutils [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.984 187212 DEBUG oslo_concurrency.lockutils [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.984 187212 DEBUG oslo_concurrency.lockutils [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.986 187212 INFO nova.compute.manager [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Terminating instance
Dec 05 12:12:46 compute-0 nova_compute[187208]: 2025-12-05 12:12:46.987 187212 DEBUG nova.compute.manager [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.007 187212 DEBUG nova.compute.provider_tree [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:47 compute-0 kernel: tap8f30bb1b-12 (unregistering): left promiscuous mode
Dec 05 12:12:47 compute-0 NetworkManager[55691]: <info>  [1764936767.0201] device (tap8f30bb1b-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.025 187212 DEBUG nova.scheduler.client.report [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.030 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:47 compute-0 ovn_controller[95610]: 2025-12-05T12:12:47Z|00890|binding|INFO|Releasing lport 8f30bb1b-124e-4840-966f-7fbd73ceba98 from this chassis (sb_readonly=0)
Dec 05 12:12:47 compute-0 ovn_controller[95610]: 2025-12-05T12:12:47Z|00891|binding|INFO|Setting lport 8f30bb1b-124e-4840-966f-7fbd73ceba98 down in Southbound
Dec 05 12:12:47 compute-0 ovn_controller[95610]: 2025-12-05T12:12:47Z|00892|binding|INFO|Removing iface tap8f30bb1b-12 ovn-installed in OVS
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.036 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.040 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:43:ef 10.100.0.5'], port_security=['fa:16:3e:e4:43:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7bf2559a-9191-4322-9e46-19761de59dc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8f30bb1b-124e-4840-966f-7fbd73ceba98) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.048 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.063 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.064 187212 DEBUG nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:12:47 compute-0 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d00000059.scope: Deactivated successfully.
Dec 05 12:12:47 compute-0 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d00000059.scope: Consumed 7.893s CPU time.
Dec 05 12:12:47 compute-0 systemd-machined[153543]: Machine qemu-101-instance-00000059 terminated.
Dec 05 12:12:47 compute-0 podman[235595]: 2025-12-05 12:12:47.088923431 +0000 UTC m=+0.074162599 container create 6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.117 187212 DEBUG nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.117 187212 DEBUG nova.network.neutron [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:12:47 compute-0 systemd[1]: Started libpod-conmon-6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8.scope.
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.134 187212 INFO nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:12:47 compute-0 podman[235595]: 2025-12-05 12:12:47.046415131 +0000 UTC m=+0.031654319 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.161 187212 DEBUG nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:12:47 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:12:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12e3b2ef5267c5c9a54e90389de405eff9410b180f781d0c8a7327cc11d9cf1c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.257 187212 INFO nova.virt.libvirt.driver [-] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Instance destroyed successfully.
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.258 187212 DEBUG nova.objects.instance [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'resources' on Instance uuid 7bf2559a-9191-4322-9e46-19761de59dc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.263 187212 DEBUG nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.264 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.265 187212 INFO nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Creating image(s)
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.265 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.265 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.266 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.279 187212 DEBUG nova.virt.libvirt.vif [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1949687168',display_name='tempest-ServersTestJSON-server-1949687168',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1949687168',id=89,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:12:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-mvq04v0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_mi
n_ram='0',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:12:39Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=7bf2559a-9191-4322-9e46-19761de59dc9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.279 187212 DEBUG nova.network.os_vif_util [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "address": "fa:16:3e:e4:43:ef", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f30bb1b-12", "ovs_interfaceid": "8f30bb1b-124e-4840-966f-7fbd73ceba98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.280 187212 DEBUG nova.network.os_vif_util [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:43:ef,bridge_name='br-int',has_traffic_filtering=True,id=8f30bb1b-124e-4840-966f-7fbd73ceba98,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f30bb1b-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.280 187212 DEBUG os_vif [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:43:ef,bridge_name='br-int',has_traffic_filtering=True,id=8f30bb1b-124e-4840-966f-7fbd73ceba98,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f30bb1b-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.282 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.283 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f30bb1b-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.284 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.303 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.306 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.308 187212 INFO os_vif [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:43:ef,bridge_name='br-int',has_traffic_filtering=True,id=8f30bb1b-124e-4840-966f-7fbd73ceba98,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f30bb1b-12')
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.309 187212 INFO nova.virt.libvirt.driver [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Deleting instance files /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9_del
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.310 187212 INFO nova.virt.libvirt.driver [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Deletion of /var/lib/nova/instances/7bf2559a-9191-4322-9e46-19761de59dc9_del complete
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.343 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.344 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.344 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.355 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:47 compute-0 podman[235595]: 2025-12-05 12:12:47.370390669 +0000 UTC m=+0.355629857 container init 6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.374 187212 DEBUG nova.policy [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2f30ba31f27c4af2b73c6c86da366ebd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8c82a7ffe1fe49a88eb03f0d89c9629e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:12:47 compute-0 podman[235595]: 2025-12-05 12:12:47.379746837 +0000 UTC m=+0.364986025 container start 6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.387 187212 INFO nova.compute.manager [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.388 187212 DEBUG oslo.service.loopingcall [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.388 187212 DEBUG nova.compute.manager [-] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.388 187212 DEBUG nova.network.neutron [-] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:12:47 compute-0 neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb[235614]: [NOTICE]   (235639) : New worker (235641) forked
Dec 05 12:12:47 compute-0 neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb[235614]: [NOTICE]   (235639) : Loading success.
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.413 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.414 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.456 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8f30bb1b-124e-4840-966f-7fbd73ceba98 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 unbound from our chassis
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.458 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.473 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1b135f88-6636-4e41-a9e9-0346c1710095]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.485 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk 1073741824" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.487 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.487 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.503 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f631342c-bbbc-473d-92eb-33cabba5a424]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.507 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[387360da-f1b8-413b-ac64-d3d4c6170fdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.537 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a70bf854-48cc-4e25-9241-1208e023e1e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.551 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.552 187212 DEBUG nova.virt.disk.api [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Checking if we can resize image /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.553 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.555 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[415ed954-30ac-44f3-a12d-66b880e18d8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 17, 'rx_bytes': 700, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 17, 'rx_bytes': 700, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235661, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.570 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78a8cf95-5ff3-4d6e-ae31-0c4f63e9f724]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235664, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235664, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.572 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.575 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.575 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.575 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:47 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:47.576 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.575 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.617 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.618 187212 DEBUG nova.virt.disk.api [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Cannot resize image /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.619 187212 DEBUG nova.objects.instance [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lazy-loading 'migration_context' on Instance uuid 95b4dafa-871e-42c8-8fb1-162d6b45f3aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.635 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.636 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Ensure instance console log exists: /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.636 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.637 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:47 compute-0 nova_compute[187208]: 2025-12-05 12:12:47.637 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:48 compute-0 nova_compute[187208]: 2025-12-05 12:12:48.294 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:48 compute-0 kernel: tapc7d6a93d-87 (unregistering): left promiscuous mode
Dec 05 12:12:48 compute-0 NetworkManager[55691]: <info>  [1764936768.4363] device (tapc7d6a93d-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:12:48 compute-0 ovn_controller[95610]: 2025-12-05T12:12:48Z|00893|binding|INFO|Releasing lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 from this chassis (sb_readonly=0)
Dec 05 12:12:48 compute-0 nova_compute[187208]: 2025-12-05 12:12:48.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:48 compute-0 nova_compute[187208]: 2025-12-05 12:12:48.449 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:48 compute-0 ovn_controller[95610]: 2025-12-05T12:12:48Z|00894|binding|INFO|Setting lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 down in Southbound
Dec 05 12:12:48 compute-0 ovn_controller[95610]: 2025-12-05T12:12:48Z|00895|binding|INFO|Removing iface tapc7d6a93d-87 ovn-installed in OVS
Dec 05 12:12:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:48.459 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:e5:61 10.100.0.8'], port_security=['fa:16:3e:17:e5:61 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:48 compute-0 nova_compute[187208]: 2025-12-05 12:12:48.461 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:48.461 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 unbound from our chassis
Dec 05 12:12:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:48.463 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:12:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:48.464 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc9d50e-a622-442c-996e-3309f6582b2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:48 compute-0 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d00000058.scope: Deactivated successfully.
Dec 05 12:12:48 compute-0 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d00000058.scope: Consumed 13.300s CPU time.
Dec 05 12:12:48 compute-0 systemd-machined[153543]: Machine qemu-100-instance-00000058 terminated.
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.194 187212 DEBUG nova.network.neutron [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Successfully created port: e0274344-5869-486b-a457-04b90e756602 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.201 187212 INFO nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance shutdown successfully after 13 seconds.
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.212 187212 INFO nova.virt.libvirt.driver [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance destroyed successfully.
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.213 187212 DEBUG nova.objects.instance [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'numa_topology' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.236 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.236 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.237 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.237 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.238 187212 DEBUG nova.network.neutron [-] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.244 187212 INFO nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Attempting rescue
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.245 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.252 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.252 187212 INFO nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Creating image(s)
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.253 187212 DEBUG oslo_concurrency.lockutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.253 187212 DEBUG oslo_concurrency.lockutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.254 187212 DEBUG oslo_concurrency.lockutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.254 187212 DEBUG nova.objects.instance [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.287 187212 INFO nova.compute.manager [-] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Took 1.90 seconds to deallocate network for instance.
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.292 187212 DEBUG oslo_concurrency.lockutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.292 187212 DEBUG oslo_concurrency.lockutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.307 187212 DEBUG oslo_concurrency.processutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.368 187212 DEBUG oslo_concurrency.processutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.369 187212 DEBUG oslo_concurrency.processutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.394 187212 DEBUG oslo_concurrency.lockutils [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.394 187212 DEBUG oslo_concurrency.lockutils [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.427 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.484 187212 DEBUG oslo_concurrency.processutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.rescue" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.485 187212 DEBUG oslo_concurrency.lockutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.486 187212 DEBUG nova.objects.instance [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'migration_context' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.499 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.500 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.528 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.530 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Start _get_guest_xml network_info=[{"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-689964296-network", "vif_mac": "fa:16:3e:17:e5:61"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a6987852-063f-405d-a848-6b382694811e', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.530 187212 DEBUG nova.objects.instance [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'resources' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.559 187212 WARNING nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.569 187212 DEBUG nova.virt.libvirt.host [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.570 187212 DEBUG nova.virt.libvirt.host [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.573 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.580 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.616 187212 DEBUG nova.compute.manager [req-ae167c59-c5fa-41f0-93cb-e6e0f18ad65e req-cd6c5603-2885-4d9b-813c-9a865c35054e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Received event network-vif-deleted-8f30bb1b-124e-4840-966f-7fbd73ceba98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.617 187212 DEBUG nova.virt.libvirt.host [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.618 187212 DEBUG nova.virt.libvirt.host [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.618 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.618 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.619 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.619 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.619 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.620 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.620 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.620 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.620 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.620 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.621 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.621 187212 DEBUG nova.virt.hardware [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.621 187212 DEBUG nova.objects.instance [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.654 187212 DEBUG nova.virt.libvirt.vif [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-604643112',display_name='tempest-ServerRescueTestJSON-server-604643112',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-604643112',id=88,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:12:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f73626a62534c97a06b6ec98d749111',ramdisk_id='',reservation_id='r-ri0fnqjs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-122605385',owner_user_name='tempest-ServerRescueTestJSON-122605385-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:30Z,user_data=None,user_id='d12bb49c0ca84e8dad933b49753c7b24',uuid=b661b497-acb9-4b26-8e26-7d0802bca8bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-689964296-network", "vif_mac": "fa:16:3e:17:e5:61"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.655 187212 DEBUG nova.network.os_vif_util [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converting VIF {"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-689964296-network", "vif_mac": "fa:16:3e:17:e5:61"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.655 187212 DEBUG nova.network.os_vif_util [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:e5:61,bridge_name='br-int',has_traffic_filtering=True,id=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7d6a93d-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.656 187212 DEBUG nova.objects.instance [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'pci_devices' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.661 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.662 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.690 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <uuid>b661b497-acb9-4b26-8e26-7d0802bca8bf</uuid>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <name>instance-00000058</name>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerRescueTestJSON-server-604643112</nova:name>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:12:49</nova:creationTime>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:12:49 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:12:49 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:12:49 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:12:49 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:12:49 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:12:49 compute-0 nova_compute[187208]:         <nova:user uuid="d12bb49c0ca84e8dad933b49753c7b24">tempest-ServerRescueTestJSON-122605385-project-member</nova:user>
Dec 05 12:12:49 compute-0 nova_compute[187208]:         <nova:project uuid="8f73626a62534c97a06b6ec98d749111">tempest-ServerRescueTestJSON-122605385</nova:project>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:12:49 compute-0 nova_compute[187208]:         <nova:port uuid="c7d6a93d-8775-4c4e-9a60-bc8e87e5b310">
Dec 05 12:12:49 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <system>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <entry name="serial">b661b497-acb9-4b26-8e26-7d0802bca8bf</entry>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <entry name="uuid">b661b497-acb9-4b26-8e26-7d0802bca8bf</entry>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </system>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <os>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   </os>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <features>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   </features>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.rescue"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <target dev="vdb" bus="virtio"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.config.rescue"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:17:e5:61"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <target dev="tapc7d6a93d-87"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/console.log" append="off"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <video>
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </video>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:12:49 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:12:49 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:12:49 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:12:49 compute-0 nova_compute[187208]: </domain>
Dec 05 12:12:49 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.703 187212 INFO nova.virt.libvirt.driver [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance destroyed successfully.
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.724 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.731 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.764 187212 DEBUG nova.compute.provider_tree [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.787 187212 DEBUG nova.scheduler.client.report [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.794 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.794 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.794 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.794 187212 DEBUG nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] No VIF found with MAC fa:16:3e:17:e5:61, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.795 187212 INFO nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Using config drive
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.799 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.799 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.823 187212 DEBUG nova.objects.instance [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'ec2_ids' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.828 187212 DEBUG oslo_concurrency.lockutils [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.433s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.858 187212 INFO nova.scheduler.client.report [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Deleted allocations for instance 7bf2559a-9191-4322-9e46-19761de59dc9
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.860 187212 DEBUG nova.objects.instance [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'keypairs' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.870 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.888 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.952 187212 DEBUG oslo_concurrency.lockutils [None req-44e578ab-fcfe-4161-aa23-98e5fc1f4978 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "7bf2559a-9191-4322-9e46-19761de59dc9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.956 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.rescue --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:49 compute-0 nova_compute[187208]: 2025-12-05 12:12:49.956 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.013 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk.rescue --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.014 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.050 187212 DEBUG nova.compute.manager [req-3e4b758d-74e8-4f0f-9207-393f94db034d req-efd8fe5c-7547-425b-8700-15a10d3e2abc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-vif-plugged-0e7829e3-325a-430d-898f-510a4c544ffa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.050 187212 DEBUG oslo_concurrency.lockutils [req-3e4b758d-74e8-4f0f-9207-393f94db034d req-efd8fe5c-7547-425b-8700-15a10d3e2abc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.050 187212 DEBUG oslo_concurrency.lockutils [req-3e4b758d-74e8-4f0f-9207-393f94db034d req-efd8fe5c-7547-425b-8700-15a10d3e2abc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.051 187212 DEBUG oslo_concurrency.lockutils [req-3e4b758d-74e8-4f0f-9207-393f94db034d req-efd8fe5c-7547-425b-8700-15a10d3e2abc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.052 187212 DEBUG nova.compute.manager [req-3e4b758d-74e8-4f0f-9207-393f94db034d req-efd8fe5c-7547-425b-8700-15a10d3e2abc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Processing event network-vif-plugged-0e7829e3-325a-430d-898f-510a4c544ffa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.071 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.072 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.131 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.136 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.207 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.207 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.270 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.277 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.359 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.360 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.415 187212 INFO nova.virt.libvirt.driver [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Creating config drive at /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.config.rescue
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.420 187212 DEBUG oslo_concurrency.processutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_c2l52w_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.441 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.550 187212 DEBUG oslo_concurrency.processutils [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_c2l52w_" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:50 compute-0 kernel: tapc7d6a93d-87: entered promiscuous mode
Dec 05 12:12:50 compute-0 systemd-udevd[235670]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:12:50 compute-0 NetworkManager[55691]: <info>  [1764936770.6362] manager: (tapc7d6a93d-87): new Tun device (/org/freedesktop/NetworkManager/Devices/347)
Dec 05 12:12:50 compute-0 ovn_controller[95610]: 2025-12-05T12:12:50Z|00896|binding|INFO|Claiming lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for this chassis.
Dec 05 12:12:50 compute-0 ovn_controller[95610]: 2025-12-05T12:12:50Z|00897|binding|INFO|c7d6a93d-8775-4c4e-9a60-bc8e87e5b310: Claiming fa:16:3e:17:e5:61 10.100.0.8
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.637 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:50.642 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:e5:61 10.100.0.8'], port_security=['fa:16:3e:17:e5:61 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:50.643 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 bound to our chassis
Dec 05 12:12:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:50.644 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:12:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:50.645 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce3a6601-b7d1-4eb8-a97a-871f4a63356f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:50 compute-0 NetworkManager[55691]: <info>  [1764936770.6464] device (tapc7d6a93d-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:12:50 compute-0 NetworkManager[55691]: <info>  [1764936770.6477] device (tapc7d6a93d-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:12:50 compute-0 ovn_controller[95610]: 2025-12-05T12:12:50Z|00898|binding|INFO|Setting lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 ovn-installed in OVS
Dec 05 12:12:50 compute-0 ovn_controller[95610]: 2025-12-05T12:12:50Z|00899|binding|INFO|Setting lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 up in Southbound
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.652 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.655 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:50 compute-0 systemd-machined[153543]: New machine qemu-103-instance-00000058.
Dec 05 12:12:50 compute-0 systemd[1]: Started Virtual Machine qemu-103-instance-00000058.
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.744 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.745 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4903MB free_disk=72.96663284301758GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.745 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.746 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.832 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 30cb83d4-3a34-4420-bc83-099b266da48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.832 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 28e48516-8665-4d98-a92d-c84b7da9a284 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.833 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance f2a101e0-138f-404e-b6e0-e1359272f560 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.833 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance f3769524-43d7-4c3b-be59-18bf7af73e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.833 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b661b497-acb9-4b26-8e26-7d0802bca8bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.833 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 846c0e55-1620-4c7a-9792-d4f5f0d728d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.833 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 95b4dafa-871e-42c8-8fb1-162d6b45f3aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.834 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.834 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1408MB phys_disk=79GB used_disk=7GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.899 187212 DEBUG nova.network.neutron [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Successfully updated port: e0274344-5869-486b-a457-04b90e756602 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.917 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "refresh_cache-95b4dafa-871e-42c8-8fb1-162d6b45f3aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.917 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquired lock "refresh_cache-95b4dafa-871e-42c8-8fb1-162d6b45f3aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.918 187212 DEBUG nova.network.neutron [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:12:50 compute-0 nova_compute[187208]: 2025-12-05 12:12:50.988 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.005 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.037 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.038 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.104 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for b661b497-acb9-4b26-8e26-7d0802bca8bf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.105 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936771.1041527, b661b497-acb9-4b26-8e26-7d0802bca8bf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.105 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] VM Resumed (Lifecycle Event)
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.110 187212 DEBUG nova.compute.manager [None req-d70785fc-a798-4bec-8d45-f515a486ff58 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.168 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.171 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.175 187212 INFO nova.compute.manager [None req-33aefcf1-e2e4-4756-a561-ee789cbc939a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Pausing
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.176 187212 DEBUG nova.objects.instance [None req-33aefcf1-e2e4-4756-a561-ee789cbc939a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'flavor' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.202 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] During sync_power_state the instance has a pending task (rescuing). Skip.
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.203 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936771.104997, b661b497-acb9-4b26-8e26-7d0802bca8bf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.203 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] VM Started (Lifecycle Event)
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.227 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.231 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.242 187212 DEBUG nova.compute.manager [None req-33aefcf1-e2e4-4756-a561-ee789cbc939a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.254 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936771.2125266, 28e48516-8665-4d98-a92d-c84b7da9a284 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.255 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Paused (Lifecycle Event)
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.277 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.281 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:51 compute-0 nova_compute[187208]: 2025-12-05 12:12:51.426 187212 DEBUG nova.network.neutron [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:12:52 compute-0 nova_compute[187208]: 2025-12-05 12:12:52.038 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:12:52 compute-0 nova_compute[187208]: 2025-12-05 12:12:52.039 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:12:52 compute-0 nova_compute[187208]: 2025-12-05 12:12:52.039 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:12:52 compute-0 nova_compute[187208]: 2025-12-05 12:12:52.285 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.210 187212 DEBUG nova.network.neutron [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Updating instance_info_cache with network_info: [{"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:53 compute-0 podman[235774]: 2025-12-05 12:12:53.213101268 +0000 UTC m=+0.057254124 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.281 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Releasing lock "refresh_cache-95b4dafa-871e-42c8-8fb1-162d6b45f3aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.282 187212 DEBUG nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Instance network_info: |[{"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.285 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Start _get_guest_xml network_info=[{"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.289 187212 WARNING nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.296 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.306 187212 DEBUG nova.virt.libvirt.host [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.307 187212 DEBUG nova.virt.libvirt.host [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.317 187212 DEBUG nova.virt.libvirt.host [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.318 187212 DEBUG nova.virt.libvirt.host [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.319 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.319 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.320 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.320 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.320 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.320 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.321 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.321 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.321 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.322 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.322 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.322 187212 DEBUG nova.virt.hardware [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.327 187212 DEBUG nova.virt.libvirt.vif [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-2085942428',display_name='tempest-InstanceActionsTestJSON-server-2085942428',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-2085942428',id=91,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c82a7ffe1fe49a88eb03f0d89c9629e',ramdisk_id='',reservation_id='r-7ofp0hr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-345226518',owner_user_name='tempest-InstanceActionsTestJSON-345226518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:47Z,user_data=None,user_id='2f30ba31f27c4af2b73c6c86da366ebd',uuid=95b4dafa-871e-42c8-8fb1-162d6b45f3aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.328 187212 DEBUG nova.network.os_vif_util [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converting VIF {"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.329 187212 DEBUG nova.network.os_vif_util [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.330 187212 DEBUG nova.objects.instance [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lazy-loading 'pci_devices' on Instance uuid 95b4dafa-871e-42c8-8fb1-162d6b45f3aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.347 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <uuid>95b4dafa-871e-42c8-8fb1-162d6b45f3aa</uuid>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <name>instance-0000005b</name>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <nova:name>tempest-InstanceActionsTestJSON-server-2085942428</nova:name>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:12:53</nova:creationTime>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:12:53 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:12:53 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:12:53 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:12:53 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:12:53 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:12:53 compute-0 nova_compute[187208]:         <nova:user uuid="2f30ba31f27c4af2b73c6c86da366ebd">tempest-InstanceActionsTestJSON-345226518-project-member</nova:user>
Dec 05 12:12:53 compute-0 nova_compute[187208]:         <nova:project uuid="8c82a7ffe1fe49a88eb03f0d89c9629e">tempest-InstanceActionsTestJSON-345226518</nova:project>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:12:53 compute-0 nova_compute[187208]:         <nova:port uuid="e0274344-5869-486b-a457-04b90e756602">
Dec 05 12:12:53 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <system>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <entry name="serial">95b4dafa-871e-42c8-8fb1-162d6b45f3aa</entry>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <entry name="uuid">95b4dafa-871e-42c8-8fb1-162d6b45f3aa</entry>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     </system>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <os>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   </os>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <features>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   </features>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.config"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:0f:99:d7"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <target dev="tape0274344-58"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/console.log" append="off"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <video>
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     </video>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:12:53 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:12:53 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:12:53 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:12:53 compute-0 nova_compute[187208]: </domain>
Dec 05 12:12:53 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.348 187212 DEBUG nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Preparing to wait for external event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.348 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.348 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.349 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.349 187212 DEBUG nova.virt.libvirt.vif [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-2085942428',display_name='tempest-InstanceActionsTestJSON-server-2085942428',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-2085942428',id=91,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c82a7ffe1fe49a88eb03f0d89c9629e',ramdisk_id='',reservation_id='r-7ofp0hr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-345226518',owner_user_name='tempest-InstanceActionsTestJSON-345226518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:47Z,user_data=None,user_id='2f30ba31f27c4af2b73c6c86da366ebd',uuid=95b4dafa-871e-42c8-8fb1-162d6b45f3aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.350 187212 DEBUG nova.network.os_vif_util [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converting VIF {"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.350 187212 DEBUG nova.network.os_vif_util [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.351 187212 DEBUG os_vif [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.351 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.352 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.352 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.356 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape0274344-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.357 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape0274344-58, col_values=(('external_ids', {'iface-id': 'e0274344-5869-486b-a457-04b90e756602', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:99:d7', 'vm-uuid': '95b4dafa-871e-42c8-8fb1-162d6b45f3aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.358 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.361 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:53 compute-0 NetworkManager[55691]: <info>  [1764936773.3619] manager: (tape0274344-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.365 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.367 187212 INFO os_vif [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58')
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.566 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.567 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.567 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] No VIF found with MAC fa:16:3e:0f:99:d7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.567 187212 INFO nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Using config drive
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.679 187212 DEBUG oslo_concurrency.lockutils [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "f3769524-43d7-4c3b-be59-18bf7af73e18" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.680 187212 DEBUG oslo_concurrency.lockutils [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.680 187212 DEBUG oslo_concurrency.lockutils [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.680 187212 DEBUG oslo_concurrency.lockutils [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.680 187212 DEBUG oslo_concurrency.lockutils [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.681 187212 INFO nova.compute.manager [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Terminating instance
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.683 187212 DEBUG nova.compute.manager [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:12:53 compute-0 kernel: tapecb20c89-e0 (unregistering): left promiscuous mode
Dec 05 12:12:53 compute-0 NetworkManager[55691]: <info>  [1764936773.7094] device (tapecb20c89-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:12:53 compute-0 ovn_controller[95610]: 2025-12-05T12:12:53Z|00900|binding|INFO|Releasing lport ecb20c89-e04b-4dcb-9b67-08705004bcf8 from this chassis (sb_readonly=0)
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.721 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 ovn_controller[95610]: 2025-12-05T12:12:53Z|00901|binding|INFO|Setting lport ecb20c89-e04b-4dcb-9b67-08705004bcf8 down in Southbound
Dec 05 12:12:53 compute-0 ovn_controller[95610]: 2025-12-05T12:12:53Z|00902|binding|INFO|Removing iface tapecb20c89-e0 ovn-installed in OVS
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.725 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.729 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:25:c4 10.100.0.7'], port_security=['fa:16:3e:a8:25:c4 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f3769524-43d7-4c3b-be59-18bf7af73e18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ecb20c89-e04b-4dcb-9b67-08705004bcf8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.731 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ecb20c89-e04b-4dcb-9b67-08705004bcf8 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 unbound from our chassis
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.736 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.744 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.765 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43131fbd-7bdb-41b2-914a-bc2efaef4778]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:53 compute-0 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d00000057.scope: Deactivated successfully.
Dec 05 12:12:53 compute-0 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d00000057.scope: Consumed 13.396s CPU time.
Dec 05 12:12:53 compute-0 systemd-machined[153543]: Machine qemu-99-instance-00000057 terminated.
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.797 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[054dc9c7-2d11-4d3f-aecb-8bc6cf5a51be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.801 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[54cc68ba-a632-4c6a-9357-c2cfdea2c0a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.834 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[032b8312-3819-4c53-83ea-cb84db46690a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.851 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ffff9435-5a03-4fa1-add7-a6112138c409]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 19, 'rx_bytes': 700, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 19, 'rx_bytes': 700, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235812, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.865 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.867 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.874 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd339c9-35c2-4a6b-81f5-d8ed3a0ae56f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235813, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235813, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.876 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.877 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.884 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.885 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.885 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.886 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.886 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:53.887 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.906 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.948 187212 INFO nova.virt.libvirt.driver [-] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Instance destroyed successfully.
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.949 187212 DEBUG nova.objects.instance [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'resources' on Instance uuid f3769524-43d7-4c3b-be59-18bf7af73e18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.962 187212 DEBUG nova.virt.libvirt.vif [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1949687168',display_name='tempest-ServersTestJSON-server-1949687168',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1949687168',id=87,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:12:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-bluhmwh4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_mi
n_ram='0',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:12:22Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=f3769524-43d7-4c3b-be59-18bf7af73e18,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.963 187212 DEBUG nova.network.os_vif_util [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "address": "fa:16:3e:a8:25:c4", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecb20c89-e0", "ovs_interfaceid": "ecb20c89-e04b-4dcb-9b67-08705004bcf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.963 187212 DEBUG nova.network.os_vif_util [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:25:c4,bridge_name='br-int',has_traffic_filtering=True,id=ecb20c89-e04b-4dcb-9b67-08705004bcf8,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecb20c89-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.964 187212 DEBUG os_vif [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:25:c4,bridge_name='br-int',has_traffic_filtering=True,id=ecb20c89-e04b-4dcb-9b67-08705004bcf8,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecb20c89-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.965 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.966 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapecb20c89-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.967 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:53 compute-0 nova_compute[187208]: 2025-12-05 12:12:53.969 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.011 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.014 187212 INFO os_vif [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:25:c4,bridge_name='br-int',has_traffic_filtering=True,id=ecb20c89-e04b-4dcb-9b67-08705004bcf8,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecb20c89-e0')
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.014 187212 INFO nova.virt.libvirt.driver [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Deleting instance files /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18_del
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.015 187212 INFO nova.virt.libvirt.driver [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Deletion of /var/lib/nova/instances/f3769524-43d7-4c3b-be59-18bf7af73e18_del complete
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.075 187212 INFO nova.compute.manager [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Took 0.39 seconds to destroy the instance on the hypervisor.
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.076 187212 DEBUG oslo.service.loopingcall [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.076 187212 DEBUG nova.compute.manager [-] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.077 187212 DEBUG nova.network.neutron [-] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:12:54 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.491 187212 DEBUG nova.compute.manager [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-changed-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.491 187212 DEBUG nova.compute.manager [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Refreshing instance network info cache due to event network-changed-e0274344-5869-486b-a457-04b90e756602. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.491 187212 DEBUG oslo_concurrency.lockutils [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-95b4dafa-871e-42c8-8fb1-162d6b45f3aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.492 187212 DEBUG oslo_concurrency.lockutils [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-95b4dafa-871e-42c8-8fb1-162d6b45f3aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.492 187212 DEBUG nova.network.neutron [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Refreshing network info cache for port e0274344-5869-486b-a457-04b90e756602 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.506 187212 INFO nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Creating config drive at /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.config
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.511 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl47gizh9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.652 187212 DEBUG oslo_concurrency.processutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl47gizh9" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:54 compute-0 kernel: tape0274344-58: entered promiscuous mode
Dec 05 12:12:54 compute-0 systemd-udevd[235804]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:12:54 compute-0 NetworkManager[55691]: <info>  [1764936774.7477] manager: (tape0274344-58): new Tun device (/org/freedesktop/NetworkManager/Devices/349)
Dec 05 12:12:54 compute-0 ovn_controller[95610]: 2025-12-05T12:12:54Z|00903|binding|INFO|Claiming lport e0274344-5869-486b-a457-04b90e756602 for this chassis.
Dec 05 12:12:54 compute-0 ovn_controller[95610]: 2025-12-05T12:12:54Z|00904|binding|INFO|e0274344-5869-486b-a457-04b90e756602: Claiming fa:16:3e:0f:99:d7 10.100.0.13
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.755 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:54 compute-0 NetworkManager[55691]: <info>  [1764936774.7630] device (tape0274344-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:12:54 compute-0 NetworkManager[55691]: <info>  [1764936774.7640] device (tape0274344-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.795 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:99:d7 10.100.0.13'], port_security=['fa:16:3e:0f:99:d7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '95b4dafa-871e-42c8-8fb1-162d6b45f3aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65234c0b-56d7-44c4-8665-41785bc53beb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c82a7ffe1fe49a88eb03f0d89c9629e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '89874479-4098-428c-b848-ebbb8d8ab5f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=787cd8eb-fb25-4820-b1a7-6d591e54c07a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e0274344-5869-486b-a457-04b90e756602) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.796 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e0274344-5869-486b-a457-04b90e756602 in datapath 65234c0b-56d7-44c4-8665-41785bc53beb bound to our chassis
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.799 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 65234c0b-56d7-44c4-8665-41785bc53beb
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.808 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[22e2d7f9-904e-4d30-a60c-ca96ef8922e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.809 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap65234c0b-51 in ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.811 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap65234c0b-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.812 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[152d0ca2-6555-4249-b902-aff5987d6719]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.812 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[94fd96bd-e41e-4ac3-85e8-faa1e6d5483f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.823 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8f19d68a-9b56-4374-9eae-aaa3f7cf0c9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 systemd-machined[153543]: New machine qemu-104-instance-0000005b.
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.842 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:54 compute-0 ovn_controller[95610]: 2025-12-05T12:12:54Z|00905|binding|INFO|Setting lport e0274344-5869-486b-a457-04b90e756602 ovn-installed in OVS
Dec 05 12:12:54 compute-0 ovn_controller[95610]: 2025-12-05T12:12:54Z|00906|binding|INFO|Setting lport e0274344-5869-486b-a457-04b90e756602 up in Southbound
Dec 05 12:12:54 compute-0 nova_compute[187208]: 2025-12-05 12:12:54.846 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.849 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[98f35fa2-0f54-4397-bbb1-b37a40e24a55]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 systemd[1]: Started Virtual Machine qemu-104-instance-0000005b.
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.879 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e6d38e1b-864d-49cb-a21e-8bbede141451]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.884 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb42177-bd70-4fb1-b589-026dc61931d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 NetworkManager[55691]: <info>  [1764936774.8858] manager: (tap65234c0b-50): new Veth device (/org/freedesktop/NetworkManager/Devices/350)
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.891 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.918 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[018cde3c-0c4a-4853-a906-f132efb11e62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.921 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[01f06c1e-8370-420a-8b89-58f4073e788d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 NetworkManager[55691]: <info>  [1764936774.9442] device (tap65234c0b-50): carrier: link connected
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.948 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[deab7e5e-02b9-4ef7-993e-7d92aa54545f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.966 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6af9e7f4-8f72-4504-a0d2-b1d640086e3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65234c0b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:c4:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 248], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415550, 'reachable_time': 17461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235883, 'error': None, 'target': 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.980 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8eded58f-22db-49c3-94bf-0b03eeeb8efb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:c4be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415550, 'tstamp': 415550}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235884, 'error': None, 'target': 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:54.996 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5420adfe-cb7b-43ac-8589-2d2f0b111b1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65234c0b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:c4:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 248], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415550, 'reachable_time': 17461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235885, 'error': None, 'target': 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.027 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0610ec18-6abf-4247-8249-a76744afdd75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.091 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1b9848-fb3f-4560-9e89-3c610526f815]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.092 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65234c0b-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.093 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.093 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65234c0b-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:55 compute-0 kernel: tap65234c0b-50: entered promiscuous mode
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:55 compute-0 NetworkManager[55691]: <info>  [1764936775.1034] manager: (tap65234c0b-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.106 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap65234c0b-50, col_values=(('external_ids', {'iface-id': 'a30d2e0e-09db-4ced-93ec-e9e2d828cf95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.107 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:55 compute-0 ovn_controller[95610]: 2025-12-05T12:12:55Z|00907|binding|INFO|Releasing lport a30d2e0e-09db-4ced-93ec-e9e2d828cf95 from this chassis (sb_readonly=1)
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.122 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.124 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/65234c0b-56d7-44c4-8665-41785bc53beb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/65234c0b-56d7-44c4-8665-41785bc53beb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.125 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6b005297-b5bd-4716-8a9a-406deb3dfa97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.126 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-65234c0b-56d7-44c4-8665-41785bc53beb
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/65234c0b-56d7-44c4-8665-41785bc53beb.pid.haproxy
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 65234c0b-56d7-44c4-8665-41785bc53beb
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:12:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:55.126 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'env', 'PROCESS_TAG=haproxy-65234c0b-56d7-44c4-8665-41785bc53beb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/65234c0b-56d7-44c4-8665-41785bc53beb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.581 187212 DEBUG nova.compute.manager [req-245c367c-e061-436d-ba39-15992119b5c8 req-cf1f2573-2d22-44f4-9615-511e0c029b04 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-vif-plugged-0e7829e3-325a-430d-898f-510a4c544ffa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.582 187212 DEBUG oslo_concurrency.lockutils [req-245c367c-e061-436d-ba39-15992119b5c8 req-cf1f2573-2d22-44f4-9615-511e0c029b04 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.583 187212 DEBUG oslo_concurrency.lockutils [req-245c367c-e061-436d-ba39-15992119b5c8 req-cf1f2573-2d22-44f4-9615-511e0c029b04 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.583 187212 DEBUG oslo_concurrency.lockutils [req-245c367c-e061-436d-ba39-15992119b5c8 req-cf1f2573-2d22-44f4-9615-511e0c029b04 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.583 187212 DEBUG nova.compute.manager [req-245c367c-e061-436d-ba39-15992119b5c8 req-cf1f2573-2d22-44f4-9615-511e0c029b04 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] No event matching network-vif-plugged-0e7829e3-325a-430d-898f-510a4c544ffa in dict_keys([('network-vif-plugged', '961ee213-955e-471f-8cb4-ad0d3a82285c')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.583 187212 WARNING nova.compute.manager [req-245c367c-e061-436d-ba39-15992119b5c8 req-cf1f2573-2d22-44f4-9615-511e0c029b04 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received unexpected event network-vif-plugged-0e7829e3-325a-430d-898f-510a4c544ffa for instance with vm_state building and task_state spawning.
Dec 05 12:12:55 compute-0 podman[235916]: 2025-12-05 12:12:55.49631896 +0000 UTC m=+0.021028725 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.607 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936775.6069129, 95b4dafa-871e-42c8-8fb1-162d6b45f3aa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.608 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] VM Started (Lifecycle Event)
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.630 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.635 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936775.607099, 95b4dafa-871e-42c8-8fb1-162d6b45f3aa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.636 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] VM Paused (Lifecycle Event)
Dec 05 12:12:55 compute-0 podman[235916]: 2025-12-05 12:12:55.642851975 +0000 UTC m=+0.167561720 container create f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.656 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.661 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:55 compute-0 systemd[1]: Started libpod-conmon-f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54.scope.
Dec 05 12:12:55 compute-0 nova_compute[187208]: 2025-12-05 12:12:55.697 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:12:55 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:12:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e955ad4e0dfc4b80cb3a7e139e28478d43ec17cc197a0c9cc5c92873a01e021f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:12:55 compute-0 podman[235916]: 2025-12-05 12:12:55.894223479 +0000 UTC m=+0.418933224 container init f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:12:55 compute-0 podman[235916]: 2025-12-05 12:12:55.903747092 +0000 UTC m=+0.428456837 container start f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 05 12:12:55 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[235938]: [NOTICE]   (235943) : New worker (235945) forked
Dec 05 12:12:55 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[235938]: [NOTICE]   (235943) : Loading success.
Dec 05 12:12:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:56.483 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:f7:85 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-16b1f5d3-c1d7-4b48-b03c-62aa528fa1b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16b1f5d3-c1d7-4b48-b03c-62aa528fa1b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ff94c302f4541f9bdb0a79ab2b69a76', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ba72e37-0b76-41ce-ac4e-c4f740ad652c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6a74d978-6996-41ef-8449-2d4850d0fd4d) old=Port_Binding(mac=['fa:16:3e:33:f7:85 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-16b1f5d3-c1d7-4b48-b03c-62aa528fa1b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16b1f5d3-c1d7-4b48-b03c-62aa528fa1b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ff94c302f4541f9bdb0a79ab2b69a76', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:12:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:56.484 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6a74d978-6996-41ef-8449-2d4850d0fd4d in datapath 16b1f5d3-c1d7-4b48-b03c-62aa528fa1b0 updated
Dec 05 12:12:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:56.486 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16b1f5d3-c1d7-4b48-b03c-62aa528fa1b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:12:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:12:56.487 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2ae102-9b74-41df-92b5-8e2676baf4eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.542 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.543 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.565 187212 DEBUG nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.651 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.651 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.661 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.661 187212 INFO nova.compute.claims [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.751 187212 DEBUG nova.network.neutron [-] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.778 187212 INFO nova.compute.manager [-] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Took 2.70 seconds to deallocate network for instance.
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.846 187212 DEBUG oslo_concurrency.lockutils [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.946 187212 DEBUG nova.compute.provider_tree [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.965 187212 DEBUG nova.scheduler.client.report [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.989 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.990 187212 DEBUG nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:12:56 compute-0 nova_compute[187208]: 2025-12-05 12:12:56.992 187212 DEBUG oslo_concurrency.lockutils [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.056 187212 DEBUG nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.056 187212 DEBUG nova.network.neutron [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.078 187212 INFO nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.102 187212 DEBUG nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.202 187212 DEBUG nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.204 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.204 187212 INFO nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Creating image(s)
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.205 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.205 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.206 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.230 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.259 187212 INFO nova.compute.manager [None req-89222f44-fadb-4ec6-ad4c-da6da6f6c226 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Unrescuing
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.260 187212 DEBUG oslo_concurrency.lockutils [None req-89222f44-fadb-4ec6-ad4c-da6da6f6c226 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.261 187212 DEBUG oslo_concurrency.lockutils [None req-89222f44-fadb-4ec6-ad4c-da6da6f6c226 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquired lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.261 187212 DEBUG nova.network.neutron [None req-89222f44-fadb-4ec6-ad4c-da6da6f6c226 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.268 187212 DEBUG nova.compute.provider_tree [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.285 187212 DEBUG nova.scheduler.client.report [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.302 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.303 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.303 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.316 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.354 187212 DEBUG oslo_concurrency.lockutils [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.412 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.413 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.449 187212 INFO nova.compute.manager [None req-f8165e2b-dd80-4c77-8714-27985b616aa6 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Unpausing
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.451 187212 DEBUG nova.objects.instance [None req-f8165e2b-dd80-4c77-8714-27985b616aa6 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'flavor' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.464 187212 DEBUG nova.policy [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.480 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk 1073741824" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.480 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.481 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.531 187212 INFO nova.scheduler.client.report [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Deleted allocations for instance f3769524-43d7-4c3b-be59-18bf7af73e18
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.538 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936777.5378323, 28e48516-8665-4d98-a92d-c84b7da9a284 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.539 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Resumed (Lifecycle Event)
Dec 05 12:12:57 compute-0 virtqemud[186841]: argument unsupported: QEMU guest agent is not configured
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.543 187212 DEBUG nova.virt.libvirt.guest [None req-f8165e2b-dd80-4c77-8714-27985b616aa6 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.543 187212 DEBUG nova.compute.manager [None req-f8165e2b-dd80-4c77-8714-27985b616aa6 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.555 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.556 187212 DEBUG nova.virt.disk.api [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.556 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.582 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.590 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.626 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.627 187212 DEBUG nova.virt.disk.api [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.627 187212 DEBUG nova.objects.instance [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.633 187212 DEBUG nova.compute.manager [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-vif-plugged-961ee213-955e-471f-8cb4-ad0d3a82285c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.634 187212 DEBUG oslo_concurrency.lockutils [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.634 187212 DEBUG oslo_concurrency.lockutils [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.635 187212 DEBUG oslo_concurrency.lockutils [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.635 187212 DEBUG nova.compute.manager [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Processing event network-vif-plugged-961ee213-955e-471f-8cb4-ad0d3a82285c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.635 187212 DEBUG nova.compute.manager [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-unplugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.635 187212 DEBUG oslo_concurrency.lockutils [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.636 187212 DEBUG oslo_concurrency.lockutils [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.636 187212 DEBUG oslo_concurrency.lockutils [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.636 187212 DEBUG nova.compute.manager [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] No waiting events found dispatching network-vif-unplugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.636 187212 WARNING nova.compute.manager [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received unexpected event network-vif-unplugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for instance with vm_state rescued and task_state unrescuing.
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.637 187212 DEBUG nova.compute.manager [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.637 187212 DEBUG oslo_concurrency.lockutils [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.637 187212 DEBUG oslo_concurrency.lockutils [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.638 187212 DEBUG oslo_concurrency.lockutils [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.638 187212 DEBUG nova.compute.manager [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] No waiting events found dispatching network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.638 187212 WARNING nova.compute.manager [req-60fe3aa9-1f4d-4507-8a47-9df933ae424d req-0d4bc3a4-24b4-4c77-a69b-47cc1498eac4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received unexpected event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for instance with vm_state rescued and task_state unrescuing.
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.640 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Instance event wait completed in 12 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.646 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.651 187212 INFO nova.virt.libvirt.driver [-] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Instance spawned successfully.
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.652 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.705 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.705 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.706 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.706 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.707 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.707 187212 DEBUG nova.virt.libvirt.driver [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.775 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.776 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Ensure instance console log exists: /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.776 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.776 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.777 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.786 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] During sync_power_state the instance has a pending task (unpausing). Skip.
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.787 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936777.6453297, 846c0e55-1620-4c7a-9792-d4f5f0d728d8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:12:57 compute-0 nova_compute[187208]: 2025-12-05 12:12:57.787 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] VM Resumed (Lifecycle Event)
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.109 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.112 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.128 187212 INFO nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Took 30.22 seconds to spawn the instance on the hypervisor.
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.128 187212 DEBUG nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.132 187212 DEBUG oslo_concurrency.lockutils [None req-987fb3ec-a936-4f4d-864e-a3a87f769da7 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "f3769524-43d7-4c3b-be59-18bf7af73e18" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.157 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.207 187212 INFO nova.compute.manager [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Took 30.96 seconds to build instance.
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.234 187212 DEBUG oslo_concurrency.lockutils [None req-a867d516-568e-4bb6-802c-f7932ce491a0 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 31.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.299 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.959 187212 DEBUG nova.network.neutron [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Updated VIF entry in instance network info cache for port e0274344-5869-486b-a457-04b90e756602. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.960 187212 DEBUG nova.network.neutron [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Updating instance_info_cache with network_info: [{"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:12:58 compute-0 nova_compute[187208]: 2025-12-05 12:12:58.968 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:12:59 compute-0 nova_compute[187208]: 2025-12-05 12:12:59.053 187212 DEBUG oslo_concurrency.lockutils [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-95b4dafa-871e-42c8-8fb1-162d6b45f3aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:12:59 compute-0 nova_compute[187208]: 2025-12-05 12:12:59.053 187212 DEBUG nova.compute.manager [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-vif-plugged-961ee213-955e-471f-8cb4-ad0d3a82285c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:12:59 compute-0 nova_compute[187208]: 2025-12-05 12:12:59.054 187212 DEBUG oslo_concurrency.lockutils [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:12:59 compute-0 nova_compute[187208]: 2025-12-05 12:12:59.054 187212 DEBUG oslo_concurrency.lockutils [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:12:59 compute-0 nova_compute[187208]: 2025-12-05 12:12:59.054 187212 DEBUG oslo_concurrency.lockutils [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:12:59 compute-0 nova_compute[187208]: 2025-12-05 12:12:59.054 187212 DEBUG nova.compute.manager [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] No waiting events found dispatching network-vif-plugged-961ee213-955e-471f-8cb4-ad0d3a82285c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:12:59 compute-0 nova_compute[187208]: 2025-12-05 12:12:59.055 187212 WARNING nova.compute.manager [req-8fcd1080-0a82-41ce-9fc9-8313a2a7ce5c req-d20c3ca3-b1a7-4084-a7a6-09c61aaf0238 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received unexpected event network-vif-plugged-961ee213-955e-471f-8cb4-ad0d3a82285c for instance with vm_state building and task_state spawning.
Dec 05 12:12:59 compute-0 podman[235970]: 2025-12-05 12:12:59.216655193 +0000 UTC m=+0.066459788 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.172 187212 DEBUG nova.compute.manager [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.173 187212 DEBUG oslo_concurrency.lockutils [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.173 187212 DEBUG oslo_concurrency.lockutils [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.174 187212 DEBUG oslo_concurrency.lockutils [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.174 187212 DEBUG nova.compute.manager [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Processing event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.174 187212 DEBUG nova.compute.manager [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.174 187212 DEBUG oslo_concurrency.lockutils [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.175 187212 DEBUG oslo_concurrency.lockutils [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.175 187212 DEBUG oslo_concurrency.lockutils [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.175 187212 DEBUG nova.compute.manager [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] No waiting events found dispatching network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.175 187212 WARNING nova.compute.manager [req-a6189da4-104c-493e-9591-ff33c133ffa1 req-65bb990c-9ade-413d-9b4a-a50317c50ee2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received unexpected event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 for instance with vm_state building and task_state spawning.
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.176 187212 DEBUG nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.179 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936780.179347, 95b4dafa-871e-42c8-8fb1-162d6b45f3aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.179 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] VM Resumed (Lifecycle Event)
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.181 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.185 187212 INFO nova.virt.libvirt.driver [-] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Instance spawned successfully.
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.185 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.207 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.213 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.217 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.217 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.218 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.218 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.219 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.219 187212 DEBUG nova.virt.libvirt.driver [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.253 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.285 187212 INFO nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Took 13.02 seconds to spawn the instance on the hypervisor.
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.286 187212 DEBUG nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.354 187212 INFO nova.compute.manager [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Took 13.66 seconds to build instance.
Dec 05 12:13:00 compute-0 nova_compute[187208]: 2025-12-05 12:13:00.372 187212 DEBUG oslo_concurrency.lockutils [None req-074e8c82-3b1a-491a-9fb6-133709e8b33f 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:01 compute-0 nova_compute[187208]: 2025-12-05 12:13:01.310 187212 DEBUG nova.compute.manager [req-03af94fe-81be-417c-84e5-3b29bfc18b4e req-8834272e-91b6-4ed3-8565-490cf3e43283 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Received event network-vif-deleted-ecb20c89-e04b-4dcb-9b67-08705004bcf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.209 187212 DEBUG nova.network.neutron [None req-89222f44-fadb-4ec6-ad4c-da6da6f6c226 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Updating instance_info_cache with network_info: [{"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.229 187212 DEBUG oslo_concurrency.lockutils [None req-89222f44-fadb-4ec6-ad4c-da6da6f6c226 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Releasing lock "refresh_cache-b661b497-acb9-4b26-8e26-7d0802bca8bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.231 187212 DEBUG nova.objects.instance [None req-89222f44-fadb-4ec6-ad4c-da6da6f6c226 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'flavor' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.256 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936767.254813, 7bf2559a-9191-4322-9e46-19761de59dc9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.256 187212 INFO nova.compute.manager [-] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] VM Stopped (Lifecycle Event)
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.277 187212 DEBUG nova.compute.manager [None req-87dbe3a1-dd85-40ea-b7ef-964876b8e60b - - - - - -] [instance: 7bf2559a-9191-4322-9e46-19761de59dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:02 compute-0 kernel: tapc7d6a93d-87 (unregistering): left promiscuous mode
Dec 05 12:13:02 compute-0 NetworkManager[55691]: <info>  [1764936782.2898] device (tapc7d6a93d-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.302 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:02 compute-0 ovn_controller[95610]: 2025-12-05T12:13:02Z|00908|binding|INFO|Releasing lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 from this chassis (sb_readonly=0)
Dec 05 12:13:02 compute-0 ovn_controller[95610]: 2025-12-05T12:13:02Z|00909|binding|INFO|Setting lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 down in Southbound
Dec 05 12:13:02 compute-0 ovn_controller[95610]: 2025-12-05T12:13:02Z|00910|binding|INFO|Removing iface tapc7d6a93d-87 ovn-installed in OVS
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.308 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.326 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:02.329 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:e5:61 10.100.0.8'], port_security=['fa:16:3e:17:e5:61 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '6', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:02.330 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 unbound from our chassis
Dec 05 12:13:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:02.332 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:13:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:02.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe11f69-c0c6-47c1-8322-4c7f8fa5b5c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:02 compute-0 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d00000058.scope: Deactivated successfully.
Dec 05 12:13:02 compute-0 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d00000058.scope: Consumed 11.347s CPU time.
Dec 05 12:13:02 compute-0 systemd-machined[153543]: Machine qemu-103-instance-00000058 terminated.
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.553 187212 INFO nova.virt.libvirt.driver [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance destroyed successfully.
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.553 187212 DEBUG nova.objects.instance [None req-89222f44-fadb-4ec6-ad4c-da6da6f6c226 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'numa_topology' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:02 compute-0 kernel: tapc7d6a93d-87: entered promiscuous mode
Dec 05 12:13:02 compute-0 systemd-udevd[235996]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:13:02 compute-0 NetworkManager[55691]: <info>  [1764936782.6683] manager: (tapc7d6a93d-87): new Tun device (/org/freedesktop/NetworkManager/Devices/352)
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.667 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:02 compute-0 ovn_controller[95610]: 2025-12-05T12:13:02Z|00911|binding|INFO|Claiming lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for this chassis.
Dec 05 12:13:02 compute-0 ovn_controller[95610]: 2025-12-05T12:13:02Z|00912|binding|INFO|c7d6a93d-8775-4c4e-9a60-bc8e87e5b310: Claiming fa:16:3e:17:e5:61 10.100.0.8
Dec 05 12:13:02 compute-0 NetworkManager[55691]: <info>  [1764936782.6777] device (tapc7d6a93d-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:13:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:02.677 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:e5:61 10.100.0.8'], port_security=['fa:16:3e:17:e5:61 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '6', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:02.678 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 bound to our chassis
Dec 05 12:13:02 compute-0 NetworkManager[55691]: <info>  [1764936782.6805] device (tapc7d6a93d-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:13:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:02.679 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:13:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:02.681 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf153ebd-6b01-4fbb-9322-90132ed22eee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:02 compute-0 ovn_controller[95610]: 2025-12-05T12:13:02Z|00913|binding|INFO|Setting lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 ovn-installed in OVS
Dec 05 12:13:02 compute-0 ovn_controller[95610]: 2025-12-05T12:13:02Z|00914|binding|INFO|Setting lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 up in Southbound
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:02 compute-0 nova_compute[187208]: 2025-12-05 12:13:02.694 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:02 compute-0 systemd-machined[153543]: New machine qemu-105-instance-00000058.
Dec 05 12:13:02 compute-0 systemd[1]: Started Virtual Machine qemu-105-instance-00000058.
Dec 05 12:13:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:03.017 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:03.018 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:03.020 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.263 187212 DEBUG nova.network.neutron [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Successfully created port: 5316adeb-5a49-4a58-b997-f132a083ff13 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.302 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.362 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for b661b497-acb9-4b26-8e26-7d0802bca8bf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.363 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936783.3620782, b661b497-acb9-4b26-8e26-7d0802bca8bf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.363 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] VM Resumed (Lifecycle Event)
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.366 187212 DEBUG nova.compute.manager [None req-89222f44-fadb-4ec6-ad4c-da6da6f6c226 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.391 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.394 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.420 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] During sync_power_state the instance has a pending task (unrescuing). Skip.
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.421 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936783.3648546, b661b497-acb9-4b26-8e26-7d0802bca8bf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.421 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] VM Started (Lifecycle Event)
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.441 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.446 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:03 compute-0 nova_compute[187208]: 2025-12-05 12:13:03.970 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.562 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.563 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.581 187212 DEBUG nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.613 187212 DEBUG oslo_concurrency.lockutils [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.614 187212 DEBUG oslo_concurrency.lockutils [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.614 187212 DEBUG oslo_concurrency.lockutils [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.615 187212 DEBUG oslo_concurrency.lockutils [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.615 187212 DEBUG oslo_concurrency.lockutils [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.616 187212 INFO nova.compute.manager [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Terminating instance
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.618 187212 DEBUG nova.compute.manager [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:13:04 compute-0 kernel: tap0e7829e3-32 (unregistering): left promiscuous mode
Dec 05 12:13:04 compute-0 NetworkManager[55691]: <info>  [1764936784.6437] device (tap0e7829e3-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:04 compute-0 ovn_controller[95610]: 2025-12-05T12:13:04Z|00915|binding|INFO|Releasing lport 0e7829e3-325a-430d-898f-510a4c544ffa from this chassis (sb_readonly=0)
Dec 05 12:13:04 compute-0 ovn_controller[95610]: 2025-12-05T12:13:04Z|00916|binding|INFO|Setting lport 0e7829e3-325a-430d-898f-510a4c544ffa down in Southbound
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.674 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 ovn_controller[95610]: 2025-12-05T12:13:04Z|00917|binding|INFO|Removing iface tap0e7829e3-32 ovn-installed in OVS
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.678 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:04.684 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:97:61 10.100.0.232'], port_security=['fa:16:3e:87:97:61 10.100.0.232'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.232/24', 'neutron:device_id': '846c0e55-1620-4c7a-9792-d4f5f0d728d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29118b32-7f75-46f6-9be9-f833be642f99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f523b7fe-86fa-434f-ad90-764f0845cc18, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0e7829e3-325a-430d-898f-510a4c544ffa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:04.685 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0e7829e3-325a-430d-898f-510a4c544ffa in datapath 29118b32-7f75-46f6-9be9-f833be642f99 unbound from our chassis
Dec 05 12:13:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:04.688 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 29118b32-7f75-46f6-9be9-f833be642f99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.688 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:04.690 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[79442998-7a90-406c-aaec-bd06e780709b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:04.690 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99 namespace which is not needed anymore
Dec 05 12:13:04 compute-0 kernel: tap961ee213-95 (unregistering): left promiscuous mode
Dec 05 12:13:04 compute-0 NetworkManager[55691]: <info>  [1764936784.7008] device (tap961ee213-95): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.706 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.707 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.710 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 ovn_controller[95610]: 2025-12-05T12:13:04Z|00918|binding|INFO|Releasing lport 961ee213-955e-471f-8cb4-ad0d3a82285c from this chassis (sb_readonly=0)
Dec 05 12:13:04 compute-0 ovn_controller[95610]: 2025-12-05T12:13:04Z|00919|binding|INFO|Setting lport 961ee213-955e-471f-8cb4-ad0d3a82285c down in Southbound
Dec 05 12:13:04 compute-0 ovn_controller[95610]: 2025-12-05T12:13:04Z|00920|binding|INFO|Removing iface tap961ee213-95 ovn-installed in OVS
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.717 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.721 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.722 187212 INFO nova.compute.claims [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:13:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:04.727 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:9b:af 10.100.1.248'], port_security=['fa:16:3e:31:9b:af 10.100.1.248'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.248/24', 'neutron:device_id': '846c0e55-1620-4c7a-9792-d4f5f0d728d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7aededcaee54c4bbb7cba6007565f65', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fba61c9a-8063-483b-87c5-dc43f44e9c3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f1562571-290f-4ea3-be5a-e33bc5391f1b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=961ee213-955e-471f-8cb4-ad0d3a82285c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.730 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Dec 05 12:13:04 compute-0 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d0000005a.scope: Consumed 7.366s CPU time.
Dec 05 12:13:04 compute-0 systemd-machined[153543]: Machine qemu-102-instance-0000005a terminated.
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.764 187212 DEBUG nova.compute.manager [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.765 187212 DEBUG oslo_concurrency.lockutils [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.766 187212 DEBUG oslo_concurrency.lockutils [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.766 187212 DEBUG oslo_concurrency.lockutils [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.766 187212 DEBUG nova.compute.manager [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] No waiting events found dispatching network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.767 187212 WARNING nova.compute.manager [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received unexpected event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for instance with vm_state active and task_state None.
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.767 187212 DEBUG nova.compute.manager [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.767 187212 DEBUG oslo_concurrency.lockutils [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.767 187212 DEBUG oslo_concurrency.lockutils [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.768 187212 DEBUG oslo_concurrency.lockutils [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.768 187212 DEBUG nova.compute.manager [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] No waiting events found dispatching network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.768 187212 WARNING nova.compute.manager [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received unexpected event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for instance with vm_state active and task_state None.
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.768 187212 DEBUG nova.compute.manager [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-unplugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.768 187212 DEBUG oslo_concurrency.lockutils [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.769 187212 DEBUG oslo_concurrency.lockutils [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.769 187212 DEBUG oslo_concurrency.lockutils [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.769 187212 DEBUG nova.compute.manager [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] No waiting events found dispatching network-vif-unplugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.769 187212 WARNING nova.compute.manager [req-55735695-76ff-4463-a0b8-c89cbc8c59f8 req-2a456e9c-f697-4490-9dbb-4c408a3db7db 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received unexpected event network-vif-unplugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for instance with vm_state active and task_state None.
Dec 05 12:13:04 compute-0 NetworkManager[55691]: <info>  [1764936784.8409] manager: (tap0e7829e3-32): new Tun device (/org/freedesktop/NetworkManager/Devices/353)
Dec 05 12:13:04 compute-0 neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99[235535]: [NOTICE]   (235539) : haproxy version is 2.8.14-c23fe91
Dec 05 12:13:04 compute-0 neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99[235535]: [NOTICE]   (235539) : path to executable is /usr/sbin/haproxy
Dec 05 12:13:04 compute-0 neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99[235535]: [WARNING]  (235539) : Exiting Master process...
Dec 05 12:13:04 compute-0 neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99[235535]: [WARNING]  (235539) : Exiting Master process...
Dec 05 12:13:04 compute-0 neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99[235535]: [ALERT]    (235539) : Current worker (235541) exited with code 143 (Terminated)
Dec 05 12:13:04 compute-0 neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99[235535]: [WARNING]  (235539) : All workers exited. Exiting... (0)
Dec 05 12:13:04 compute-0 NetworkManager[55691]: <info>  [1764936784.8534] manager: (tap961ee213-95): new Tun device (/org/freedesktop/NetworkManager/Devices/354)
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.854 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 systemd[1]: libpod-4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6.scope: Deactivated successfully.
Dec 05 12:13:04 compute-0 podman[236074]: 2025-12-05 12:13:04.859917291 +0000 UTC m=+0.054394833 container died 4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.873 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.908 187212 INFO nova.virt.libvirt.driver [-] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Instance destroyed successfully.
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.909 187212 DEBUG nova.objects.instance [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lazy-loading 'resources' on Instance uuid 846c0e55-1620-4c7a-9792-d4f5f0d728d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6-userdata-shm.mount: Deactivated successfully.
Dec 05 12:13:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-fad4c702a0e082abb762dbdee375854cb8e067772014c4935c6133443e58ea4a-merged.mount: Deactivated successfully.
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.915 187212 DEBUG oslo_concurrency.lockutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.916 187212 DEBUG oslo_concurrency.lockutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.916 187212 INFO nova.compute.manager [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Rebooting instance
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.927 187212 DEBUG nova.virt.libvirt.vif [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-729713724',display_name='tempest-ServersTestMultiNic-server-729713724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-729713724',id=90,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:12:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-zskavgn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:12:58Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=846c0e55-1620-4c7a-9792-d4f5f0d728d8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.928 187212 DEBUG nova.network.os_vif_util [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "0e7829e3-325a-430d-898f-510a4c544ffa", "address": "fa:16:3e:87:97:61", "network": {"id": "29118b32-7f75-46f6-9be9-f833be642f99", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1124188024", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e7829e3-32", "ovs_interfaceid": "0e7829e3-325a-430d-898f-510a4c544ffa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.928 187212 DEBUG nova.network.os_vif_util [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:97:61,bridge_name='br-int',has_traffic_filtering=True,id=0e7829e3-325a-430d-898f-510a4c544ffa,network=Network(29118b32-7f75-46f6-9be9-f833be642f99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e7829e3-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.929 187212 DEBUG os_vif [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:97:61,bridge_name='br-int',has_traffic_filtering=True,id=0e7829e3-325a-430d-898f-510a4c544ffa,network=Network(29118b32-7f75-46f6-9be9-f833be642f99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e7829e3-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:13:04 compute-0 podman[236074]: 2025-12-05 12:13:04.930678447 +0000 UTC m=+0.125155979 container cleanup 4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.932 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.933 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e7829e3-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 systemd[1]: libpod-conmon-4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6.scope: Deactivated successfully.
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.938 187212 DEBUG oslo_concurrency.lockutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "refresh_cache-95b4dafa-871e-42c8-8fb1-162d6b45f3aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.939 187212 DEBUG oslo_concurrency.lockutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquired lock "refresh_cache-95b4dafa-871e-42c8-8fb1-162d6b45f3aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.940 187212 DEBUG nova.network.neutron [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.945 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.952 187212 INFO os_vif [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:97:61,bridge_name='br-int',has_traffic_filtering=True,id=0e7829e3-325a-430d-898f-510a4c544ffa,network=Network(29118b32-7f75-46f6-9be9-f833be642f99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e7829e3-32')
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.953 187212 DEBUG nova.virt.libvirt.vif [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-729713724',display_name='tempest-ServersTestMultiNic-server-729713724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-729713724',id=90,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:12:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f7aededcaee54c4bbb7cba6007565f65',ramdisk_id='',reservation_id='r-zskavgn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1621990639',owner_user_name='tempest-ServersTestMultiNic-1621990639-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:12:58Z,user_data=None,user_id='430719002c284cd28237859ea6061eef',uuid=846c0e55-1620-4c7a-9792-d4f5f0d728d8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.953 187212 DEBUG nova.network.os_vif_util [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converting VIF {"id": "961ee213-955e-471f-8cb4-ad0d3a82285c", "address": "fa:16:3e:31:9b:af", "network": {"id": "bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2094839326", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7aededcaee54c4bbb7cba6007565f65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap961ee213-95", "ovs_interfaceid": "961ee213-955e-471f-8cb4-ad0d3a82285c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.954 187212 DEBUG nova.network.os_vif_util [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:9b:af,bridge_name='br-int',has_traffic_filtering=True,id=961ee213-955e-471f-8cb4-ad0d3a82285c,network=Network(bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap961ee213-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.954 187212 DEBUG os_vif [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9b:af,bridge_name='br-int',has_traffic_filtering=True,id=961ee213-955e-471f-8cb4-ad0d3a82285c,network=Network(bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap961ee213-95') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.955 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.956 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap961ee213-95, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.965 187212 INFO os_vif [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9b:af,bridge_name='br-int',has_traffic_filtering=True,id=961ee213-955e-471f-8cb4-ad0d3a82285c,network=Network(bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap961ee213-95')
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.966 187212 INFO nova.virt.libvirt.driver [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Deleting instance files /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8_del
Dec 05 12:13:04 compute-0 nova_compute[187208]: 2025-12-05 12:13:04.967 187212 INFO nova.virt.libvirt.driver [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Deletion of /var/lib/nova/instances/846c0e55-1620-4c7a-9792-d4f5f0d728d8_del complete
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.010 187212 DEBUG nova.compute.provider_tree [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.014 187212 INFO nova.compute.manager [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.015 187212 DEBUG oslo.service.loopingcall [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.015 187212 DEBUG nova.compute.manager [-] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.016 187212 DEBUG nova.network.neutron [-] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.032 187212 DEBUG nova.scheduler.client.report [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.052 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.053 187212 DEBUG nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:13:05 compute-0 podman[236119]: 2025-12-05 12:13:05.072560959 +0000 UTC m=+0.120283779 container remove 4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.078 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6ca93b00-48b0-47e3-9327-fed83d355921]: (4, ('Fri Dec  5 12:13:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99 (4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6)\n4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6\nFri Dec  5 12:13:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99 (4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6)\n4a9fcda3f2b4b6984798903893dd2ed0752e33acba75994128e0a95079b735d6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.080 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e96e8cc7-eed2-4117-8aae-f4c348d6fcb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.081 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29118b32-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.083 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:05 compute-0 kernel: tap29118b32-70: left promiscuous mode
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.097 187212 DEBUG nova.compute.manager [req-2bf7aac6-db7d-40e5-ac30-09c0a5c7c7f7 req-94a0e2ba-0f63-42d3-abdb-23e36444d924 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-vif-unplugged-0e7829e3-325a-430d-898f-510a4c544ffa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.098 187212 DEBUG oslo_concurrency.lockutils [req-2bf7aac6-db7d-40e5-ac30-09c0a5c7c7f7 req-94a0e2ba-0f63-42d3-abdb-23e36444d924 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.099 187212 DEBUG oslo_concurrency.lockutils [req-2bf7aac6-db7d-40e5-ac30-09c0a5c7c7f7 req-94a0e2ba-0f63-42d3-abdb-23e36444d924 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.099 187212 DEBUG oslo_concurrency.lockutils [req-2bf7aac6-db7d-40e5-ac30-09c0a5c7c7f7 req-94a0e2ba-0f63-42d3-abdb-23e36444d924 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.099 187212 DEBUG nova.compute.manager [req-2bf7aac6-db7d-40e5-ac30-09c0a5c7c7f7 req-94a0e2ba-0f63-42d3-abdb-23e36444d924 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] No waiting events found dispatching network-vif-unplugged-0e7829e3-325a-430d-898f-510a4c544ffa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.100 187212 DEBUG nova.compute.manager [req-2bf7aac6-db7d-40e5-ac30-09c0a5c7c7f7 req-94a0e2ba-0f63-42d3-abdb-23e36444d924 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-vif-unplugged-0e7829e3-325a-430d-898f-510a4c544ffa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.103 187212 DEBUG nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.104 187212 DEBUG nova.network.neutron [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[98ab827b-8dff-43b6-99fc-a2fd1d6ab506]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.111 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.125 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce3b2dea-12d3-43af-b431-d575fcc6e9a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.127 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62d2e8bc-d704-4cfd-bdb2-7fdad0ca3429]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.127 187212 INFO nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.144 187212 DEBUG nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.148 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[125d2dea-bc85-4934-944c-da65a27cdf12]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414567, 'reachable_time': 17374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236131, 'error': None, 'target': 'ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d29118b32\x2d7f75\x2d46f6\x2d9be9\x2df833be642f99.mount: Deactivated successfully.
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.153 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-29118b32-7f75-46f6-9be9-f833be642f99 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.153 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[ff799a09-d631-4f89-b6e6-ba5d53b74f54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.154 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 961ee213-955e-471f-8cb4-ad0d3a82285c in datapath bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb unbound from our chassis
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.157 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.159 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[75682df6-2545-43fa-b9ef-59cd2b09c8cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.159 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb namespace which is not needed anymore
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.231 187212 DEBUG nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.232 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.233 187212 INFO nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Creating image(s)
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.234 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "/var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.234 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.235 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.250 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.331 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.332 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.333 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:05 compute-0 neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb[235614]: [NOTICE]   (235639) : haproxy version is 2.8.14-c23fe91
Dec 05 12:13:05 compute-0 neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb[235614]: [NOTICE]   (235639) : path to executable is /usr/sbin/haproxy
Dec 05 12:13:05 compute-0 neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb[235614]: [WARNING]  (235639) : Exiting Master process...
Dec 05 12:13:05 compute-0 neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb[235614]: [ALERT]    (235639) : Current worker (235641) exited with code 143 (Terminated)
Dec 05 12:13:05 compute-0 neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb[235614]: [WARNING]  (235639) : All workers exited. Exiting... (0)
Dec 05 12:13:05 compute-0 systemd[1]: libpod-6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8.scope: Deactivated successfully.
Dec 05 12:13:05 compute-0 podman[236149]: 2025-12-05 12:13:05.346596592 +0000 UTC m=+0.094084951 container died 6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.347 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.393 187212 DEBUG nova.policy [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.405 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.410 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:05 compute-0 podman[236174]: 2025-12-05 12:13:05.528766629 +0000 UTC m=+0.147282089 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 05 12:13:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8-userdata-shm.mount: Deactivated successfully.
Dec 05 12:13:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-12e3b2ef5267c5c9a54e90389de405eff9410b180f781d0c8a7327cc11d9cf1c-merged.mount: Deactivated successfully.
Dec 05 12:13:05 compute-0 podman[236166]: 2025-12-05 12:13:05.540845408 +0000 UTC m=+0.168101351 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Dec 05 12:13:05 compute-0 podman[236149]: 2025-12-05 12:13:05.564945075 +0000 UTC m=+0.312433474 container cleanup 6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.564 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk 1073741824" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.566 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.566 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:05 compute-0 systemd[1]: libpod-conmon-6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8.scope: Deactivated successfully.
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.638 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.639 187212 DEBUG nova.virt.disk.api [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Checking if we can resize image /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.639 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.719 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.721 187212 DEBUG nova.virt.disk.api [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Cannot resize image /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.721 187212 DEBUG nova.objects.instance [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'migration_context' on Instance uuid a85836ac-2737-428e-85a9-ffd8bd60f4a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.739 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.739 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Ensure instance console log exists: /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.740 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.740 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.740 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:05 compute-0 podman[236220]: 2025-12-05 12:13:05.91860075 +0000 UTC m=+0.324650947 container remove 6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.930 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c0efae2c-8cf9-429a-9a7b-5c8990e34a84]: (4, ('Fri Dec  5 12:13:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb (6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8)\n6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8\nFri Dec  5 12:13:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb (6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8)\n6caad3eb866714b2cabb69d3ee8f43075c35fca614a7f7b86baaa60844329cf8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.932 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5e085130-ebcb-4cc3-bcd7-02ef686d90db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.933 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfee25ff-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.936 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:05 compute-0 kernel: tapbfee25ff-e0: left promiscuous mode
Dec 05 12:13:05 compute-0 nova_compute[187208]: 2025-12-05 12:13:05.953 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.955 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ab1da9c9-8187-4e16-ad77-7bb9826afde5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.972 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e04f8b8b-c14b-49ab-b793-ab2a85993aaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.976 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cacf0d57-2c95-448b-8c7a-b24a468a3bd9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.993 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a7c2aa-692b-44dd-a429-50e3e1e15ca0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414694, 'reachable_time': 31097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236239, 'error': None, 'target': 'ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:05 compute-0 systemd[1]: run-netns-ovnmeta\x2dbfee25ff\x2de9ba\x2d4c5d\x2d8ff1\x2d92c7f67871bb.mount: Deactivated successfully.
Dec 05 12:13:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.998 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bfee25ff-e9ba-4c5d-8ff1-92c7f67871bb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:13:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:05.999 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8dfd7e-b2d9-4482-8bcf-d2ed614b1a8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:06 compute-0 nova_compute[187208]: 2025-12-05 12:13:06.282 187212 DEBUG nova.network.neutron [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Successfully created port: 3d342ae2-99f5-47b8-8c24-89dc69c89971 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:13:06 compute-0 nova_compute[187208]: 2025-12-05 12:13:06.941 187212 DEBUG nova.network.neutron [-] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:06 compute-0 nova_compute[187208]: 2025-12-05 12:13:06.963 187212 INFO nova.compute.manager [-] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Took 1.95 seconds to deallocate network for instance.
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.015 187212 DEBUG oslo_concurrency.lockutils [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.016 187212 DEBUG oslo_concurrency.lockutils [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.197 187212 DEBUG nova.network.neutron [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Successfully updated port: 5316adeb-5a49-4a58-b997-f132a083ff13 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.210 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.210 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.211 187212 DEBUG nova.network.neutron [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.230 187212 DEBUG nova.compute.provider_tree [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.248 187212 DEBUG nova.scheduler.client.report [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.269 187212 DEBUG oslo_concurrency.lockutils [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.304 187212 INFO nova.scheduler.client.report [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Deleted allocations for instance 846c0e55-1620-4c7a-9792-d4f5f0d728d8
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.339 187212 DEBUG nova.network.neutron [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Successfully updated port: 3d342ae2-99f5-47b8-8c24-89dc69c89971 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.361 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "refresh_cache-a85836ac-2737-428e-85a9-ffd8bd60f4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.361 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquired lock "refresh_cache-a85836ac-2737-428e-85a9-ffd8bd60f4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.361 187212 DEBUG nova.network.neutron [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.366 187212 DEBUG oslo_concurrency.lockutils [None req-8612a5ea-6302-4ce2-8a9d-d86e22fb7402 430719002c284cd28237859ea6061eef f7aededcaee54c4bbb7cba6007565f65 - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.501 187212 DEBUG nova.network.neutron [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.605 187212 DEBUG nova.network.neutron [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.779 187212 DEBUG nova.compute.manager [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.779 187212 DEBUG oslo_concurrency.lockutils [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.780 187212 DEBUG oslo_concurrency.lockutils [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.780 187212 DEBUG oslo_concurrency.lockutils [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.780 187212 DEBUG nova.compute.manager [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] No waiting events found dispatching network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.780 187212 WARNING nova.compute.manager [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received unexpected event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for instance with vm_state active and task_state None.
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.781 187212 DEBUG nova.compute.manager [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.781 187212 DEBUG oslo_concurrency.lockutils [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.781 187212 DEBUG oslo_concurrency.lockutils [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.781 187212 DEBUG oslo_concurrency.lockutils [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.782 187212 DEBUG nova.compute.manager [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] No waiting events found dispatching network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.782 187212 WARNING nova.compute.manager [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received unexpected event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for instance with vm_state active and task_state None.
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.782 187212 DEBUG nova.compute.manager [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.782 187212 DEBUG oslo_concurrency.lockutils [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.782 187212 DEBUG oslo_concurrency.lockutils [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.783 187212 DEBUG oslo_concurrency.lockutils [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.783 187212 DEBUG nova.compute.manager [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] No waiting events found dispatching network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.783 187212 WARNING nova.compute.manager [req-8d7db08d-cdce-4826-9163-1f4bdc685090 req-c129fdac-b64e-4805-88f7-be4047c1335e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received unexpected event network-vif-plugged-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 for instance with vm_state active and task_state None.
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.936 187212 DEBUG oslo_concurrency.lockutils [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.937 187212 DEBUG oslo_concurrency.lockutils [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.937 187212 DEBUG oslo_concurrency.lockutils [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.937 187212 DEBUG oslo_concurrency.lockutils [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.938 187212 DEBUG oslo_concurrency.lockutils [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.939 187212 INFO nova.compute.manager [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Terminating instance
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.940 187212 DEBUG nova.compute.manager [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:13:07 compute-0 kernel: tapc7d6a93d-87 (unregistering): left promiscuous mode
Dec 05 12:13:07 compute-0 NetworkManager[55691]: <info>  [1764936787.9619] device (tapc7d6a93d-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:07 compute-0 ovn_controller[95610]: 2025-12-05T12:13:07Z|00921|binding|INFO|Releasing lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 from this chassis (sb_readonly=0)
Dec 05 12:13:07 compute-0 ovn_controller[95610]: 2025-12-05T12:13:07Z|00922|binding|INFO|Setting lport c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 down in Southbound
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.973 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:07 compute-0 ovn_controller[95610]: 2025-12-05T12:13:07Z|00923|binding|INFO|Removing iface tapc7d6a93d-87 ovn-installed in OVS
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.975 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:07.979 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:e5:61 10.100.0.8'], port_security=['fa:16:3e:17:e5:61 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b497-acb9-4b26-8e26-7d0802bca8bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '8', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:07.980 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 unbound from our chassis
Dec 05 12:13:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:07.982 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:13:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:07.983 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[851862d5-c4a9-4429-83dd-b53191e46d94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:07 compute-0 nova_compute[187208]: 2025-12-05 12:13:07.989 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.000 187212 DEBUG nova.compute.manager [req-317daaf2-34f2-4feb-b6a4-4daa3a3f82fd req-f3159138-4625-4cb7-9e31-f2734532d5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-vif-plugged-0e7829e3-325a-430d-898f-510a4c544ffa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.001 187212 DEBUG oslo_concurrency.lockutils [req-317daaf2-34f2-4feb-b6a4-4daa3a3f82fd req-f3159138-4625-4cb7-9e31-f2734532d5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.002 187212 DEBUG oslo_concurrency.lockutils [req-317daaf2-34f2-4feb-b6a4-4daa3a3f82fd req-f3159138-4625-4cb7-9e31-f2734532d5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.002 187212 DEBUG oslo_concurrency.lockutils [req-317daaf2-34f2-4feb-b6a4-4daa3a3f82fd req-f3159138-4625-4cb7-9e31-f2734532d5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "846c0e55-1620-4c7a-9792-d4f5f0d728d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.002 187212 DEBUG nova.compute.manager [req-317daaf2-34f2-4feb-b6a4-4daa3a3f82fd req-f3159138-4625-4cb7-9e31-f2734532d5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] No waiting events found dispatching network-vif-plugged-0e7829e3-325a-430d-898f-510a4c544ffa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.003 187212 WARNING nova.compute.manager [req-317daaf2-34f2-4feb-b6a4-4daa3a3f82fd req-f3159138-4625-4cb7-9e31-f2734532d5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received unexpected event network-vif-plugged-0e7829e3-325a-430d-898f-510a4c544ffa for instance with vm_state deleted and task_state None.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.003 187212 DEBUG nova.compute.manager [req-317daaf2-34f2-4feb-b6a4-4daa3a3f82fd req-f3159138-4625-4cb7-9e31-f2734532d5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-vif-deleted-961ee213-955e-471f-8cb4-ad0d3a82285c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.004 187212 DEBUG nova.compute.manager [req-317daaf2-34f2-4feb-b6a4-4daa3a3f82fd req-f3159138-4625-4cb7-9e31-f2734532d5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Received event network-vif-deleted-0e7829e3-325a-430d-898f-510a4c544ffa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:08 compute-0 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d00000058.scope: Deactivated successfully.
Dec 05 12:13:08 compute-0 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d00000058.scope: Consumed 5.264s CPU time.
Dec 05 12:13:08 compute-0 systemd-machined[153543]: Machine qemu-105-instance-00000058 terminated.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.045 187212 DEBUG nova.network.neutron [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Updating instance_info_cache with network_info: [{"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.071 187212 DEBUG oslo_concurrency.lockutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Releasing lock "refresh_cache-95b4dafa-871e-42c8-8fb1-162d6b45f3aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.073 187212 DEBUG nova.compute.manager [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:08 compute-0 kernel: tape0274344-58 (unregistering): left promiscuous mode
Dec 05 12:13:08 compute-0 NetworkManager[55691]: <info>  [1764936788.2254] device (tape0274344-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.225 187212 INFO nova.virt.libvirt.driver [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Instance destroyed successfully.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.226 187212 DEBUG nova.objects.instance [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'resources' on Instance uuid b661b497-acb9-4b26-8e26-7d0802bca8bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:08 compute-0 ovn_controller[95610]: 2025-12-05T12:13:08Z|00924|binding|INFO|Releasing lport e0274344-5869-486b-a457-04b90e756602 from this chassis (sb_readonly=0)
Dec 05 12:13:08 compute-0 ovn_controller[95610]: 2025-12-05T12:13:08Z|00925|binding|INFO|Setting lport e0274344-5869-486b-a457-04b90e756602 down in Southbound
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.234 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 ovn_controller[95610]: 2025-12-05T12:13:08Z|00926|binding|INFO|Removing iface tape0274344-58 ovn-installed in OVS
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.237 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.246 187212 DEBUG nova.virt.libvirt.vif [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-604643112',display_name='tempest-ServerRescueTestJSON-server-604643112',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-604643112',id=88,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:12:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f73626a62534c97a06b6ec98d749111',ramdisk_id='',reservation_id='r-ri0fnqjs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-122605385',owner_user_name='tempest-ServerRescueTestJSON-122605385-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:13:03Z,user_data=None,user_id='d12bb49c0ca84e8dad933b49753c7b24',uuid=b661b497-acb9-4b26-8e26-7d0802bca8bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.247 187212 DEBUG nova.network.os_vif_util [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converting VIF {"id": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "address": "fa:16:3e:17:e5:61", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7d6a93d-87", "ovs_interfaceid": "c7d6a93d-8775-4c4e-9a60-bc8e87e5b310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.247 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:99:d7 10.100.0.13'], port_security=['fa:16:3e:0f:99:d7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '95b4dafa-871e-42c8-8fb1-162d6b45f3aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65234c0b-56d7-44c4-8665-41785bc53beb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c82a7ffe1fe49a88eb03f0d89c9629e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '89874479-4098-428c-b848-ebbb8d8ab5f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=787cd8eb-fb25-4820-b1a7-6d591e54c07a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e0274344-5869-486b-a457-04b90e756602) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.247 187212 DEBUG nova.network.os_vif_util [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:e5:61,bridge_name='br-int',has_traffic_filtering=True,id=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7d6a93d-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.248 187212 DEBUG os_vif [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:e5:61,bridge_name='br-int',has_traffic_filtering=True,id=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7d6a93d-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.248 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e0274344-5869-486b-a457-04b90e756602 in datapath 65234c0b-56d7-44c4-8665-41785bc53beb unbound from our chassis
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.249 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.250 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc7d6a93d-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.250 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 65234c0b-56d7-44c4-8665-41785bc53beb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.251 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.251 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0b2632ca-c72f-4b07-a66f-d1ba6fa958c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.251 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb namespace which is not needed anymore
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.252 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.255 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.257 187212 INFO os_vif [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:e5:61,bridge_name='br-int',has_traffic_filtering=True,id=c7d6a93d-8775-4c4e-9a60-bc8e87e5b310,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7d6a93d-87')
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.258 187212 INFO nova.virt.libvirt.driver [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Deleting instance files /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf_del
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.258 187212 INFO nova.virt.libvirt.driver [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Deletion of /var/lib/nova/instances/b661b497-acb9-4b26-8e26-7d0802bca8bf_del complete
Dec 05 12:13:08 compute-0 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Dec 05 12:13:08 compute-0 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d0000005b.scope: Consumed 8.842s CPU time.
Dec 05 12:13:08 compute-0 systemd-machined[153543]: Machine qemu-104-instance-0000005b terminated.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.303 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.322 187212 INFO nova.compute.manager [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.323 187212 DEBUG oslo.service.loopingcall [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.323 187212 DEBUG nova.compute.manager [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.323 187212 DEBUG nova.network.neutron [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:13:08 compute-0 NetworkManager[55691]: <info>  [1764936788.4303] manager: (tape0274344-58): new Tun device (/org/freedesktop/NetworkManager/Devices/355)
Dec 05 12:13:08 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[235938]: [NOTICE]   (235943) : haproxy version is 2.8.14-c23fe91
Dec 05 12:13:08 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[235938]: [NOTICE]   (235943) : path to executable is /usr/sbin/haproxy
Dec 05 12:13:08 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[235938]: [WARNING]  (235943) : Exiting Master process...
Dec 05 12:13:08 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[235938]: [ALERT]    (235943) : Current worker (235945) exited with code 143 (Terminated)
Dec 05 12:13:08 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[235938]: [WARNING]  (235943) : All workers exited. Exiting... (0)
Dec 05 12:13:08 compute-0 systemd[1]: libpod-f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54.scope: Deactivated successfully.
Dec 05 12:13:08 compute-0 podman[236287]: 2025-12-05 12:13:08.459771699 +0000 UTC m=+0.125635743 container died f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.472 187212 INFO nova.virt.libvirt.driver [-] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Instance destroyed successfully.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.472 187212 DEBUG nova.objects.instance [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lazy-loading 'resources' on Instance uuid 95b4dafa-871e-42c8-8fb1-162d6b45f3aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.507 187212 DEBUG nova.virt.libvirt.vif [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-2085942428',display_name='tempest-InstanceActionsTestJSON-server-2085942428',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-2085942428',id=91,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8c82a7ffe1fe49a88eb03f0d89c9629e',ramdisk_id='',reservation_id='r-7ofp0hr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-345226518',owner_user_name='tempest-InstanceActionsTestJSON-345226518-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:13:08Z,user_data=None,user_id='2f30ba31f27c4af2b73c6c86da366ebd',uuid=95b4dafa-871e-42c8-8fb1-162d6b45f3aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.508 187212 DEBUG nova.network.os_vif_util [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converting VIF {"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.508 187212 DEBUG nova.network.os_vif_util [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.509 187212 DEBUG os_vif [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.510 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.511 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape0274344-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.516 187212 INFO os_vif [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58')
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.524 187212 DEBUG nova.virt.libvirt.driver [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Start _get_guest_xml network_info=[{"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.530 187212 WARNING nova.virt.libvirt.driver [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.537 187212 DEBUG nova.virt.libvirt.host [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.538 187212 DEBUG nova.virt.libvirt.host [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.543 187212 DEBUG nova.virt.libvirt.host [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.544 187212 DEBUG nova.virt.libvirt.host [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.544 187212 DEBUG nova.virt.libvirt.driver [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.544 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.545 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.545 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.545 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.545 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.545 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.546 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.546 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.546 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.546 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.546 187212 DEBUG nova.virt.hardware [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.546 187212 DEBUG nova.objects.instance [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 95b4dafa-871e-42c8-8fb1-162d6b45f3aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.561 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.583 187212 DEBUG nova.network.neutron [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Updating instance_info_cache with network_info: [{"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54-userdata-shm.mount: Deactivated successfully.
Dec 05 12:13:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-e955ad4e0dfc4b80cb3a7e139e28478d43ec17cc197a0c9cc5c92873a01e021f-merged.mount: Deactivated successfully.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.605 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Releasing lock "refresh_cache-a85836ac-2737-428e-85a9-ffd8bd60f4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.606 187212 DEBUG nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Instance network_info: |[{"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.609 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Start _get_guest_xml network_info=[{"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.614 187212 WARNING nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:13:08 compute-0 podman[236287]: 2025-12-05 12:13:08.619037404 +0000 UTC m=+0.284901428 container cleanup f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.619 187212 DEBUG nova.virt.libvirt.host [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.621 187212 DEBUG nova.virt.libvirt.host [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.623 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.config --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.625 187212 DEBUG oslo_concurrency.lockutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.625 187212 DEBUG oslo_concurrency.lockutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.626 187212 DEBUG oslo_concurrency.lockutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.627 187212 DEBUG nova.virt.libvirt.vif [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-2085942428',display_name='tempest-InstanceActionsTestJSON-server-2085942428',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-2085942428',id=91,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8c82a7ffe1fe49a88eb03f0d89c9629e',ramdisk_id='',reservation_id='r-7ofp0hr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-345226518',owner_user_name='tempest-InstanceActionsTestJSON-345226518-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:13:08Z,user_data=None,user_id='2f30ba31f27c4af2b73c6c86da366ebd',uuid=95b4dafa-871e-42c8-8fb1-162d6b45f3aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.627 187212 DEBUG nova.network.os_vif_util [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converting VIF {"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.628 187212 DEBUG nova.network.os_vif_util [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.629 187212 DEBUG nova.objects.instance [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lazy-loading 'pci_devices' on Instance uuid 95b4dafa-871e-42c8-8fb1-162d6b45f3aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.630 187212 DEBUG nova.virt.libvirt.host [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.631 187212 DEBUG nova.virt.libvirt.host [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.632 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.632 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.632 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.633 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.633 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.633 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.634 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.634 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.634 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.635 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.635 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.635 187212 DEBUG nova.virt.hardware [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.639 187212 DEBUG nova.virt.libvirt.vif [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:13:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-226401896',display_name='tempest-ServersTestJSON-server-226401896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-226401896',id=93,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-bjbbgegp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=
TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:05Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=a85836ac-2737-428e-85a9-ffd8bd60f4a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.640 187212 DEBUG nova.network.os_vif_util [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.641 187212 DEBUG nova.network.os_vif_util [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:1f:ee,bridge_name='br-int',has_traffic_filtering=True,id=3d342ae2-99f5-47b8-8c24-89dc69c89971,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d342ae2-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.642 187212 DEBUG nova.objects.instance [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'pci_devices' on Instance uuid a85836ac-2737-428e-85a9-ffd8bd60f4a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.645 187212 DEBUG nova.virt.libvirt.driver [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <uuid>95b4dafa-871e-42c8-8fb1-162d6b45f3aa</uuid>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <name>instance-0000005b</name>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:name>tempest-InstanceActionsTestJSON-server-2085942428</nova:name>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:13:08</nova:creationTime>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:user uuid="2f30ba31f27c4af2b73c6c86da366ebd">tempest-InstanceActionsTestJSON-345226518-project-member</nova:user>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:project uuid="8c82a7ffe1fe49a88eb03f0d89c9629e">tempest-InstanceActionsTestJSON-345226518</nova:project>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:port uuid="e0274344-5869-486b-a457-04b90e756602">
Dec 05 12:13:08 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <system>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="serial">95b4dafa-871e-42c8-8fb1-162d6b45f3aa</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="uuid">95b4dafa-871e-42c8-8fb1-162d6b45f3aa</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </system>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <os>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </os>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <features>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </features>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk.config"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:0f:99:d7"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <target dev="tape0274344-58"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/console.log" append="off"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <video>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </video>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:13:08 compute-0 nova_compute[187208]: </domain>
Dec 05 12:13:08 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:13:08 compute-0 systemd[1]: libpod-conmon-f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54.scope: Deactivated successfully.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.649 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.687 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <uuid>a85836ac-2737-428e-85a9-ffd8bd60f4a3</uuid>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <name>instance-0000005d</name>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestJSON-server-226401896</nova:name>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:13:08</nova:creationTime>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:user uuid="62153b585ecc4e6fa2ad567851d49081">tempest-ServersTestJSON-1492365581-project-member</nova:user>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:project uuid="0c982a61e3fc4c8da9248076bb0361ac">tempest-ServersTestJSON-1492365581</nova:project>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         <nova:port uuid="3d342ae2-99f5-47b8-8c24-89dc69c89971">
Dec 05 12:13:08 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <system>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="serial">a85836ac-2737-428e-85a9-ffd8bd60f4a3</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="uuid">a85836ac-2737-428e-85a9-ffd8bd60f4a3</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </system>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <os>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </os>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <features>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </features>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk.config"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:cc:1f:ee"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <target dev="tap3d342ae2-99"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/console.log" append="off"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <video>
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </video>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:13:08 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:13:08 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:13:08 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:13:08 compute-0 nova_compute[187208]: </domain>
Dec 05 12:13:08 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.689 187212 DEBUG nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Preparing to wait for external event network-vif-plugged-3d342ae2-99f5-47b8-8c24-89dc69c89971 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.689 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.690 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.690 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.691 187212 DEBUG nova.virt.libvirt.vif [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:13:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-226401896',display_name='tempest-ServersTestJSON-server-226401896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-226401896',id=93,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-bjbbgegp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-memb
er'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:05Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=a85836ac-2737-428e-85a9-ffd8bd60f4a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.692 187212 DEBUG nova.network.os_vif_util [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.694 187212 DEBUG nova.network.os_vif_util [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:1f:ee,bridge_name='br-int',has_traffic_filtering=True,id=3d342ae2-99f5-47b8-8c24-89dc69c89971,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d342ae2-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.695 187212 DEBUG os_vif [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:1f:ee,bridge_name='br-int',has_traffic_filtering=True,id=3d342ae2-99f5-47b8-8c24-89dc69c89971,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d342ae2-99') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.695 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.696 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.696 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.700 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.700 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d342ae2-99, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.701 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3d342ae2-99, col_values=(('external_ids', {'iface-id': '3d342ae2-99f5-47b8-8c24-89dc69c89971', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:1f:ee', 'vm-uuid': 'a85836ac-2737-428e-85a9-ffd8bd60f4a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.702 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 NetworkManager[55691]: <info>  [1764936788.7038] manager: (tap3d342ae2-99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.709 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.709 187212 INFO os_vif [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:1f:ee,bridge_name='br-int',has_traffic_filtering=True,id=3d342ae2-99f5-47b8-8c24-89dc69c89971,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d342ae2-99')
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.719 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.720 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.787 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.788 187212 DEBUG nova.objects.instance [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 95b4dafa-871e-42c8-8fb1-162d6b45f3aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:08 compute-0 podman[236333]: 2025-12-05 12:13:08.812207559 +0000 UTC m=+0.173741494 container remove f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.816 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.818 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b81368-0b7a-4895-9e4f-1cd2b1d48a8d]: (4, ('Fri Dec  5 12:13:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb (f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54)\nf619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54\nFri Dec  5 12:13:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb (f619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54)\nf619884dee798f8c1b2608d7bb9bc7ec3801d7d89707627f128d74f94fa1fc54\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.820 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9fed40-9d60-4a6c-ab85-ff786ca78bd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.821 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65234c0b-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:08 compute-0 kernel: tap65234c0b-50: left promiscuous mode
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.837 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.844 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.845 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d0990708-bb55-4022-9b2a-9cc83e45648d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.864 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.865 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.866 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No VIF found with MAC fa:16:3e:cc:1f:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.866 187212 INFO nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Using config drive
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.867 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf02beb-0f32-4d87-ba8b-fd5731d535a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.869 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[60a47d39-de47-4fc1-a89c-2246c8b30939]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.882 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.883 187212 DEBUG nova.virt.disk.api [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Checking if we can resize image /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.884 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.887 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[caf3e613-2f35-4b7a-90cf-674cfffe60a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415543, 'reachable_time': 18329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236361, 'error': None, 'target': 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.889 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:13:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:08.890 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[273d42e3-5fd2-4be6-9d3b-997ed2f34216]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d65234c0b\x2d56d7\x2d44c4\x2d8665\x2d41785bc53beb.mount: Deactivated successfully.
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.944 187212 DEBUG oslo_concurrency.processutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.945 187212 DEBUG nova.virt.disk.api [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Cannot resize image /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.946 187212 DEBUG nova.objects.instance [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lazy-loading 'migration_context' on Instance uuid 95b4dafa-871e-42c8-8fb1-162d6b45f3aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.947 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936773.9459684, f3769524-43d7-4c3b-be59-18bf7af73e18 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.948 187212 INFO nova.compute.manager [-] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] VM Stopped (Lifecycle Event)
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.975 187212 DEBUG nova.compute.manager [None req-f79de74c-8bf5-481a-b371-52bb7a01895f - - - - - -] [instance: f3769524-43d7-4c3b-be59-18bf7af73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.976 187212 DEBUG nova.virt.libvirt.vif [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-2085942428',display_name='tempest-InstanceActionsTestJSON-server-2085942428',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-2085942428',id=91,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='8c82a7ffe1fe49a88eb03f0d89c9629e',ramdisk_id='',reservation_id='r-7ofp0hr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-345226518',owner_user_name='tempest-InstanceActionsTestJSON-345226518-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:08Z,user_data=None,user_id='2f30ba31f27c4af2b73c6c86da366ebd',uuid=95b4dafa-871e-42c8-8fb1-162d6b45f3aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.976 187212 DEBUG nova.network.os_vif_util [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converting VIF {"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.977 187212 DEBUG nova.network.os_vif_util [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.978 187212 DEBUG os_vif [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.978 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.979 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.979 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.982 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape0274344-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.982 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape0274344-58, col_values=(('external_ids', {'iface-id': 'e0274344-5869-486b-a457-04b90e756602', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:99:d7', 'vm-uuid': '95b4dafa-871e-42c8-8fb1-162d6b45f3aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.984 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 NetworkManager[55691]: <info>  [1764936788.9852] manager: (tape0274344-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.986 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.990 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:08 compute-0 nova_compute[187208]: 2025-12-05 12:13:08.991 187212 INFO os_vif [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58')
Dec 05 12:13:09 compute-0 kernel: tape0274344-58: entered promiscuous mode
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.0661] manager: (tape0274344-58): new Tun device (/org/freedesktop/NetworkManager/Devices/358)
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00927|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00928|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00929|binding|INFO|Claiming lport e0274344-5869-486b-a457-04b90e756602 for this chassis.
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00930|binding|INFO|e0274344-5869-486b-a457-04b90e756602: Claiming fa:16:3e:0f:99:d7 10.100.0.13
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.144 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:99:d7 10.100.0.13'], port_security=['fa:16:3e:0f:99:d7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '95b4dafa-871e-42c8-8fb1-162d6b45f3aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65234c0b-56d7-44c4-8665-41785bc53beb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c82a7ffe1fe49a88eb03f0d89c9629e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '89874479-4098-428c-b848-ebbb8d8ab5f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=787cd8eb-fb25-4820-b1a7-6d591e54c07a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e0274344-5869-486b-a457-04b90e756602) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.145 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e0274344-5869-486b-a457-04b90e756602 in datapath 65234c0b-56d7-44c4-8665-41785bc53beb bound to our chassis
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.147 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 65234c0b-56d7-44c4-8665-41785bc53beb
Dec 05 12:13:09 compute-0 systemd-udevd[236386]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:13:09 compute-0 systemd-machined[153543]: New machine qemu-106-instance-0000005b.
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.159 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6f1b13-ba14-4d68-87cf-b22abd1b9fb5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.160 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap65234c0b-51 in ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.162 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap65234c0b-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.162 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[10409ade-4ae3-4c67-b977-1fc4b3dc7469]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.163 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0db94967-a76b-405b-8648-522a4cc4d3cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.1695] device (tape0274344-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.1705] device (tape0274344-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.176 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5c0976-4959-42e0-83f7-f6583a37762f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 systemd[1]: Started Virtual Machine qemu-106-instance-0000005b.
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.202 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[df932e86-dd2b-4bb8-aa71-d6ed4aee4d26]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.217 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00931|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00932|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.235 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.236 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e039a63a-84ea-4448-9641-48547e8e824a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.2704] manager: (tap65234c0b-50): new Veth device (/org/freedesktop/NetworkManager/Devices/359)
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.270 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[16e430e0-5a7f-42e6-a628-29c72f8ceb98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.268 187212 DEBUG nova.network.neutron [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00933|binding|INFO|Setting lport e0274344-5869-486b-a457-04b90e756602 ovn-installed in OVS
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00934|binding|INFO|Setting lport e0274344-5869-486b-a457-04b90e756602 up in Southbound
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.300 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.300 187212 DEBUG nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance network_info: |[{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.301 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.304 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start _get_guest_xml network_info=[{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.308 187212 WARNING nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.312 187212 DEBUG nova.virt.libvirt.host [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.313 187212 DEBUG nova.virt.libvirt.host [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.316 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4d5b14-bdf3-492d-864b-f642464a66cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.316 187212 DEBUG nova.virt.libvirt.host [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.317 187212 DEBUG nova.virt.libvirt.host [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.317 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.317 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.317 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.318 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.318 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.318 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.318 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.318 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.319 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.319 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.319 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.319 187212 DEBUG nova.virt.hardware [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.319 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb7a34b-0fe9-4d98-b0fa-296c42438334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.323 187212 DEBUG nova.virt.libvirt.vif [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.323 187212 DEBUG nova.network.os_vif_util [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.323 187212 DEBUG nova.network.os_vif_util [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.324 187212 DEBUG nova.objects.instance [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.341 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <uuid>2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</uuid>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <name>instance-0000005c</name>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestJSON-server-954339420</nova:name>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:13:09</nova:creationTime>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:13:09 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:13:09 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:13:09 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:13:09 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:13:09 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:13:09 compute-0 nova_compute[187208]:         <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec 05 12:13:09 compute-0 nova_compute[187208]:         <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:13:09 compute-0 nova_compute[187208]:         <nova:port uuid="5316adeb-5a49-4a58-b997-f132a083ff13">
Dec 05 12:13:09 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <system>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <entry name="serial">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <entry name="uuid">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     </system>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <os>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   </os>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <features>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   </features>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:9a:d0:34"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <target dev="tap5316adeb-5a"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/console.log" append="off"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <video>
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     </video>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:13:09 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:13:09 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:13:09 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:13:09 compute-0 nova_compute[187208]: </domain>
Dec 05 12:13:09 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.342 187212 DEBUG nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Preparing to wait for external event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.343 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.343 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.343 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.3439] device (tap65234c0b-50): carrier: link connected
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.344 187212 DEBUG nova.virt.libvirt.vif [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:12:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.344 187212 DEBUG nova.network.os_vif_util [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.345 187212 DEBUG nova.network.os_vif_util [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.345 187212 DEBUG os_vif [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.346 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.347 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.347 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.350 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c7dc75d5-1806-4480-9891-1e68aa5b71a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.351 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.351 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5316adeb-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.352 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5316adeb-5a, col_values=(('external_ids', {'iface-id': '5316adeb-5a49-4a58-b997-f132a083ff13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:d0:34', 'vm-uuid': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.3543] manager: (tap5316adeb-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.353 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.361 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.362 187212 INFO os_vif [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.369 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c77496b7-348f-4c6b-9683-3a3aeb21ca4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65234c0b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:c4:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 256], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416990, 'reachable_time': 38547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236429, 'error': None, 'target': 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.386 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[93940938-7400-41d4-94d3-b49d0045ea4f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:c4be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416990, 'tstamp': 416990}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236433, 'error': None, 'target': 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.400 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6df64d-443d-464c-abc0-ae2fb292c92f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65234c0b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:c4:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 256], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416990, 'reachable_time': 38547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236435, 'error': None, 'target': 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.419 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 95b4dafa-871e-42c8-8fb1-162d6b45f3aa due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.420 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936789.4194672, 95b4dafa-871e-42c8-8fb1-162d6b45f3aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.421 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] VM Resumed (Lifecycle Event)
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.423 187212 DEBUG nova.compute.manager [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.428 187212 INFO nova.virt.libvirt.driver [-] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Instance rebooted successfully.
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.429 187212 DEBUG nova.compute.manager [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.446 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.450 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.480 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.480 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936789.4225605, 95b4dafa-871e-42c8-8fb1-162d6b45f3aa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.480 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] VM Started (Lifecycle Event)
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.493 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.493 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.494 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No VIF found with MAC fa:16:3e:9a:d0:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.494 187212 INFO nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Using config drive
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.499 187212 INFO nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Creating config drive at /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk.config
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.505 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyhktrev4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.543 187212 DEBUG oslo_concurrency.lockutils [None req-33307a15-5f08-4161-a342-e326a73102aa 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.545 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.549 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.555 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2512e962-7510-4d52-b5fd-f1e97b2a2ba8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.633 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4ccdba72-da95-4b7b-9a97-479e63123e11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.634 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65234c0b-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.635 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.635 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65234c0b-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.637 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.6385] manager: (tap65234c0b-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Dec 05 12:13:09 compute-0 kernel: tap65234c0b-50: entered promiscuous mode
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.649 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap65234c0b-50, col_values=(('external_ids', {'iface-id': 'a30d2e0e-09db-4ced-93ec-e9e2d828cf95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.647 187212 DEBUG oslo_concurrency.processutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyhktrev4" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.648 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00935|binding|INFO|Releasing lport a30d2e0e-09db-4ced-93ec-e9e2d828cf95 from this chassis (sb_readonly=0)
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.651 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.666 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.674 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/65234c0b-56d7-44c4-8665-41785bc53beb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/65234c0b-56d7-44c4-8665-41785bc53beb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.673 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.675 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f5b26b6f-aade-4569-b414-1e3ea05ad121]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.676 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-65234c0b-56d7-44c4-8665-41785bc53beb
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/65234c0b-56d7-44c4-8665-41785bc53beb.pid.haproxy
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 65234c0b-56d7-44c4-8665-41785bc53beb
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.676 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'env', 'PROCESS_TAG=haproxy-65234c0b-56d7-44c4-8665-41785bc53beb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/65234c0b-56d7-44c4-8665-41785bc53beb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.7172] manager: (tap3d342ae2-99): new Tun device (/org/freedesktop/NetworkManager/Devices/362)
Dec 05 12:13:09 compute-0 kernel: tap3d342ae2-99: entered promiscuous mode
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00936|binding|INFO|Claiming lport 3d342ae2-99f5-47b8-8c24-89dc69c89971 for this chassis.
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00937|binding|INFO|3d342ae2-99f5-47b8-8c24-89dc69c89971: Claiming fa:16:3e:cc:1f:ee 10.100.0.5
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.722 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.7319] device (tap3d342ae2-99): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:13:09 compute-0 NetworkManager[55691]: <info>  [1764936789.7333] device (tap3d342ae2-99): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:13:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:09.735 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:1f:ee 10.100.0.5'], port_security=['fa:16:3e:cc:1f:ee 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a85836ac-2737-428e-85a9-ffd8bd60f4a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=3d342ae2-99f5-47b8-8c24-89dc69c89971) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00938|binding|INFO|Setting lport 3d342ae2-99f5-47b8-8c24-89dc69c89971 ovn-installed in OVS
Dec 05 12:13:09 compute-0 ovn_controller[95610]: 2025-12-05T12:13:09Z|00939|binding|INFO|Setting lport 3d342ae2-99f5-47b8-8c24-89dc69c89971 up in Southbound
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.751 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:09 compute-0 systemd-machined[153543]: New machine qemu-107-instance-0000005d.
Dec 05 12:13:09 compute-0 systemd[1]: Started Virtual Machine qemu-107-instance-0000005d.
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.916 187212 DEBUG nova.network.neutron [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.935 187212 INFO nova.compute.manager [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Took 1.61 seconds to deallocate network for instance.
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.989 187212 DEBUG oslo_concurrency.lockutils [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:09 compute-0 nova_compute[187208]: 2025-12-05 12:13:09.990 187212 DEBUG oslo_concurrency.lockutils [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.067 187212 INFO nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Creating config drive at /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.075 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd6o3z9fx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:10 compute-0 podman[236500]: 2025-12-05 12:13:10.039513593 +0000 UTC m=+0.028128624 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.152 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936790.151861, a85836ac-2737-428e-85a9-ffd8bd60f4a3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.153 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] VM Started (Lifecycle Event)
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.190 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.195 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936790.1520734, a85836ac-2737-428e-85a9-ffd8bd60f4a3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.195 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] VM Paused (Lifecycle Event)
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.209 187212 DEBUG oslo_concurrency.processutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd6o3z9fx" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.218 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.220 187212 DEBUG nova.compute.provider_tree [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.238 187212 DEBUG nova.scheduler.client.report [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.243 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:10 compute-0 podman[236500]: 2025-12-05 12:13:10.251748949 +0000 UTC m=+0.240363950 container create e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.279 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.282 187212 DEBUG oslo_concurrency.lockutils [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:10 compute-0 kernel: tap5316adeb-5a: entered promiscuous mode
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.288 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:10 compute-0 NetworkManager[55691]: <info>  [1764936790.2906] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/363)
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.292 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:10 compute-0 ovn_controller[95610]: 2025-12-05T12:13:10Z|00940|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec 05 12:13:10 compute-0 ovn_controller[95610]: 2025-12-05T12:13:10Z|00941|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.298 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:10 compute-0 NetworkManager[55691]: <info>  [1764936790.3048] device (tap5316adeb-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:13:10 compute-0 NetworkManager[55691]: <info>  [1764936790.3066] device (tap5316adeb-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.311 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.318 187212 INFO nova.scheduler.client.report [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Deleted allocations for instance b661b497-acb9-4b26-8e26-7d0802bca8bf
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.351 187212 DEBUG nova.compute.manager [req-e96ca4c3-eef2-428f-92db-b22275d12c91 req-98ffb1c1-aab7-4af4-af46-6f710cd50189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-changed-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.352 187212 DEBUG nova.compute.manager [req-e96ca4c3-eef2-428f-92db-b22275d12c91 req-98ffb1c1-aab7-4af4-af46-6f710cd50189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Refreshing instance network info cache due to event network-changed-5316adeb-5a49-4a58-b997-f132a083ff13. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.352 187212 DEBUG oslo_concurrency.lockutils [req-e96ca4c3-eef2-428f-92db-b22275d12c91 req-98ffb1c1-aab7-4af4-af46-6f710cd50189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.352 187212 DEBUG oslo_concurrency.lockutils [req-e96ca4c3-eef2-428f-92db-b22275d12c91 req-98ffb1c1-aab7-4af4-af46-6f710cd50189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.352 187212 DEBUG nova.network.neutron [req-e96ca4c3-eef2-428f-92db-b22275d12c91 req-98ffb1c1-aab7-4af4-af46-6f710cd50189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Refreshing network info cache for port 5316adeb-5a49-4a58-b997-f132a083ff13 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:13:10 compute-0 ovn_controller[95610]: 2025-12-05T12:13:10Z|00942|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec 05 12:13:10 compute-0 ovn_controller[95610]: 2025-12-05T12:13:10Z|00943|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:10 compute-0 systemd[1]: Started libpod-conmon-e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc.scope.
Dec 05 12:13:10 compute-0 systemd-machined[153543]: New machine qemu-108-instance-0000005c.
Dec 05 12:13:10 compute-0 systemd[1]: Started Virtual Machine qemu-108-instance-0000005c.
Dec 05 12:13:10 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:13:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcf0e39addf14043cddeb8df2cc67378b7e47b38083c0d9bcf962ef722b3a3c5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.414 187212 DEBUG oslo_concurrency.lockutils [None req-123c08d3-d763-4ecb-8670-e951602d6370 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "b661b497-acb9-4b26-8e26-7d0802bca8bf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.460 187212 DEBUG nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Received event network-changed-3d342ae2-99f5-47b8-8c24-89dc69c89971 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.461 187212 DEBUG nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Refreshing instance network info cache due to event network-changed-3d342ae2-99f5-47b8-8c24-89dc69c89971. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.461 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-a85836ac-2737-428e-85a9-ffd8bd60f4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.461 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-a85836ac-2737-428e-85a9-ffd8bd60f4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.461 187212 DEBUG nova.network.neutron [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Refreshing network info cache for port 3d342ae2-99f5-47b8-8c24-89dc69c89971 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:13:10 compute-0 podman[236500]: 2025-12-05 12:13:10.480816851 +0000 UTC m=+0.469431892 container init e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 05 12:13:10 compute-0 podman[236500]: 2025-12-05 12:13:10.485864247 +0000 UTC m=+0.474479378 container start e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:13:10 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[236543]: [NOTICE]   (236553) : New worker (236555) forked
Dec 05 12:13:10 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[236543]: [NOTICE]   (236553) : Loading success.
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.589 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 3d342ae2-99f5-47b8-8c24-89dc69c89971 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 unbound from our chassis
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.592 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.611 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b9edd870-ed6f-4dde-93c3-3181c2657743]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.652 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bc646240-8af6-4c57-8b96-ce391dee53c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.655 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cd408180-0771-4885-8121-d7dd982d155d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.683 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[72ab2354-76ac-4c4f-a834-a992f57d1f6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.698 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[61dd7148-e1de-4bdd-b6ea-b3f6080f31fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 21, 'rx_bytes': 700, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 21, 'rx_bytes': 700, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236569, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.711 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff2385a-1cf3-408b-866a-bc7b1e9a8ded]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236570, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236570, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.714 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:10 compute-0 nova_compute[187208]: 2025-12-05 12:13:10.716 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.718 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.718 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.719 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.719 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.720 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.722 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.733 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e418e04c-c896-48df-8247-bfc91bca530e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.734 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9ed41c2-b1 in ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.737 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9ed41c2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.737 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[df5b336a-0d2f-4e37-8a64-83dc6a2e4198]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.737 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[36bbf2ee-6ff7-4451-b689-b7f8ca7b3a26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.754 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[9f95452b-7823-4aaf-b11a-676270ef163e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.768 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[325cb3d6-90c4-41d8-b027-723f3c89aa4d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.798 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[11b6c3ac-893c-4ddf-91e6-2dff19110769]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 NetworkManager[55691]: <info>  [1764936790.8048] manager: (tapf9ed41c2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/364)
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.806 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[160ab673-3b9e-4d56-8aa0-eea6192add2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.853 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[af4bd65d-9b08-4a29-93bf-e1762643902d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.859 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a059a72b-a0a1-4985-a56d-fdab021d7f3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 NetworkManager[55691]: <info>  [1764936790.8844] device (tapf9ed41c2-b0): carrier: link connected
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.890 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce8de27-fb9c-4d36-a2c5-b27ee817317e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.908 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[197f5d56-6c20-4fa2-8428-e921c911be82]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 259], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417144, 'reachable_time': 25215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236581, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.924 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43d441e9-824a-44ea-93ac-992dd1b68b8a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:9111'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 417144, 'tstamp': 417144}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236582, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.941 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9c46c6-7b24-4471-a577-b63dfb3cd843]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 259], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417144, 'reachable_time': 25215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236583, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:10.972 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb40da0-7094-460e-8726-d95d88d82c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:11.031 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84e98a60-749d-472a-ba97-2e921d49c4c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:11.032 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:11.033 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:11.033 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.035 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:11 compute-0 kernel: tapf9ed41c2-b0: entered promiscuous mode
Dec 05 12:13:11 compute-0 NetworkManager[55691]: <info>  [1764936791.0371] manager: (tapf9ed41c2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:11.043 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:11 compute-0 ovn_controller[95610]: 2025-12-05T12:13:11Z|00944|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.044 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:11.047 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:11.048 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[33f703b0-3727-4793-ad90-1237d39f7d49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:11.049 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:13:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:11.051 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'env', 'PROCESS_TAG=haproxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9ed41c2-b085-41ff-ac71-6256a4e30e85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.056 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.354 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936791.35284, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.355 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Started (Lifecycle Event)
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.383 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.421 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936791.3530068, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.421 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Paused (Lifecycle Event)
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.448 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.451 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:11 compute-0 nova_compute[187208]: 2025-12-05 12:13:11.485 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:13:11 compute-0 podman[236622]: 2025-12-05 12:13:11.504547329 +0000 UTC m=+0.117200599 container create 8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 12:13:11 compute-0 podman[236622]: 2025-12-05 12:13:11.409974115 +0000 UTC m=+0.022627405 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:13:11 compute-0 systemd[1]: Started libpod-conmon-8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887.scope.
Dec 05 12:13:11 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:13:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ef0da064995570ab0cfc29a1c5131088e3b52fe546aa4976392ee1079b3bbaf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:13:11 compute-0 podman[236622]: 2025-12-05 12:13:11.599066382 +0000 UTC m=+0.211719662 container init 8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 12:13:11 compute-0 podman[236622]: 2025-12-05 12:13:11.605449006 +0000 UTC m=+0.218102276 container start 8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 05 12:13:11 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[236637]: [NOTICE]   (236641) : New worker (236643) forked
Dec 05 12:13:11 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[236637]: [NOTICE]   (236641) : Loading success.
Dec 05 12:13:13 compute-0 podman[236652]: 2025-12-05 12:13:13.216468564 +0000 UTC m=+0.057631647 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 12:13:13 compute-0 podman[236653]: 2025-12-05 12:13:13.245768781 +0000 UTC m=+0.084557745 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.252 187212 DEBUG oslo_concurrency.lockutils [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.255 187212 DEBUG oslo_concurrency.lockutils [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.256 187212 DEBUG oslo_concurrency.lockutils [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.256 187212 DEBUG oslo_concurrency.lockutils [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.256 187212 DEBUG oslo_concurrency.lockutils [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.259 187212 INFO nova.compute.manager [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Terminating instance
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.262 187212 DEBUG nova.compute.manager [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:13:13 compute-0 kernel: tap0bab1586-b0 (unregistering): left promiscuous mode
Dec 05 12:13:13 compute-0 NetworkManager[55691]: <info>  [1764936793.2878] device (tap0bab1586-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.298 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:13 compute-0 ovn_controller[95610]: 2025-12-05T12:13:13Z|00945|binding|INFO|Releasing lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 from this chassis (sb_readonly=0)
Dec 05 12:13:13 compute-0 ovn_controller[95610]: 2025-12-05T12:13:13Z|00946|binding|INFO|Setting lport 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 down in Southbound
Dec 05 12:13:13 compute-0 ovn_controller[95610]: 2025-12-05T12:13:13Z|00947|binding|INFO|Removing iface tap0bab1586-b0 ovn-installed in OVS
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.300 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:13.308 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a0:97 10.100.0.11'], port_security=['fa:16:3e:d5:a0:97 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f2a101e0-138f-404e-b6e0-e1359272f560', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dfc8194-4267-4990-b8bc-6ea0b59180e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f73626a62534c97a06b6ec98d749111', 'neutron:revision_number': '6', 'neutron:security_group_ids': '33edd2b9-62ae-4f3d-8139-cd1f7488d285', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76331bfe-99d2-4af9-a913-603b9f83c953, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:13.310 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 in datapath 1dfc8194-4267-4990-b8bc-6ea0b59180e7 unbound from our chassis
Dec 05 12:13:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:13.311 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dfc8194-4267-4990-b8bc-6ea0b59180e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 05 12:13:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:13.312 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[455fdd4e-5ee6-42b6-b36f-6f3cab058300]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.316 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.321 187212 DEBUG nova.compute.manager [req-219e1bde-adbd-47f9-b5e0-fc3a68976136 req-0ebf0c39-7ebc-4542-a21d-74c901b4854d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Received event network-vif-deleted-c7d6a93d-8775-4c4e-9a60-bc8e87e5b310 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:13 compute-0 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d00000056.scope: Deactivated successfully.
Dec 05 12:13:13 compute-0 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d00000056.scope: Consumed 13.440s CPU time.
Dec 05 12:13:13 compute-0 systemd-machined[153543]: Machine qemu-98-instance-00000056 terminated.
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.387 187212 DEBUG nova.compute.manager [req-4491253f-cf0c-4883-bfd4-690930070fd8 req-561b8097-772d-4cc8-833c-c34c9e55f07a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.388 187212 DEBUG oslo_concurrency.lockutils [req-4491253f-cf0c-4883-bfd4-690930070fd8 req-561b8097-772d-4cc8-833c-c34c9e55f07a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.388 187212 DEBUG oslo_concurrency.lockutils [req-4491253f-cf0c-4883-bfd4-690930070fd8 req-561b8097-772d-4cc8-833c-c34c9e55f07a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.388 187212 DEBUG oslo_concurrency.lockutils [req-4491253f-cf0c-4883-bfd4-690930070fd8 req-561b8097-772d-4cc8-833c-c34c9e55f07a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.388 187212 DEBUG nova.compute.manager [req-4491253f-cf0c-4883-bfd4-690930070fd8 req-561b8097-772d-4cc8-833c-c34c9e55f07a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] No waiting events found dispatching network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.389 187212 WARNING nova.compute.manager [req-4491253f-cf0c-4883-bfd4-690930070fd8 req-561b8097-772d-4cc8-833c-c34c9e55f07a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received unexpected event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 for instance with vm_state active and task_state None.
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.551 187212 INFO nova.virt.libvirt.driver [-] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Instance destroyed successfully.
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.551 187212 DEBUG nova.objects.instance [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lazy-loading 'resources' on Instance uuid f2a101e0-138f-404e-b6e0-e1359272f560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.567 187212 DEBUG oslo_concurrency.lockutils [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.568 187212 DEBUG oslo_concurrency.lockutils [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.568 187212 DEBUG oslo_concurrency.lockutils [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.568 187212 DEBUG oslo_concurrency.lockutils [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.568 187212 DEBUG oslo_concurrency.lockutils [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.570 187212 INFO nova.compute.manager [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Terminating instance
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.571 187212 DEBUG nova.compute.manager [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.573 187212 DEBUG nova.virt.libvirt.vif [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:11:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-234668707',display_name='tempest-ServerRescueTestJSON-server-234668707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-234668707',id=86,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:12:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f73626a62534c97a06b6ec98d749111',ramdisk_id='',reservation_id='r-p6hppypz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-122605385',owner_user_name='tempest-ServerRescueTestJSON-122605385-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:12:16Z,user_data=None,user_id='d12bb49c0ca84e8dad933b49753c7b24',uuid=f2a101e0-138f-404e-b6e0-e1359272f560,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.574 187212 DEBUG nova.network.os_vif_util [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converting VIF {"id": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "address": "fa:16:3e:d5:a0:97", "network": {"id": "1dfc8194-4267-4990-b8bc-6ea0b59180e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-689964296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "8f73626a62534c97a06b6ec98d749111", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bab1586-b0", "ovs_interfaceid": "0bab1586-b06a-4ae9-a0f9-9fbea816c5c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.575 187212 DEBUG nova.network.os_vif_util [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a0:97,bridge_name='br-int',has_traffic_filtering=True,id=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bab1586-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.575 187212 DEBUG os_vif [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a0:97,bridge_name='br-int',has_traffic_filtering=True,id=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bab1586-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.578 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0bab1586-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.848 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.854 187212 INFO os_vif [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a0:97,bridge_name='br-int',has_traffic_filtering=True,id=0bab1586-b06a-4ae9-a0f9-9fbea816c5c2,network=Network(1dfc8194-4267-4990-b8bc-6ea0b59180e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bab1586-b0')
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.855 187212 INFO nova.virt.libvirt.driver [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Deleting instance files /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560_del
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.856 187212 INFO nova.virt.libvirt.driver [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Deletion of /var/lib/nova/instances/f2a101e0-138f-404e-b6e0-e1359272f560_del complete
Dec 05 12:13:13 compute-0 kernel: tape0274344-58 (unregistering): left promiscuous mode
Dec 05 12:13:13 compute-0 NetworkManager[55691]: <info>  [1764936793.8689] device (tape0274344-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:13 compute-0 ovn_controller[95610]: 2025-12-05T12:13:13Z|00948|binding|INFO|Releasing lport e0274344-5869-486b-a457-04b90e756602 from this chassis (sb_readonly=0)
Dec 05 12:13:13 compute-0 ovn_controller[95610]: 2025-12-05T12:13:13Z|00949|binding|INFO|Setting lport e0274344-5869-486b-a457-04b90e756602 down in Southbound
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.874 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:13 compute-0 ovn_controller[95610]: 2025-12-05T12:13:13Z|00950|binding|INFO|Removing iface tape0274344-58 ovn-installed in OVS
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.877 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:13.887 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:99:d7 10.100.0.13'], port_security=['fa:16:3e:0f:99:d7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '95b4dafa-871e-42c8-8fb1-162d6b45f3aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65234c0b-56d7-44c4-8665-41785bc53beb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c82a7ffe1fe49a88eb03f0d89c9629e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '89874479-4098-428c-b848-ebbb8d8ab5f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=787cd8eb-fb25-4820-b1a7-6d591e54c07a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e0274344-5869-486b-a457-04b90e756602) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:13.888 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e0274344-5869-486b-a457-04b90e756602 in datapath 65234c0b-56d7-44c4-8665-41785bc53beb unbound from our chassis
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.890 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:13.891 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 65234c0b-56d7-44c4-8665-41785bc53beb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:13:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:13.892 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[327c83af-a852-4dd6-8730-5daa5d768a10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:13 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:13.893 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb namespace which is not needed anymore
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.905 187212 INFO nova.compute.manager [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Took 0.64 seconds to destroy the instance on the hypervisor.
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.906 187212 DEBUG oslo.service.loopingcall [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.906 187212 DEBUG nova.compute.manager [-] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:13:13 compute-0 nova_compute[187208]: 2025-12-05 12:13:13.906 187212 DEBUG nova.network.neutron [-] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:13:13 compute-0 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Dec 05 12:13:13 compute-0 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d0000005b.scope: Consumed 4.735s CPU time.
Dec 05 12:13:13 compute-0 systemd-machined[153543]: Machine qemu-106-instance-0000005b terminated.
Dec 05 12:13:13 compute-0 kernel: tape0274344-58: entered promiscuous mode
Dec 05 12:13:14 compute-0 kernel: tape0274344-58 (unregistering): left promiscuous mode
Dec 05 12:13:14 compute-0 NetworkManager[55691]: <info>  [1764936794.0443] manager: (tape0274344-58): new Tun device (/org/freedesktop/NetworkManager/Devices/366)
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.049 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.096 187212 INFO nova.virt.libvirt.driver [-] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Instance destroyed successfully.
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.097 187212 DEBUG nova.objects.instance [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lazy-loading 'resources' on Instance uuid 95b4dafa-871e-42c8-8fb1-162d6b45f3aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.116 187212 DEBUG nova.virt.libvirt.vif [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-2085942428',display_name='tempest-InstanceActionsTestJSON-server-2085942428',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-2085942428',id=91,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8c82a7ffe1fe49a88eb03f0d89c9629e',ramdisk_id='',reservation_id='r-7ofp0hr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-345226518',owner_user_name='tempest-InstanceActionsTestJSON-345226518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:13:09Z,user_data=None,user_id='2f30ba31f27c4af2b73c6c86da366ebd',uuid=95b4dafa-871e-42c8-8fb1-162d6b45f3aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.117 187212 DEBUG nova.network.os_vif_util [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converting VIF {"id": "e0274344-5869-486b-a457-04b90e756602", "address": "fa:16:3e:0f:99:d7", "network": {"id": "65234c0b-56d7-44c4-8665-41785bc53beb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-214291783-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c82a7ffe1fe49a88eb03f0d89c9629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0274344-58", "ovs_interfaceid": "e0274344-5869-486b-a457-04b90e756602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.117 187212 DEBUG nova.network.os_vif_util [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.118 187212 DEBUG os_vif [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.119 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.119 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape0274344-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.120 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.122 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.124 187212 INFO os_vif [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:99:d7,bridge_name='br-int',has_traffic_filtering=True,id=e0274344-5869-486b-a457-04b90e756602,network=Network(65234c0b-56d7-44c4-8665-41785bc53beb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0274344-58')
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.125 187212 INFO nova.virt.libvirt.driver [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Deleting instance files /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa_del
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.125 187212 INFO nova.virt.libvirt.driver [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Deletion of /var/lib/nova/instances/95b4dafa-871e-42c8-8fb1-162d6b45f3aa_del complete
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.191 187212 INFO nova.compute.manager [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Took 0.62 seconds to destroy the instance on the hypervisor.
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.192 187212 DEBUG oslo.service.loopingcall [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.193 187212 DEBUG nova.compute.manager [-] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.193 187212 DEBUG nova.network.neutron [-] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:13:14 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[236543]: [NOTICE]   (236553) : haproxy version is 2.8.14-c23fe91
Dec 05 12:13:14 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[236543]: [NOTICE]   (236553) : path to executable is /usr/sbin/haproxy
Dec 05 12:13:14 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[236543]: [WARNING]  (236553) : Exiting Master process...
Dec 05 12:13:14 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[236543]: [WARNING]  (236553) : Exiting Master process...
Dec 05 12:13:14 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[236543]: [ALERT]    (236553) : Current worker (236555) exited with code 143 (Terminated)
Dec 05 12:13:14 compute-0 neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb[236543]: [WARNING]  (236553) : All workers exited. Exiting... (0)
Dec 05 12:13:14 compute-0 systemd[1]: libpod-e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc.scope: Deactivated successfully.
Dec 05 12:13:14 compute-0 podman[236751]: 2025-12-05 12:13:14.210281057 +0000 UTC m=+0.202282630 container died e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 12:13:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc-userdata-shm.mount: Deactivated successfully.
Dec 05 12:13:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-dcf0e39addf14043cddeb8df2cc67378b7e47b38083c0d9bcf962ef722b3a3c5-merged.mount: Deactivated successfully.
Dec 05 12:13:14 compute-0 podman[236751]: 2025-12-05 12:13:14.343139548 +0000 UTC m=+0.335141121 container cleanup e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:13:14 compute-0 systemd[1]: libpod-conmon-e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc.scope: Deactivated successfully.
Dec 05 12:13:14 compute-0 podman[236795]: 2025-12-05 12:13:14.353679412 +0000 UTC m=+0.056795803 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 12:13:14 compute-0 podman[236817]: 2025-12-05 12:13:14.444591751 +0000 UTC m=+0.078526281 container remove e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:13:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:14.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[123a8643-23b1-4c2f-bcb6-81e2263184c6]: (4, ('Fri Dec  5 12:13:13 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb (e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc)\ne4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc\nFri Dec  5 12:13:14 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb (e4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc)\ne4687a14be79c07282dc21d5674bc1e397e084be8cdedfaef7ea431d52a500cc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:14.451 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b43ad423-e63d-45b1-be06-79d8f18ed50e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:14.453 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65234c0b-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.455 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:14 compute-0 kernel: tap65234c0b-50: left promiscuous mode
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.467 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.470 187212 DEBUG nova.network.neutron [req-e96ca4c3-eef2-428f-92db-b22275d12c91 req-98ffb1c1-aab7-4af4-af46-6f710cd50189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updated VIF entry in instance network info cache for port 5316adeb-5a49-4a58-b997-f132a083ff13. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.471 187212 DEBUG nova.network.neutron [req-e96ca4c3-eef2-428f-92db-b22275d12c91 req-98ffb1c1-aab7-4af4-af46-6f710cd50189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:14.471 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[98aa200c-0a6f-4123-8e0f-2a5719dbe505]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:14.486 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0a906066-271f-41d9-ae47-aac36271d2a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:14.487 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[68971dcf-e655-4040-9866-2e6b5584581c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.488 187212 DEBUG oslo_concurrency.lockutils [req-e96ca4c3-eef2-428f-92db-b22275d12c91 req-98ffb1c1-aab7-4af4-af46-6f710cd50189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.497 187212 DEBUG nova.network.neutron [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Updated VIF entry in instance network info cache for port 3d342ae2-99f5-47b8-8c24-89dc69c89971. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.498 187212 DEBUG nova.network.neutron [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Updating instance_info_cache with network_info: [{"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:14.503 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[37024e65-c508-40eb-8af1-4c861abdc7d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416979, 'reachable_time': 33528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236832, 'error': None, 'target': 'ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:14.506 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-65234c0b-56d7-44c4-8665-41785bc53beb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:13:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:14.506 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[174a41fa-f7ff-4f5c-958a-420101719e15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:14 compute-0 systemd[1]: run-netns-ovnmeta\x2d65234c0b\x2d56d7\x2d44c4\x2d8665\x2d41785bc53beb.mount: Deactivated successfully.
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.516 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-a85836ac-2737-428e-85a9-ffd8bd60f4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.517 187212 DEBUG nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-vif-unplugged-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.517 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.517 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.517 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.517 187212 DEBUG nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] No waiting events found dispatching network-vif-unplugged-e0274344-5869-486b-a457-04b90e756602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.518 187212 WARNING nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received unexpected event network-vif-unplugged-e0274344-5869-486b-a457-04b90e756602 for instance with vm_state active and task_state None.
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.518 187212 DEBUG nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.518 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.518 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.518 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.518 187212 DEBUG nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] No waiting events found dispatching network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.519 187212 WARNING nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received unexpected event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 for instance with vm_state active and task_state None.
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.519 187212 DEBUG nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.519 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.519 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.519 187212 DEBUG oslo_concurrency.lockutils [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.519 187212 DEBUG nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] No waiting events found dispatching network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:14 compute-0 nova_compute[187208]: 2025-12-05 12:13:14.520 187212 WARNING nova.compute.manager [req-236ea4b4-0a04-4d6e-95bc-b3260203e9cf req-57709318-a667-4610-b8bf-37a317dec83b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received unexpected event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 for instance with vm_state active and task_state None.
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.732 187212 DEBUG nova.network.neutron [-] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.754 187212 INFO nova.compute.manager [-] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Took 1.56 seconds to deallocate network for instance.
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.801 187212 DEBUG oslo_concurrency.lockutils [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.802 187212 DEBUG oslo_concurrency.lockutils [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.950 187212 DEBUG nova.compute.provider_tree [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.964 187212 DEBUG nova.scheduler.client.report [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.971 187212 DEBUG nova.compute.manager [req-769e65ca-f3b0-487c-8851-e3a438396084 req-a5d8f42d-e888-438f-8b34-3033625f7d17 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-vif-unplugged-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.971 187212 DEBUG oslo_concurrency.lockutils [req-769e65ca-f3b0-487c-8851-e3a438396084 req-a5d8f42d-e888-438f-8b34-3033625f7d17 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.972 187212 DEBUG oslo_concurrency.lockutils [req-769e65ca-f3b0-487c-8851-e3a438396084 req-a5d8f42d-e888-438f-8b34-3033625f7d17 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.972 187212 DEBUG oslo_concurrency.lockutils [req-769e65ca-f3b0-487c-8851-e3a438396084 req-a5d8f42d-e888-438f-8b34-3033625f7d17 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.973 187212 DEBUG nova.compute.manager [req-769e65ca-f3b0-487c-8851-e3a438396084 req-a5d8f42d-e888-438f-8b34-3033625f7d17 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] No waiting events found dispatching network-vif-unplugged-e0274344-5869-486b-a457-04b90e756602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.973 187212 WARNING nova.compute.manager [req-769e65ca-f3b0-487c-8851-e3a438396084 req-a5d8f42d-e888-438f-8b34-3033625f7d17 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received unexpected event network-vif-unplugged-e0274344-5869-486b-a457-04b90e756602 for instance with vm_state deleted and task_state None.
Dec 05 12:13:15 compute-0 nova_compute[187208]: 2025-12-05 12:13:15.988 187212 DEBUG oslo_concurrency.lockutils [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:16 compute-0 nova_compute[187208]: 2025-12-05 12:13:16.022 187212 INFO nova.scheduler.client.report [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Deleted allocations for instance 95b4dafa-871e-42c8-8fb1-162d6b45f3aa
Dec 05 12:13:16 compute-0 nova_compute[187208]: 2025-12-05 12:13:16.085 187212 DEBUG oslo_concurrency.lockutils [None req-ec7944b0-d730-4e9d-8b23-67b62951ba32 2f30ba31f27c4af2b73c6c86da366ebd 8c82a7ffe1fe49a88eb03f0d89c9629e - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.003 187212 DEBUG nova.compute.manager [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Received event network-vif-plugged-3d342ae2-99f5-47b8-8c24-89dc69c89971 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.004 187212 DEBUG oslo_concurrency.lockutils [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.005 187212 DEBUG oslo_concurrency.lockutils [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.005 187212 DEBUG oslo_concurrency.lockutils [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.005 187212 DEBUG nova.compute.manager [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Processing event network-vif-plugged-3d342ae2-99f5-47b8-8c24-89dc69c89971 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.005 187212 DEBUG nova.compute.manager [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Received event network-vif-plugged-3d342ae2-99f5-47b8-8c24-89dc69c89971 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.006 187212 DEBUG oslo_concurrency.lockutils [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.006 187212 DEBUG oslo_concurrency.lockutils [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.006 187212 DEBUG oslo_concurrency.lockutils [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.007 187212 DEBUG nova.compute.manager [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] No waiting events found dispatching network-vif-plugged-3d342ae2-99f5-47b8-8c24-89dc69c89971 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.007 187212 WARNING nova.compute.manager [req-ca4f5fbf-0cf0-4747-bba4-d80e3b65580f req-57fb4249-4fd5-49b2-be77-71778202a15b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Received unexpected event network-vif-plugged-3d342ae2-99f5-47b8-8c24-89dc69c89971 for instance with vm_state building and task_state spawning.
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.008 187212 DEBUG nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.012 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936797.012713, a85836ac-2737-428e-85a9-ffd8bd60f4a3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.013 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] VM Resumed (Lifecycle Event)
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.015 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.021 187212 INFO nova.virt.libvirt.driver [-] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Instance spawned successfully.
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.021 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.037 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.043 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.046 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.047 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.047 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.048 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.048 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.049 187212 DEBUG nova.virt.libvirt.driver [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.090 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.102 187212 DEBUG nova.network.neutron [-] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.122 187212 INFO nova.compute.manager [-] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Took 3.22 seconds to deallocate network for instance.
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.127 187212 INFO nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Took 11.90 seconds to spawn the instance on the hypervisor.
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.128 187212 DEBUG nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.185 187212 DEBUG oslo_concurrency.lockutils [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.185 187212 DEBUG oslo_concurrency.lockutils [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.203 187212 INFO nova.compute.manager [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Took 12.56 seconds to build instance.
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.223 187212 DEBUG oslo_concurrency.lockutils [None req-9377ddb6-3599-4cae-8bc3-5cc0c6c05d6c 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.330 187212 DEBUG nova.compute.provider_tree [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.353 187212 DEBUG nova.scheduler.client.report [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.380 187212 DEBUG oslo_concurrency.lockutils [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.422 187212 INFO nova.scheduler.client.report [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Deleted allocations for instance f2a101e0-138f-404e-b6e0-e1359272f560
Dec 05 12:13:17 compute-0 nova_compute[187208]: 2025-12-05 12:13:17.502 187212 DEBUG oslo_concurrency.lockutils [None req-cc9f7f6f-2421-490f-84b5-32d55a093d08 d12bb49c0ca84e8dad933b49753c7b24 8f73626a62534c97a06b6ec98d749111 - - default default] Lock "f2a101e0-138f-404e-b6e0-e1359272f560" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:18 compute-0 nova_compute[187208]: 2025-12-05 12:13:18.323 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:18 compute-0 nova_compute[187208]: 2025-12-05 12:13:18.523 187212 DEBUG nova.compute.manager [req-19da69e5-6528-4736-b54c-992a3dede652 req-1311d4e8-b573-45ef-86a5-763f4daa4936 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:18 compute-0 nova_compute[187208]: 2025-12-05 12:13:18.523 187212 DEBUG oslo_concurrency.lockutils [req-19da69e5-6528-4736-b54c-992a3dede652 req-1311d4e8-b573-45ef-86a5-763f4daa4936 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:18 compute-0 nova_compute[187208]: 2025-12-05 12:13:18.523 187212 DEBUG oslo_concurrency.lockutils [req-19da69e5-6528-4736-b54c-992a3dede652 req-1311d4e8-b573-45ef-86a5-763f4daa4936 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:18 compute-0 nova_compute[187208]: 2025-12-05 12:13:18.524 187212 DEBUG oslo_concurrency.lockutils [req-19da69e5-6528-4736-b54c-992a3dede652 req-1311d4e8-b573-45ef-86a5-763f4daa4936 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "95b4dafa-871e-42c8-8fb1-162d6b45f3aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:18 compute-0 nova_compute[187208]: 2025-12-05 12:13:18.524 187212 DEBUG nova.compute.manager [req-19da69e5-6528-4736-b54c-992a3dede652 req-1311d4e8-b573-45ef-86a5-763f4daa4936 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] No waiting events found dispatching network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:18 compute-0 nova_compute[187208]: 2025-12-05 12:13:18.524 187212 WARNING nova.compute.manager [req-19da69e5-6528-4736-b54c-992a3dede652 req-1311d4e8-b573-45ef-86a5-763f4daa4936 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received unexpected event network-vif-plugged-e0274344-5869-486b-a457-04b90e756602 for instance with vm_state deleted and task_state None.
Dec 05 12:13:18 compute-0 nova_compute[187208]: 2025-12-05 12:13:18.525 187212 DEBUG nova.compute.manager [req-19da69e5-6528-4736-b54c-992a3dede652 req-1311d4e8-b573-45ef-86a5-763f4daa4936 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Received event network-vif-deleted-e0274344-5869-486b-a457-04b90e756602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.121 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.721 187212 DEBUG nova.compute.manager [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.721 187212 DEBUG oslo_concurrency.lockutils [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.721 187212 DEBUG oslo_concurrency.lockutils [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.722 187212 DEBUG oslo_concurrency.lockutils [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.722 187212 DEBUG nova.compute.manager [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Processing event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.722 187212 DEBUG nova.compute.manager [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Received event network-vif-deleted-0bab1586-b06a-4ae9-a0f9-9fbea816c5c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.723 187212 DEBUG nova.compute.manager [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.723 187212 DEBUG oslo_concurrency.lockutils [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.723 187212 DEBUG oslo_concurrency.lockutils [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.724 187212 DEBUG oslo_concurrency.lockutils [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.724 187212 DEBUG nova.compute.manager [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.724 187212 WARNING nova.compute.manager [req-5d7b8958-b817-4b03-8c70-de603e2ab5fc req-0e1dfda3-50b3-4240-8580-946fea682fc4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state building and task_state spawning.
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.725 187212 DEBUG nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.729 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936799.7295494, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.730 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Resumed (Lifecycle Event)
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.738 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.741 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance spawned successfully.
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.743 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.750 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.753 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.764 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.765 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.766 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.766 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.766 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.767 187212 DEBUG nova.virt.libvirt.driver [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.770 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.823 187212 INFO nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Took 22.62 seconds to spawn the instance on the hypervisor.
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.823 187212 DEBUG nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.891 187212 INFO nova.compute.manager [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Took 23.27 seconds to build instance.
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.907 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936784.9062388, 846c0e55-1620-4c7a-9792-d4f5f0d728d8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.907 187212 INFO nova.compute.manager [-] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] VM Stopped (Lifecycle Event)
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.921 187212 DEBUG oslo_concurrency.lockutils [None req-5d7b325c-d118-445c-9d28-d58056737e5b 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.378s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:19 compute-0 nova_compute[187208]: 2025-12-05 12:13:19.925 187212 DEBUG nova.compute.manager [None req-baa7b6d9-df20-494b-bad3-2aa893cadb69 - - - - - -] [instance: 846c0e55-1620-4c7a-9792-d4f5f0d728d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:21 compute-0 ovn_controller[95610]: 2025-12-05T12:13:21Z|00951|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:13:21 compute-0 ovn_controller[95610]: 2025-12-05T12:13:21Z|00952|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:13:21 compute-0 ovn_controller[95610]: 2025-12-05T12:13:21Z|00953|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.653 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.711 187212 DEBUG oslo_concurrency.lockutils [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.711 187212 DEBUG oslo_concurrency.lockutils [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.712 187212 DEBUG oslo_concurrency.lockutils [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.712 187212 DEBUG oslo_concurrency.lockutils [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.712 187212 DEBUG oslo_concurrency.lockutils [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.713 187212 INFO nova.compute.manager [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Terminating instance
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.714 187212 DEBUG nova.compute.manager [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:13:21 compute-0 kernel: tap3d342ae2-99 (unregistering): left promiscuous mode
Dec 05 12:13:21 compute-0 NetworkManager[55691]: <info>  [1764936801.7399] device (tap3d342ae2-99): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:21 compute-0 ovn_controller[95610]: 2025-12-05T12:13:21Z|00954|binding|INFO|Releasing lport 3d342ae2-99f5-47b8-8c24-89dc69c89971 from this chassis (sb_readonly=0)
Dec 05 12:13:21 compute-0 ovn_controller[95610]: 2025-12-05T12:13:21Z|00955|binding|INFO|Setting lport 3d342ae2-99f5-47b8-8c24-89dc69c89971 down in Southbound
Dec 05 12:13:21 compute-0 ovn_controller[95610]: 2025-12-05T12:13:21Z|00956|binding|INFO|Removing iface tap3d342ae2-99 ovn-installed in OVS
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.750 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.765 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.773 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:1f:ee 10.100.0.5'], port_security=['fa:16:3e:cc:1f:ee 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a85836ac-2737-428e-85a9-ffd8bd60f4a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=3d342ae2-99f5-47b8-8c24-89dc69c89971) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.775 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 3d342ae2-99f5-47b8-8c24-89dc69c89971 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 unbound from our chassis
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.777 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.792 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[99f055ba-f9b0-41f5-b823-87fdb41c99f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:21 compute-0 systemd[1]: machine-qemu\x2d107\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Dec 05 12:13:21 compute-0 systemd[1]: machine-qemu\x2d107\x2dinstance\x2d0000005d.scope: Consumed 5.138s CPU time.
Dec 05 12:13:21 compute-0 systemd-machined[153543]: Machine qemu-107-instance-0000005d terminated.
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.824 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e95d506b-09a5-4cc9-8dc8-c8a56ed62f34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.827 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9487e581-54f8-44f8-9f04-c4586d39b61f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.856 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a9825bf5-e4e6-4096-8c12-c33b2f7c53a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.877 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cfceb9dc-6743-46d4-9672-b9b0e302feaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 23, 'rx_bytes': 700, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 23, 'rx_bytes': 700, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 15042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236844, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.895 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fefdf62b-30ae-4b90-aef7-178a343c0923]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236845, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236845, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.897 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.899 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.903 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.903 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.904 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:21 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:21.904 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.939 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.944 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.985 187212 INFO nova.virt.libvirt.driver [-] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Instance destroyed successfully.
Dec 05 12:13:21 compute-0 nova_compute[187208]: 2025-12-05 12:13:21.986 187212 DEBUG nova.objects.instance [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'resources' on Instance uuid a85836ac-2737-428e-85a9-ffd8bd60f4a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.013 187212 DEBUG nova.virt.libvirt.vif [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:202:202,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:13:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-226401896',display_name='tempest-ServersTestJSON-server-226401896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-226401896',id=93,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-bjbbgegp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:13:18Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=a85836ac-2737-428e-85a9-ffd8bd60f4a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.014 187212 DEBUG nova.network.os_vif_util [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "address": "fa:16:3e:cc:1f:ee", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d342ae2-99", "ovs_interfaceid": "3d342ae2-99f5-47b8-8c24-89dc69c89971", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.015 187212 DEBUG nova.network.os_vif_util [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:1f:ee,bridge_name='br-int',has_traffic_filtering=True,id=3d342ae2-99f5-47b8-8c24-89dc69c89971,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d342ae2-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.015 187212 DEBUG os_vif [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:1f:ee,bridge_name='br-int',has_traffic_filtering=True,id=3d342ae2-99f5-47b8-8c24-89dc69c89971,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d342ae2-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.017 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.018 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d342ae2-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.023 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.025 187212 INFO os_vif [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:1f:ee,bridge_name='br-int',has_traffic_filtering=True,id=3d342ae2-99f5-47b8-8c24-89dc69c89971,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d342ae2-99')
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.026 187212 INFO nova.virt.libvirt.driver [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Deleting instance files /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3_del
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.027 187212 INFO nova.virt.libvirt.driver [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Deletion of /var/lib/nova/instances/a85836ac-2737-428e-85a9-ffd8bd60f4a3_del complete
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.081 187212 INFO nova.compute.manager [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Took 0.37 seconds to destroy the instance on the hypervisor.
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.085 187212 DEBUG oslo.service.loopingcall [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.085 187212 DEBUG nova.compute.manager [-] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.086 187212 DEBUG nova.network.neutron [-] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.641 187212 DEBUG nova.network.neutron [-] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.659 187212 INFO nova.compute.manager [-] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Took 0.57 seconds to deallocate network for instance.
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.697 187212 DEBUG oslo_concurrency.lockutils [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.698 187212 DEBUG oslo_concurrency.lockutils [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.852 187212 DEBUG nova.compute.provider_tree [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.867 187212 DEBUG nova.scheduler.client.report [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.919 187212 DEBUG oslo_concurrency.lockutils [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:22 compute-0 nova_compute[187208]: 2025-12-05 12:13:22.952 187212 INFO nova.scheduler.client.report [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Deleted allocations for instance a85836ac-2737-428e-85a9-ffd8bd60f4a3
Dec 05 12:13:23 compute-0 nova_compute[187208]: 2025-12-05 12:13:23.024 187212 DEBUG oslo_concurrency.lockutils [None req-45294c1e-d959-427a-857e-0bb342b5857b 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a85836ac-2737-428e-85a9-ffd8bd60f4a3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:23 compute-0 nova_compute[187208]: 2025-12-05 12:13:23.223 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936788.2215621, b661b497-acb9-4b26-8e26-7d0802bca8bf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:23 compute-0 nova_compute[187208]: 2025-12-05 12:13:23.224 187212 INFO nova.compute.manager [-] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] VM Stopped (Lifecycle Event)
Dec 05 12:13:23 compute-0 nova_compute[187208]: 2025-12-05 12:13:23.324 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:23 compute-0 nova_compute[187208]: 2025-12-05 12:13:23.428 187212 DEBUG nova.compute.manager [req-7f437f90-8e95-4241-b325-d213f3344f52 req-cd7b49f3-6966-45bb-a331-0e0c6dd54808 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Received event network-vif-deleted-3d342ae2-99f5-47b8-8c24-89dc69c89971 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:23 compute-0 nova_compute[187208]: 2025-12-05 12:13:23.439 187212 DEBUG nova.compute.manager [None req-67b2a7d6-436e-4c58-8b67-6d19b4c519b9 - - - - - -] [instance: b661b497-acb9-4b26-8e26-7d0802bca8bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:24 compute-0 podman[236863]: 2025-12-05 12:13:24.201182552 +0000 UTC m=+0.052956602 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:13:24 compute-0 NetworkManager[55691]: <info>  [1764936804.4685] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Dec 05 12:13:24 compute-0 NetworkManager[55691]: <info>  [1764936804.4695] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/368)
Dec 05 12:13:24 compute-0 nova_compute[187208]: 2025-12-05 12:13:24.468 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:24 compute-0 ovn_controller[95610]: 2025-12-05T12:13:24Z|00957|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:13:24 compute-0 ovn_controller[95610]: 2025-12-05T12:13:24Z|00958|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:13:24 compute-0 ovn_controller[95610]: 2025-12-05T12:13:24Z|00959|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:13:24 compute-0 nova_compute[187208]: 2025-12-05 12:13:24.518 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:24 compute-0 nova_compute[187208]: 2025-12-05 12:13:24.529 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:26 compute-0 ovn_controller[95610]: 2025-12-05T12:13:26Z|00960|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:13:26 compute-0 ovn_controller[95610]: 2025-12-05T12:13:26Z|00961|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:13:26 compute-0 ovn_controller[95610]: 2025-12-05T12:13:26Z|00962|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:13:26 compute-0 nova_compute[187208]: 2025-12-05 12:13:26.170 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:26 compute-0 nova_compute[187208]: 2025-12-05 12:13:26.244 187212 DEBUG nova.compute.manager [req-04b3e583-62dc-4b26-9849-e975666443e0 req-c0746c06-cda0-4df6-83e1-eff33d05e23d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-changed-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:26 compute-0 nova_compute[187208]: 2025-12-05 12:13:26.245 187212 DEBUG nova.compute.manager [req-04b3e583-62dc-4b26-9849-e975666443e0 req-c0746c06-cda0-4df6-83e1-eff33d05e23d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Refreshing instance network info cache due to event network-changed-5316adeb-5a49-4a58-b997-f132a083ff13. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:13:26 compute-0 nova_compute[187208]: 2025-12-05 12:13:26.245 187212 DEBUG oslo_concurrency.lockutils [req-04b3e583-62dc-4b26-9849-e975666443e0 req-c0746c06-cda0-4df6-83e1-eff33d05e23d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:26 compute-0 nova_compute[187208]: 2025-12-05 12:13:26.246 187212 DEBUG oslo_concurrency.lockutils [req-04b3e583-62dc-4b26-9849-e975666443e0 req-c0746c06-cda0-4df6-83e1-eff33d05e23d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:26 compute-0 nova_compute[187208]: 2025-12-05 12:13:26.246 187212 DEBUG nova.network.neutron [req-04b3e583-62dc-4b26-9849-e975666443e0 req-c0746c06-cda0-4df6-83e1-eff33d05e23d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Refreshing network info cache for port 5316adeb-5a49-4a58-b997-f132a083ff13 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.020 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.481 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.482 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.553 187212 DEBUG nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.637 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.638 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.646 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.647 187212 INFO nova.compute.claims [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.836 187212 DEBUG nova.compute.provider_tree [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.852 187212 DEBUG nova.scheduler.client.report [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.876 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.878 187212 DEBUG nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.924 187212 DEBUG nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.925 187212 DEBUG nova.network.neutron [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.946 187212 INFO nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:13:27 compute-0 nova_compute[187208]: 2025-12-05 12:13:27.969 187212 DEBUG nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.064 187212 DEBUG nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.066 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.067 187212 INFO nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Creating image(s)
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.068 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "/var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.068 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.071 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.092 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.163 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.165 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.165 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.177 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.237 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.238 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.326 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.351 187212 DEBUG nova.policy [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.366 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk 1073741824" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.368 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.368 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.433 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.435 187212 DEBUG nova.virt.disk.api [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Checking if we can resize image /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.435 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.499 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.501 187212 DEBUG nova.virt.disk.api [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Cannot resize image /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.501 187212 DEBUG nova.objects.instance [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'migration_context' on Instance uuid a9c0d69f-7894-4e3f-a056-4225da882a38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.515 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.516 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Ensure instance console log exists: /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.516 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.517 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.517 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.550 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936793.5485518, f2a101e0-138f-404e-b6e0-e1359272f560 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.551 187212 INFO nova.compute.manager [-] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] VM Stopped (Lifecycle Event)
Dec 05 12:13:28 compute-0 nova_compute[187208]: 2025-12-05 12:13:28.587 187212 DEBUG nova.compute.manager [None req-1fd3c61f-256a-42cc-b91e-b75a90466431 - - - - - -] [instance: f2a101e0-138f-404e-b6e0-e1359272f560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:29 compute-0 nova_compute[187208]: 2025-12-05 12:13:29.096 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936794.0938861, 95b4dafa-871e-42c8-8fb1-162d6b45f3aa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:29 compute-0 nova_compute[187208]: 2025-12-05 12:13:29.097 187212 INFO nova.compute.manager [-] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] VM Stopped (Lifecycle Event)
Dec 05 12:13:29 compute-0 nova_compute[187208]: 2025-12-05 12:13:29.118 187212 DEBUG nova.network.neutron [req-04b3e583-62dc-4b26-9849-e975666443e0 req-c0746c06-cda0-4df6-83e1-eff33d05e23d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updated VIF entry in instance network info cache for port 5316adeb-5a49-4a58-b997-f132a083ff13. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:13:29 compute-0 nova_compute[187208]: 2025-12-05 12:13:29.120 187212 DEBUG nova.network.neutron [req-04b3e583-62dc-4b26-9849-e975666443e0 req-c0746c06-cda0-4df6-83e1-eff33d05e23d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:29 compute-0 nova_compute[187208]: 2025-12-05 12:13:29.294 187212 DEBUG nova.compute.manager [None req-959fbcd8-f0a1-4475-bf2d-8f3b6a6aa30c - - - - - -] [instance: 95b4dafa-871e-42c8-8fb1-162d6b45f3aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:29 compute-0 nova_compute[187208]: 2025-12-05 12:13:29.470 187212 DEBUG oslo_concurrency.lockutils [req-04b3e583-62dc-4b26-9849-e975666443e0 req-c0746c06-cda0-4df6-83e1-eff33d05e23d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:30 compute-0 nova_compute[187208]: 2025-12-05 12:13:30.164 187212 DEBUG nova.network.neutron [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Successfully created port: fcb6e165-bcf0-439d-849c-dc8819a32db9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:13:30 compute-0 podman[236903]: 2025-12-05 12:13:30.239006727 +0000 UTC m=+0.088188071 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:13:31 compute-0 nova_compute[187208]: 2025-12-05 12:13:31.661 187212 DEBUG nova.network.neutron [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Successfully updated port: fcb6e165-bcf0-439d-849c-dc8819a32db9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:13:31 compute-0 nova_compute[187208]: 2025-12-05 12:13:31.678 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "refresh_cache-a9c0d69f-7894-4e3f-a056-4225da882a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:31 compute-0 nova_compute[187208]: 2025-12-05 12:13:31.678 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquired lock "refresh_cache-a9c0d69f-7894-4e3f-a056-4225da882a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:31 compute-0 nova_compute[187208]: 2025-12-05 12:13:31.679 187212 DEBUG nova.network.neutron [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:13:31 compute-0 nova_compute[187208]: 2025-12-05 12:13:31.782 187212 DEBUG nova.compute.manager [req-47b524e9-308f-4771-afc2-1c6c455be9f8 req-188cb3e3-18f8-496e-ae29-3d3ff8411c63 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Received event network-changed-fcb6e165-bcf0-439d-849c-dc8819a32db9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:31 compute-0 nova_compute[187208]: 2025-12-05 12:13:31.783 187212 DEBUG nova.compute.manager [req-47b524e9-308f-4771-afc2-1c6c455be9f8 req-188cb3e3-18f8-496e-ae29-3d3ff8411c63 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Refreshing instance network info cache due to event network-changed-fcb6e165-bcf0-439d-849c-dc8819a32db9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:13:31 compute-0 nova_compute[187208]: 2025-12-05 12:13:31.783 187212 DEBUG oslo_concurrency.lockutils [req-47b524e9-308f-4771-afc2-1c6c455be9f8 req-188cb3e3-18f8-496e-ae29-3d3ff8411c63 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-a9c0d69f-7894-4e3f-a056-4225da882a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:31 compute-0 nova_compute[187208]: 2025-12-05 12:13:31.858 187212 DEBUG nova.network.neutron [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:13:32 compute-0 nova_compute[187208]: 2025-12-05 12:13:32.023 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:32 compute-0 ovn_controller[95610]: 2025-12-05T12:13:32Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:13:32 compute-0 ovn_controller[95610]: 2025-12-05T12:13:32Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.149 187212 DEBUG oslo_concurrency.lockutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.150 187212 DEBUG oslo_concurrency.lockutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.150 187212 INFO nova.compute.manager [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Shelving
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.175 187212 DEBUG nova.virt.libvirt.driver [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.299 187212 DEBUG nova.network.neutron [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Updating instance_info_cache with network_info: [{"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.319 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Releasing lock "refresh_cache-a9c0d69f-7894-4e3f-a056-4225da882a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.319 187212 DEBUG nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Instance network_info: |[{"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.320 187212 DEBUG oslo_concurrency.lockutils [req-47b524e9-308f-4771-afc2-1c6c455be9f8 req-188cb3e3-18f8-496e-ae29-3d3ff8411c63 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-a9c0d69f-7894-4e3f-a056-4225da882a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.320 187212 DEBUG nova.network.neutron [req-47b524e9-308f-4771-afc2-1c6c455be9f8 req-188cb3e3-18f8-496e-ae29-3d3ff8411c63 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Refreshing network info cache for port fcb6e165-bcf0-439d-849c-dc8819a32db9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.324 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Start _get_guest_xml network_info=[{"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.328 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.335 187212 WARNING nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.343 187212 DEBUG nova.virt.libvirt.host [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.344 187212 DEBUG nova.virt.libvirt.host [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.351 187212 DEBUG nova.virt.libvirt.host [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.352 187212 DEBUG nova.virt.libvirt.host [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.352 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.353 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.353 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.353 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.354 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.354 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.354 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.354 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.355 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.355 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.355 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.355 187212 DEBUG nova.virt.hardware [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.359 187212 DEBUG nova.virt.libvirt.vif [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:13:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1799618215',display_name='tempest-ServersTestJSON-server-1799618215',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1799618215',id=94,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-1w6sm2sq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:27Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=a9c0d69f-7894-4e3f-a056-4225da882a38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.360 187212 DEBUG nova.network.os_vif_util [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.361 187212 DEBUG nova.network.os_vif_util [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:d5:72,bridge_name='br-int',has_traffic_filtering=True,id=fcb6e165-bcf0-439d-849c-dc8819a32db9,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcb6e165-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.362 187212 DEBUG nova.objects.instance [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'pci_devices' on Instance uuid a9c0d69f-7894-4e3f-a056-4225da882a38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.384 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <uuid>a9c0d69f-7894-4e3f-a056-4225da882a38</uuid>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <name>instance-0000005e</name>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersTestJSON-server-1799618215</nova:name>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:13:33</nova:creationTime>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:13:33 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:13:33 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:13:33 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:13:33 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:13:33 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:13:33 compute-0 nova_compute[187208]:         <nova:user uuid="62153b585ecc4e6fa2ad567851d49081">tempest-ServersTestJSON-1492365581-project-member</nova:user>
Dec 05 12:13:33 compute-0 nova_compute[187208]:         <nova:project uuid="0c982a61e3fc4c8da9248076bb0361ac">tempest-ServersTestJSON-1492365581</nova:project>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:13:33 compute-0 nova_compute[187208]:         <nova:port uuid="fcb6e165-bcf0-439d-849c-dc8819a32db9">
Dec 05 12:13:33 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <system>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <entry name="serial">a9c0d69f-7894-4e3f-a056-4225da882a38</entry>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <entry name="uuid">a9c0d69f-7894-4e3f-a056-4225da882a38</entry>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     </system>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <os>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   </os>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <features>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   </features>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk.config"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:39:d5:72"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <target dev="tapfcb6e165-bc"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/console.log" append="off"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <video>
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     </video>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:13:33 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:13:33 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:13:33 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:13:33 compute-0 nova_compute[187208]: </domain>
Dec 05 12:13:33 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.386 187212 DEBUG nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Preparing to wait for external event network-vif-plugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.386 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.387 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.387 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.388 187212 DEBUG nova.virt.libvirt.vif [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:13:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1799618215',display_name='tempest-ServersTestJSON-server-1799618215',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1799618215',id=94,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-1w6sm2sq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:27Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=a9c0d69f-7894-4e3f-a056-4225da882a38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.389 187212 DEBUG nova.network.os_vif_util [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.389 187212 DEBUG nova.network.os_vif_util [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:d5:72,bridge_name='br-int',has_traffic_filtering=True,id=fcb6e165-bcf0-439d-849c-dc8819a32db9,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcb6e165-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.390 187212 DEBUG os_vif [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:d5:72,bridge_name='br-int',has_traffic_filtering=True,id=fcb6e165-bcf0-439d-849c-dc8819a32db9,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcb6e165-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.391 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.391 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.392 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.397 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.397 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfcb6e165-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.398 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfcb6e165-bc, col_values=(('external_ids', {'iface-id': 'fcb6e165-bcf0-439d-849c-dc8819a32db9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:d5:72', 'vm-uuid': 'a9c0d69f-7894-4e3f-a056-4225da882a38'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.399 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:33 compute-0 NetworkManager[55691]: <info>  [1764936813.4006] manager: (tapfcb6e165-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.401 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.407 187212 INFO os_vif [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:d5:72,bridge_name='br-int',has_traffic_filtering=True,id=fcb6e165-bcf0-439d-849c-dc8819a32db9,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcb6e165-bc')
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.486 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.487 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.488 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] No VIF found with MAC fa:16:3e:39:d5:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:13:33 compute-0 nova_compute[187208]: 2025-12-05 12:13:33.489 187212 INFO nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Using config drive
Dec 05 12:13:34 compute-0 nova_compute[187208]: 2025-12-05 12:13:34.355 187212 INFO nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Creating config drive at /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk.config
Dec 05 12:13:34 compute-0 nova_compute[187208]: 2025-12-05 12:13:34.361 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_nbv38o1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:34 compute-0 nova_compute[187208]: 2025-12-05 12:13:34.496 187212 DEBUG oslo_concurrency.processutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_nbv38o1" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:34 compute-0 kernel: tapfcb6e165-bc: entered promiscuous mode
Dec 05 12:13:34 compute-0 NetworkManager[55691]: <info>  [1764936814.5643] manager: (tapfcb6e165-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/370)
Dec 05 12:13:34 compute-0 ovn_controller[95610]: 2025-12-05T12:13:34Z|00963|binding|INFO|Claiming lport fcb6e165-bcf0-439d-849c-dc8819a32db9 for this chassis.
Dec 05 12:13:34 compute-0 ovn_controller[95610]: 2025-12-05T12:13:34Z|00964|binding|INFO|fcb6e165-bcf0-439d-849c-dc8819a32db9: Claiming fa:16:3e:39:d5:72 10.100.0.12
Dec 05 12:13:34 compute-0 nova_compute[187208]: 2025-12-05 12:13:34.567 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.573 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:d5:72 10.100.0.12'], port_security=['fa:16:3e:39:d5:72 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a9c0d69f-7894-4e3f-a056-4225da882a38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=fcb6e165-bcf0-439d-849c-dc8819a32db9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.574 104471 INFO neutron.agent.ovn.metadata.agent [-] Port fcb6e165-bcf0-439d-849c-dc8819a32db9 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 bound to our chassis
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.577 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:13:34 compute-0 ovn_controller[95610]: 2025-12-05T12:13:34Z|00965|binding|INFO|Setting lport fcb6e165-bcf0-439d-849c-dc8819a32db9 ovn-installed in OVS
Dec 05 12:13:34 compute-0 ovn_controller[95610]: 2025-12-05T12:13:34Z|00966|binding|INFO|Setting lport fcb6e165-bcf0-439d-849c-dc8819a32db9 up in Southbound
Dec 05 12:13:34 compute-0 nova_compute[187208]: 2025-12-05 12:13:34.582 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:34 compute-0 nova_compute[187208]: 2025-12-05 12:13:34.585 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.597 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[936ace16-a5b1-45b2-8dec-5541ff8cb283]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:34 compute-0 systemd-udevd[236969]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:13:34 compute-0 NetworkManager[55691]: <info>  [1764936814.6089] device (tapfcb6e165-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:13:34 compute-0 NetworkManager[55691]: <info>  [1764936814.6099] device (tapfcb6e165-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:13:34 compute-0 systemd-machined[153543]: New machine qemu-109-instance-0000005e.
Dec 05 12:13:34 compute-0 systemd[1]: Started Virtual Machine qemu-109-instance-0000005e.
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.632 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[11ff2c4d-2a9a-49f5-b7ac-69dddd890a0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.636 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4e1d0c-9645-4fa8-8fac-8a1e34537925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.666 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[81c67ad7-283d-4d42-bfb6-f48ee9a0bbf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.682 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[461a0f65-7acc-4b2b-90bc-c004f8532340]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 25, 'rx_bytes': 700, 'tx_bytes': 1194, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 25, 'rx_bytes': 700, 'tx_bytes': 1194, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 26223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236980, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.697 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f3984c4d-4452-42e6-b926-68b6e5e2fa38]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236983, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236983, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.699 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:34 compute-0 nova_compute[187208]: 2025-12-05 12:13:34.700 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.701 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.702 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.702 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:34.704 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.196 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936815.1954932, a9c0d69f-7894-4e3f-a056-4225da882a38 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.196 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] VM Started (Lifecycle Event)
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.231 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.236 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936815.198471, a9c0d69f-7894-4e3f-a056-4225da882a38 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.236 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] VM Paused (Lifecycle Event)
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.260 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.262 187212 DEBUG nova.network.neutron [req-47b524e9-308f-4771-afc2-1c6c455be9f8 req-188cb3e3-18f8-496e-ae29-3d3ff8411c63 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Updated VIF entry in instance network info cache for port fcb6e165-bcf0-439d-849c-dc8819a32db9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.263 187212 DEBUG nova.network.neutron [req-47b524e9-308f-4771-afc2-1c6c455be9f8 req-188cb3e3-18f8-496e-ae29-3d3ff8411c63 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Updating instance_info_cache with network_info: [{"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.267 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.294 187212 DEBUG oslo_concurrency.lockutils [req-47b524e9-308f-4771-afc2-1c6c455be9f8 req-188cb3e3-18f8-496e-ae29-3d3ff8411c63 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-a9c0d69f-7894-4e3f-a056-4225da882a38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.295 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:13:35 compute-0 kernel: tape30774db-d3 (unregistering): left promiscuous mode
Dec 05 12:13:35 compute-0 NetworkManager[55691]: <info>  [1764936815.5708] device (tape30774db-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:35 compute-0 ovn_controller[95610]: 2025-12-05T12:13:35Z|00967|binding|INFO|Releasing lport e30774db-d3d3-4438-b68a-6f7855f55128 from this chassis (sb_readonly=0)
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:35 compute-0 ovn_controller[95610]: 2025-12-05T12:13:35Z|00968|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 down in Southbound
Dec 05 12:13:35 compute-0 ovn_controller[95610]: 2025-12-05T12:13:35Z|00969|binding|INFO|Removing iface tape30774db-d3 ovn-installed in OVS
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.658 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.661 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8e:78 10.100.0.9'], port_security=['fa:16:3e:50:8e:78 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e30774db-d3d3-4438-b68a-6f7855f55128) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.662 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e30774db-d3d3-4438-b68a-6f7855f55128 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be unbound from our chassis
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.664 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82130d25-ff6c-480e-884d-f3d97b6fd9be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.665 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9fdcb56f-6dc5-44e0-b39f-110b2cfb7bbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.666 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be namespace which is not needed anymore
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.670 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:35 compute-0 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d00000050.scope: Deactivated successfully.
Dec 05 12:13:35 compute-0 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d00000050.scope: Consumed 17.992s CPU time.
Dec 05 12:13:35 compute-0 systemd-machined[153543]: Machine qemu-90-instance-00000050 terminated.
Dec 05 12:13:35 compute-0 podman[236998]: 2025-12-05 12:13:35.741869283 +0000 UTC m=+0.060074238 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 12:13:35 compute-0 podman[236995]: 2025-12-05 12:13:35.747922138 +0000 UTC m=+0.066158023 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 12:13:35 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[232845]: [NOTICE]   (232849) : haproxy version is 2.8.14-c23fe91
Dec 05 12:13:35 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[232845]: [NOTICE]   (232849) : path to executable is /usr/sbin/haproxy
Dec 05 12:13:35 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[232845]: [WARNING]  (232849) : Exiting Master process...
Dec 05 12:13:35 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[232845]: [WARNING]  (232849) : Exiting Master process...
Dec 05 12:13:35 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[232845]: [ALERT]    (232849) : Current worker (232851) exited with code 143 (Terminated)
Dec 05 12:13:35 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[232845]: [WARNING]  (232849) : All workers exited. Exiting... (0)
Dec 05 12:13:35 compute-0 systemd[1]: libpod-9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b.scope: Deactivated successfully.
Dec 05 12:13:35 compute-0 podman[237048]: 2025-12-05 12:13:35.797622415 +0000 UTC m=+0.045897588 container died 9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 12:13:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b-userdata-shm.mount: Deactivated successfully.
Dec 05 12:13:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-5da39cf8afffca99931577d63971bef13ebdbc9b82672cc00526d085f11c3c50-merged.mount: Deactivated successfully.
Dec 05 12:13:35 compute-0 podman[237048]: 2025-12-05 12:13:35.844280334 +0000 UTC m=+0.092555477 container cleanup 9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:13:35 compute-0 systemd[1]: libpod-conmon-9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b.scope: Deactivated successfully.
Dec 05 12:13:35 compute-0 podman[237079]: 2025-12-05 12:13:35.912999081 +0000 UTC m=+0.049133321 container remove 9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.919 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[64ad19d3-b217-4efa-915f-7c72dcbf04b8]: (4, ('Fri Dec  5 12:13:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be (9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b)\n9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b\nFri Dec  5 12:13:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be (9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b)\n9b72741f2a3caa1e6152292bc85a4ff7b655bcfe58a642b5633beaab11efc39b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.920 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d973464c-f9a8-4608-bd26-d0625c5e17b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.921 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82130d25-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.924 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:35 compute-0 kernel: tap82130d25-f0: left promiscuous mode
Dec 05 12:13:35 compute-0 nova_compute[187208]: 2025-12-05 12:13:35.940 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.943 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[52f8784d-3149-4236-a4bc-a8160884ca44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.963 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab29d81-88bb-441b-b7e0-fd2300581e35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.964 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[533865b6-cbe8-4580-b895-9341f06a1222]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.979 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc7a641-0fdb-4768-b48c-fcb673c1f3b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403087, 'reachable_time': 30004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237112, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.982 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:13:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:35.982 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8b3b64-433a-4e2d-8de1-b094b6c84ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d82130d25\x2dff6c\x2d480e\x2d884d\x2df3d97b6fd9be.mount: Deactivated successfully.
Dec 05 12:13:36 compute-0 nova_compute[187208]: 2025-12-05 12:13:36.196 187212 INFO nova.virt.libvirt.driver [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance shutdown successfully after 3 seconds.
Dec 05 12:13:36 compute-0 nova_compute[187208]: 2025-12-05 12:13:36.203 187212 INFO nova.virt.libvirt.driver [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance destroyed successfully.
Dec 05 12:13:36 compute-0 nova_compute[187208]: 2025-12-05 12:13:36.203 187212 DEBUG nova.objects.instance [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'numa_topology' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:36 compute-0 nova_compute[187208]: 2025-12-05 12:13:36.558 187212 INFO nova.virt.libvirt.driver [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Beginning cold snapshot process
Dec 05 12:13:36 compute-0 nova_compute[187208]: 2025-12-05 12:13:36.685 187212 DEBUG nova.privsep.utils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:13:36 compute-0 nova_compute[187208]: 2025-12-05 12:13:36.685 187212 DEBUG oslo_concurrency.processutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk /var/lib/nova/instances/snapshots/tmpgz8_hi00/0ce0a99955034a91b2f70f54be70f19c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:36 compute-0 nova_compute[187208]: 2025-12-05 12:13:36.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:36 compute-0 nova_compute[187208]: 2025-12-05 12:13:36.985 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936801.9832983, a85836ac-2737-428e-85a9-ffd8bd60f4a3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:36 compute-0 nova_compute[187208]: 2025-12-05 12:13:36.985 187212 INFO nova.compute.manager [-] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] VM Stopped (Lifecycle Event)
Dec 05 12:13:37 compute-0 nova_compute[187208]: 2025-12-05 12:13:37.009 187212 DEBUG nova.compute.manager [None req-eb43410b-cd19-4807-8a23-61c8e432b3ed - - - - - -] [instance: a85836ac-2737-428e-85a9-ffd8bd60f4a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:37 compute-0 nova_compute[187208]: 2025-12-05 12:13:37.427 187212 DEBUG oslo_concurrency.processutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk /var/lib/nova/instances/snapshots/tmpgz8_hi00/0ce0a99955034a91b2f70f54be70f19c" returned: 0 in 0.742s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:37 compute-0 nova_compute[187208]: 2025-12-05 12:13:37.428 187212 INFO nova.virt.libvirt.driver [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Snapshot extracted, beginning image upload
Dec 05 12:13:38 compute-0 nova_compute[187208]: 2025-12-05 12:13:38.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:38 compute-0 nova_compute[187208]: 2025-12-05 12:13:38.400 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:38 compute-0 nova_compute[187208]: 2025-12-05 12:13:38.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.436 187212 DEBUG nova.compute.manager [req-efe1ced6-365d-4373-a556-63a1d4dc4867 req-a7989c99-14b5-4fb8-b1c2-f2bba49bc32e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Received event network-vif-plugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.437 187212 DEBUG oslo_concurrency.lockutils [req-efe1ced6-365d-4373-a556-63a1d4dc4867 req-a7989c99-14b5-4fb8-b1c2-f2bba49bc32e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.437 187212 DEBUG oslo_concurrency.lockutils [req-efe1ced6-365d-4373-a556-63a1d4dc4867 req-a7989c99-14b5-4fb8-b1c2-f2bba49bc32e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.437 187212 DEBUG oslo_concurrency.lockutils [req-efe1ced6-365d-4373-a556-63a1d4dc4867 req-a7989c99-14b5-4fb8-b1c2-f2bba49bc32e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.438 187212 DEBUG nova.compute.manager [req-efe1ced6-365d-4373-a556-63a1d4dc4867 req-a7989c99-14b5-4fb8-b1c2-f2bba49bc32e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Processing event network-vif-plugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.438 187212 DEBUG nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.443 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.444 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936820.4433498, a9c0d69f-7894-4e3f-a056-4225da882a38 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.444 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] VM Resumed (Lifecycle Event)
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.451 187212 INFO nova.virt.libvirt.driver [-] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Instance spawned successfully.
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.451 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.477 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.483 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.487 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.488 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.488 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.489 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.489 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.490 187212 DEBUG nova.virt.libvirt.driver [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.501 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.539 187212 INFO nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Took 12.47 seconds to spawn the instance on the hypervisor.
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.540 187212 DEBUG nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.604 187212 INFO nova.compute.manager [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Took 12.99 seconds to build instance.
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.623 187212 DEBUG oslo_concurrency.lockutils [None req-9592d500-0479-483c-846c-771e03a2c835 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.768 187212 INFO nova.virt.libvirt.driver [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Snapshot image upload complete
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.769 187212 DEBUG nova.compute.manager [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.841 187212 INFO nova.compute.manager [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Shelve offloading
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.850 187212 INFO nova.virt.libvirt.driver [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance destroyed successfully.
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.851 187212 DEBUG nova.compute.manager [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.854 187212 DEBUG oslo_concurrency.lockutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.854 187212 DEBUG oslo_concurrency.lockutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquired lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:40 compute-0 nova_compute[187208]: 2025-12-05 12:13:40.855 187212 DEBUG nova.network.neutron [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:13:42 compute-0 nova_compute[187208]: 2025-12-05 12:13:42.657 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.008 187212 DEBUG nova.compute.manager [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Received event network-vif-plugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.008 187212 DEBUG oslo_concurrency.lockutils [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.009 187212 DEBUG oslo_concurrency.lockutils [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.009 187212 DEBUG oslo_concurrency.lockutils [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.009 187212 DEBUG nova.compute.manager [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] No waiting events found dispatching network-vif-plugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.009 187212 WARNING nova.compute.manager [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Received unexpected event network-vif-plugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 for instance with vm_state active and task_state None.
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.010 187212 DEBUG nova.compute.manager [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.010 187212 DEBUG oslo_concurrency.lockutils [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.010 187212 DEBUG oslo_concurrency.lockutils [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.010 187212 DEBUG oslo_concurrency.lockutils [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.010 187212 DEBUG nova.compute.manager [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.011 187212 WARNING nova.compute.manager [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state shelved and task_state shelving_offloading.
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.011 187212 DEBUG nova.compute.manager [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.011 187212 DEBUG oslo_concurrency.lockutils [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.011 187212 DEBUG oslo_concurrency.lockutils [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.011 187212 DEBUG oslo_concurrency.lockutils [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.012 187212 DEBUG nova.compute.manager [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.012 187212 WARNING nova.compute.manager [req-8322571e-a724-4776-af2d-9337bfeabf26 req-7157ecab-c2c3-4f1d-84e1-d1ab7c55ddd6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state shelved and task_state shelving_offloading.
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.332 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:43 compute-0 nova_compute[187208]: 2025-12-05 12:13:43.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:44 compute-0 nova_compute[187208]: 2025-12-05 12:13:44.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:44 compute-0 nova_compute[187208]: 2025-12-05 12:13:44.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:13:44 compute-0 nova_compute[187208]: 2025-12-05 12:13:44.083 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:44 compute-0 podman[237123]: 2025-12-05 12:13:44.208804308 +0000 UTC m=+0.057085232 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:13:44 compute-0 podman[237124]: 2025-12-05 12:13:44.271933203 +0000 UTC m=+0.115255253 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 12:13:44 compute-0 nova_compute[187208]: 2025-12-05 12:13:44.930 187212 DEBUG nova.network.neutron [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:44 compute-0 nova_compute[187208]: 2025-12-05 12:13:44.949 187212 DEBUG oslo_concurrency.lockutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Releasing lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:44 compute-0 nova_compute[187208]: 2025-12-05 12:13:44.951 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:44 compute-0 nova_compute[187208]: 2025-12-05 12:13:44.951 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:13:45 compute-0 podman[237172]: 2025-12-05 12:13:45.214624588 +0000 UTC m=+0.064052723 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:13:45 compute-0 nova_compute[187208]: 2025-12-05 12:13:45.309 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:46 compute-0 nova_compute[187208]: 2025-12-05 12:13:46.467 187212 DEBUG oslo_concurrency.lockutils [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:46 compute-0 nova_compute[187208]: 2025-12-05 12:13:46.469 187212 DEBUG oslo_concurrency.lockutils [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:46 compute-0 nova_compute[187208]: 2025-12-05 12:13:46.469 187212 DEBUG nova.compute.manager [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:46 compute-0 nova_compute[187208]: 2025-12-05 12:13:46.474 187212 DEBUG nova.compute.manager [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Dec 05 12:13:46 compute-0 nova_compute[187208]: 2025-12-05 12:13:46.475 187212 DEBUG nova.objects.instance [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'flavor' on Instance uuid a9c0d69f-7894-4e3f-a056-4225da882a38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:46 compute-0 nova_compute[187208]: 2025-12-05 12:13:46.510 187212 DEBUG nova.virt.libvirt.driver [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:13:46 compute-0 nova_compute[187208]: 2025-12-05 12:13:46.870 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:46 compute-0 nova_compute[187208]: 2025-12-05 12:13:46.871 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:46 compute-0 nova_compute[187208]: 2025-12-05 12:13:46.889 187212 DEBUG nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.031 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.032 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.042 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.042 187212 INFO nova.compute.claims [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.082 187212 DEBUG oslo_concurrency.lockutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.083 187212 DEBUG oslo_concurrency.lockutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.083 187212 INFO nova.compute.manager [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Rebooting instance
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.104 187212 DEBUG oslo_concurrency.lockutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.104 187212 DEBUG oslo_concurrency.lockutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.105 187212 DEBUG nova.network.neutron [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.482 187212 DEBUG nova.compute.provider_tree [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.517 187212 DEBUG nova.scheduler.client.report [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.625 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.626 187212 DEBUG nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.687 187212 DEBUG nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.688 187212 DEBUG nova.network.neutron [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.901 187212 INFO nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:13:47 compute-0 nova_compute[187208]: 2025-12-05 12:13:47.919 187212 DEBUG nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.009 187212 DEBUG nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.011 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.012 187212 INFO nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Creating image(s)
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.013 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "/var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.013 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "/var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.014 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "/var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:48.018 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:8b:a2 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7ef5fe5e-41c6-4a9e-a350-0883b0f491ae', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ef5fe5e-41c6-4a9e-a350-0883b0f491ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ff94c302f4541f9bdb0a79ab2b69a76', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=df7566ba-990c-4b19-8e24-00b407f0dad0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cfb4e190-f9b4-40e4-a648-9909f14d38e4) old=Port_Binding(mac=['fa:16:3e:3a:8b:a2 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7ef5fe5e-41c6-4a9e-a350-0883b0f491ae', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ef5fe5e-41c6-4a9e-a350-0883b0f491ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ff94c302f4541f9bdb0a79ab2b69a76', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:48.019 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cfb4e190-f9b4-40e4-a648-9909f14d38e4 in datapath 7ef5fe5e-41c6-4a9e-a350-0883b0f491ae updated
Dec 05 12:13:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:48.021 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7ef5fe5e-41c6-4a9e-a350-0883b0f491ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:13:48 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:48.022 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[81e1c502-fae0-4d00-a611-383ebf674b4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.027 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.048 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.066 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.067 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.067 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.068 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.068 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.068 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.069 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.069 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.116 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.118 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.118 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.129 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.194 187212 INFO nova.virt.libvirt.driver [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance destroyed successfully.
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.195 187212 DEBUG nova.objects.instance [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'resources' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.201 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.203 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.233 187212 DEBUG nova.virt.libvirt.vif [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-826937421',display_name='tempest-ServersNegativeTestJSON-server-826937421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-826937421',id=80,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-snx0qylv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-ServersNegativeTestJSON-1063007033-project-member',shelved_at='2025-12-05T12:13:40.769586',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='e12053fb-5eb2-4850-82fb-a7e9b54de98a'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:13:37Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=28e48516-8665-4d98-a92d-c84b7da9a284,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.235 187212 DEBUG nova.network.os_vif_util [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.237 187212 DEBUG nova.network.os_vif_util [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.238 187212 DEBUG os_vif [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.245 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.246 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape30774db-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.251 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.253 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.257 187212 INFO os_vif [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3')
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.258 187212 INFO nova.virt.libvirt.driver [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Deleting instance files /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284_del
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.265 187212 INFO nova.virt.libvirt.driver [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Deletion of /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284_del complete
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.619 187212 DEBUG nova.policy [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '536970a3e3b745ee970e691d562540eb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'edb8f9390e454b10b9cc67dd88ba920b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.650 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk 1073741824" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.651 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.653 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.654 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.711 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.713 187212 DEBUG nova.virt.disk.api [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Checking if we can resize image /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.713 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.735 187212 INFO nova.scheduler.client.report [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Deleted allocations for instance 28e48516-8665-4d98-a92d-c84b7da9a284
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.753 187212 DEBUG nova.compute.manager [req-58e989d3-b92b-41fb-b977-cbf6f80bd6a7 req-af3b7ab9-43bf-4220-a350-782958e29c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-changed-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.754 187212 DEBUG nova.compute.manager [req-58e989d3-b92b-41fb-b977-cbf6f80bd6a7 req-af3b7ab9-43bf-4220-a350-782958e29c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Refreshing instance network info cache due to event network-changed-e30774db-d3d3-4438-b68a-6f7855f55128. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.754 187212 DEBUG oslo_concurrency.lockutils [req-58e989d3-b92b-41fb-b977-cbf6f80bd6a7 req-af3b7ab9-43bf-4220-a350-782958e29c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.755 187212 DEBUG oslo_concurrency.lockutils [req-58e989d3-b92b-41fb-b977-cbf6f80bd6a7 req-af3b7ab9-43bf-4220-a350-782958e29c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.755 187212 DEBUG nova.network.neutron [req-58e989d3-b92b-41fb-b977-cbf6f80bd6a7 req-af3b7ab9-43bf-4220-a350-782958e29c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Refreshing network info cache for port e30774db-d3d3-4438-b68a-6f7855f55128 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.772 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.773 187212 DEBUG nova.virt.disk.api [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Cannot resize image /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.774 187212 DEBUG nova.objects.instance [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lazy-loading 'migration_context' on Instance uuid 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.798 187212 DEBUG oslo_concurrency.lockutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.799 187212 DEBUG oslo_concurrency.lockutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.801 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.802 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Ensure instance console log exists: /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.802 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.803 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.803 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:48 compute-0 nova_compute[187208]: 2025-12-05 12:13:48.984 187212 DEBUG nova.network.neutron [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.003 187212 DEBUG oslo_concurrency.lockutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.005 187212 DEBUG nova.compute.manager [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.020 187212 DEBUG nova.compute.provider_tree [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.043 187212 DEBUG nova.scheduler.client.report [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.073 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.075 187212 DEBUG oslo_concurrency.lockutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.136 187212 DEBUG oslo_concurrency.lockutils [None req-48a049ac-7c3c-4028-9f16-54875b9c6fd4 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 15.986s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:49 compute-0 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec 05 12:13:49 compute-0 NetworkManager[55691]: <info>  [1764936829.1557] device (tap5316adeb-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:49 compute-0 ovn_controller[95610]: 2025-12-05T12:13:49Z|00970|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 ovn_controller[95610]: 2025-12-05T12:13:49Z|00971|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec 05 12:13:49 compute-0 ovn_controller[95610]: 2025-12-05T12:13:49Z|00972|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.168 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.170 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.172 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.174 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.176 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b053c765-81ca-4afe-aa16-5f3fe6ab75c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.177 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace which is not needed anymore
Dec 05 12:13:49 compute-0 systemd[1]: machine-qemu\x2d108\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Dec 05 12:13:49 compute-0 systemd[1]: machine-qemu\x2d108\x2dinstance\x2d0000005c.scope: Consumed 13.832s CPU time.
Dec 05 12:13:49 compute-0 systemd-machined[153543]: Machine qemu-108-instance-0000005c terminated.
Dec 05 12:13:49 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[236637]: [NOTICE]   (236641) : haproxy version is 2.8.14-c23fe91
Dec 05 12:13:49 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[236637]: [NOTICE]   (236641) : path to executable is /usr/sbin/haproxy
Dec 05 12:13:49 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[236637]: [WARNING]  (236641) : Exiting Master process...
Dec 05 12:13:49 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[236637]: [WARNING]  (236641) : Exiting Master process...
Dec 05 12:13:49 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[236637]: [ALERT]    (236641) : Current worker (236643) exited with code 143 (Terminated)
Dec 05 12:13:49 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[236637]: [WARNING]  (236641) : All workers exited. Exiting... (0)
Dec 05 12:13:49 compute-0 systemd[1]: libpod-8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887.scope: Deactivated successfully.
Dec 05 12:13:49 compute-0 podman[237231]: 2025-12-05 12:13:49.331556016 +0000 UTC m=+0.052324764 container died 8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.359 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887-userdata-shm.mount: Deactivated successfully.
Dec 05 12:13:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-7ef0da064995570ab0cfc29a1c5131088e3b52fe546aa4976392ee1079b3bbaf-merged.mount: Deactivated successfully.
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.366 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 podman[237231]: 2025-12-05 12:13:49.371604534 +0000 UTC m=+0.092373262 container cleanup 8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:13:49 compute-0 systemd[1]: libpod-conmon-8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887.scope: Deactivated successfully.
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.404 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.405 187212 DEBUG nova.objects.instance [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.424 187212 DEBUG nova.virt.libvirt.vif [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:13:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.425 187212 DEBUG nova.network.os_vif_util [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.426 187212 DEBUG nova.network.os_vif_util [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.427 187212 DEBUG os_vif [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.429 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.429 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5316adeb-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.431 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.435 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.438 187212 INFO os_vif [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.447 187212 DEBUG nova.virt.libvirt.driver [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start _get_guest_xml network_info=[{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.453 187212 WARNING nova.virt.libvirt.driver [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.459 187212 DEBUG nova.virt.libvirt.host [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.460 187212 DEBUG nova.virt.libvirt.host [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.462 187212 DEBUG nova.virt.libvirt.host [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.462 187212 DEBUG nova.virt.libvirt.host [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.463 187212 DEBUG nova.virt.libvirt.driver [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.463 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.463 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.464 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.464 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.464 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.464 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.464 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.465 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.465 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.465 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.465 187212 DEBUG nova.virt.hardware [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.465 187212 DEBUG nova.objects.instance [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:49 compute-0 podman[237271]: 2025-12-05 12:13:49.467056154 +0000 UTC m=+0.063511738 container remove 8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.472 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0a67c5c6-6db4-4c96-a868-795d39cd31f1]: (4, ('Fri Dec  5 12:13:49 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887)\n8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887\nFri Dec  5 12:13:49 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887)\n8bbfe64b3ef22d684252144e3c9f1c6d71289eb5bb00c1d9182f3555f9256887\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.475 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3acafdb5-55c3-4100-8f3f-0fb45e5f8958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.475 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:49 compute-0 kernel: tapf9ed41c2-b0: left promiscuous mode
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.477 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.484 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.495 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[999e0016-5491-4376-a035-30bcc742a2ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.505 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.513 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[195196b7-fb7f-42bb-9438-1fc4c7e81059]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.514 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2b435fe2-79d4-49a2-bf1a-cd0ad56d48ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.531 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[40cac9a0-0e73-40e2-81b1-7f24c9847a06]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417135, 'reachable_time': 37316, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237289, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:49 compute-0 systemd[1]: run-netns-ovnmeta\x2df9ed41c2\x2db085\x2d41ff\x2dac71\x2d6256a4e30e85.mount: Deactivated successfully.
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.533 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:13:49 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:49.533 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b98e0b4d-f86f-472f-9549-fb79b5c84527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.554 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.555 187212 DEBUG oslo_concurrency.lockutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.555 187212 DEBUG oslo_concurrency.lockutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.556 187212 DEBUG oslo_concurrency.lockutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.558 187212 DEBUG nova.virt.libvirt.vif [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:13:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.558 187212 DEBUG nova.network.os_vif_util [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.559 187212 DEBUG nova.network.os_vif_util [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.561 187212 DEBUG nova.objects.instance [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.576 187212 DEBUG nova.virt.libvirt.driver [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <uuid>2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</uuid>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <name>instance-0000005c</name>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestJSON-server-954339420</nova:name>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:13:49</nova:creationTime>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:13:49 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:13:49 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:13:49 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:13:49 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:13:49 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:13:49 compute-0 nova_compute[187208]:         <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec 05 12:13:49 compute-0 nova_compute[187208]:         <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:13:49 compute-0 nova_compute[187208]:         <nova:port uuid="5316adeb-5a49-4a58-b997-f132a083ff13">
Dec 05 12:13:49 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <system>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <entry name="serial">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <entry name="uuid">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     </system>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <os>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   </os>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <features>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   </features>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:9a:d0:34"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <target dev="tap5316adeb-5a"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/console.log" append="off"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <video>
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     </video>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:13:49 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:13:49 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:13:49 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:13:49 compute-0 nova_compute[187208]: </domain>
Dec 05 12:13:49 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.578 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.652 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.653 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.726 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.728 187212 DEBUG nova.objects.instance [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.749 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.815 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.816 187212 DEBUG nova.virt.disk.api [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.816 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.881 187212 DEBUG oslo_concurrency.processutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.882 187212 DEBUG nova.virt.disk.api [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.883 187212 DEBUG nova.objects.instance [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.910 187212 DEBUG nova.virt.libvirt.vif [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.910 187212 DEBUG nova.network.os_vif_util [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.912 187212 DEBUG nova.network.os_vif_util [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.912 187212 DEBUG os_vif [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.914 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.915 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.919 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5316adeb-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.920 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5316adeb-5a, col_values=(('external_ids', {'iface-id': '5316adeb-5a49-4a58-b997-f132a083ff13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:d0:34', 'vm-uuid': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:49 compute-0 NetworkManager[55691]: <info>  [1764936829.9233] manager: (tap5316adeb-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/371)
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.927 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.929 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:49 compute-0 nova_compute[187208]: 2025-12-05 12:13:49.931 187212 INFO os_vif [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:13:50 compute-0 kernel: tap5316adeb-5a: entered promiscuous mode
Dec 05 12:13:50 compute-0 systemd-udevd[237214]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:13:50 compute-0 NetworkManager[55691]: <info>  [1764936830.0127] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/372)
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.013 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:50 compute-0 ovn_controller[95610]: 2025-12-05T12:13:50Z|00973|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec 05 12:13:50 compute-0 ovn_controller[95610]: 2025-12-05T12:13:50Z|00974|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.022 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:50 compute-0 ovn_controller[95610]: 2025-12-05T12:13:50Z|00975|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.023 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.026 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:13:50 compute-0 NetworkManager[55691]: <info>  [1764936830.0345] device (tap5316adeb-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:13:50 compute-0 NetworkManager[55691]: <info>  [1764936830.0365] device (tap5316adeb-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.038 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.041 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:50 compute-0 ovn_controller[95610]: 2025-12-05T12:13:50Z|00976|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec 05 12:13:50 compute-0 ovn_controller[95610]: 2025-12-05T12:13:50Z|00977|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.043 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.042 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[352ff210-396c-4e15-8894-2efb006409f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.045 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9ed41c2-b1 in ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.048 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9ed41c2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.048 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ae1ea9-d227-4715-aa7a-6f16edca0281]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.050 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0c83bf-757c-4833-b540-d00a944b584f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:50 compute-0 systemd-machined[153543]: New machine qemu-110-instance-0000005c.
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.064 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b76f98-0b24-4e75-a677-1e385884fa6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 systemd[1]: Started Virtual Machine qemu-110-instance-0000005c.
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.090 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.091 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.091 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.096 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[572ed408-c578-42ca-bd10-5e238dcd5c6b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.140 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ad955ad2-4455-4e73-9e98-0fa9f43f0626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 NetworkManager[55691]: <info>  [1764936830.1515] manager: (tapf9ed41c2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/373)
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.151 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2433bfb8-9954-4219-be48-d2984db45e1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.191 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[28c0546e-9287-405e-920f-270d9f37cd7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.194 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ef747717-f323-4c59-838f-4212472f71d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 NetworkManager[55691]: <info>  [1764936830.2247] device (tapf9ed41c2-b0): carrier: link connected
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.232 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[10372bb8-c7c1-4d2c-81fd-d099b074899e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.247 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.262 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbea9ba-90dd-421c-a647-34abc918d00a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 267], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421078, 'reachable_time': 39341, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237352, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dadda7c1-3ba5-48ba-8811-e25940d957eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:9111'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 421078, 'tstamp': 421078}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237354, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.309 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[66cd2e74-ed12-4c4b-93e9-77acb5962080]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 267], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421078, 'reachable_time': 39341, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237355, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.309 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.311 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.352 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0db5759a-1e6b-44dd-b4ed-5f1e16ef81d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.376 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.382 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.427 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[96f475e9-281d-4440-b1f1-6a493105f7e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.429 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.429 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.429 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:50 compute-0 NetworkManager[55691]: <info>  [1764936830.4321] manager: (tapf9ed41c2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Dec 05 12:13:50 compute-0 kernel: tapf9ed41c2-b0: entered promiscuous mode
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.431 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.436 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:50 compute-0 ovn_controller[95610]: 2025-12-05T12:13:50Z|00978|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.441 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.443 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.444 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4b01a10d-fb7f-47c7-b6da-b7f5e58f52f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.446 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:13:50 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:50.447 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'env', 'PROCESS_TAG=haproxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9ed41c2-b085-41ff-ac71-6256a4e30e85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.450 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.452 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.526 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.534 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.596 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.597 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.652 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.873 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.876 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5173MB free_disk=73.05379486083984GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.876 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.877 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:50 compute-0 podman[237404]: 2025-12-05 12:13:50.917706254 +0000 UTC m=+0.062160518 container create 3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.936 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936815.935955, 28e48516-8665-4d98-a92d-c84b7da9a284 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.937 187212 INFO nova.compute.manager [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Stopped (Lifecycle Event)
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.947 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 30cb83d4-3a34-4420-bc83-099b266da48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.947 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.948 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance a9c0d69f-7894-4e3f-a056-4225da882a38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.948 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.948 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.949 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:13:50 compute-0 nova_compute[187208]: 2025-12-05 12:13:50.953 187212 DEBUG nova.compute.manager [None req-ccfee8ee-509f-4011-883a-30ae80b334e0 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:50 compute-0 systemd[1]: Started libpod-conmon-3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc.scope.
Dec 05 12:13:50 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:13:50 compute-0 podman[237404]: 2025-12-05 12:13:50.88575947 +0000 UTC m=+0.030213764 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:13:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f31c5dbfdf4f785140ff1530ce746992dd132ed0b29b916c458115e2234ee385/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:13:51 compute-0 podman[237404]: 2025-12-05 12:13:51.018418966 +0000 UTC m=+0.162873250 container init 3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 05 12:13:51 compute-0 podman[237404]: 2025-12-05 12:13:51.02478884 +0000 UTC m=+0.169243104 container start 3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:13:51 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[237419]: [NOTICE]   (237423) : New worker (237425) forked
Dec 05 12:13:51 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[237419]: [NOTICE]   (237423) : Loading success.
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.060 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.076 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.103 187212 DEBUG nova.network.neutron [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Successfully created port: 39c92a24-4461-4692-8f17-0b72bbaff52f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.107 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.108 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.109 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.249 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.250 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936831.2492533, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.250 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Resumed (Lifecycle Event)
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.255 187212 DEBUG nova.compute.manager [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.259 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance rebooted successfully.
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.260 187212 DEBUG nova.compute.manager [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.357 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.361 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.388 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.389 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936831.2503695, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.389 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Started (Lifecycle Event)
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.398 187212 DEBUG oslo_concurrency.lockutils [None req-ae1da13f-71ca-4434-9b41-c3d6f2fe7ff3 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.576 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.581 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.700 187212 DEBUG nova.network.neutron [req-58e989d3-b92b-41fb-b977-cbf6f80bd6a7 req-af3b7ab9-43bf-4220-a350-782958e29c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updated VIF entry in instance network info cache for port e30774db-d3d3-4438-b68a-6f7855f55128. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.701 187212 DEBUG nova.network.neutron [req-58e989d3-b92b-41fb-b977-cbf6f80bd6a7 req-af3b7ab9-43bf-4220-a350-782958e29c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": null, "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tape30774db-d3", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:51 compute-0 nova_compute[187208]: 2025-12-05 12:13:51.848 187212 DEBUG oslo_concurrency.lockutils [req-58e989d3-b92b-41fb-b977-cbf6f80bd6a7 req-af3b7ab9-43bf-4220-a350-782958e29c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:53 compute-0 nova_compute[187208]: 2025-12-05 12:13:53.112 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:53 compute-0 nova_compute[187208]: 2025-12-05 12:13:53.112 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:53 compute-0 nova_compute[187208]: 2025-12-05 12:13:53.155 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:53 compute-0 nova_compute[187208]: 2025-12-05 12:13:53.155 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:13:53 compute-0 nova_compute[187208]: 2025-12-05 12:13:53.156 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 12:13:53 compute-0 nova_compute[187208]: 2025-12-05 12:13:53.180 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 12:13:53 compute-0 ovn_controller[95610]: 2025-12-05T12:13:53Z|00115|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:d5:72 10.100.0.12
Dec 05 12:13:53 compute-0 ovn_controller[95610]: 2025-12-05T12:13:53Z|00116|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:d5:72 10.100.0.12
Dec 05 12:13:53 compute-0 nova_compute[187208]: 2025-12-05 12:13:53.650 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:54.095 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:54.096 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.096 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.292 187212 DEBUG nova.network.neutron [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Successfully updated port: 39c92a24-4461-4692-8f17-0b72bbaff52f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.311 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "refresh_cache-67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.311 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquired lock "refresh_cache-67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.312 187212 DEBUG nova.network.neutron [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.705 187212 DEBUG nova.compute.manager [req-baffa397-948d-4325-b26d-c8bf5d18f351 req-844ce7eb-181c-443f-aa4a-9e5e9bab2563 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Received event network-changed-39c92a24-4461-4692-8f17-0b72bbaff52f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.706 187212 DEBUG nova.compute.manager [req-baffa397-948d-4325-b26d-c8bf5d18f351 req-844ce7eb-181c-443f-aa4a-9e5e9bab2563 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Refreshing instance network info cache due to event network-changed-39c92a24-4461-4692-8f17-0b72bbaff52f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.706 187212 DEBUG oslo_concurrency.lockutils [req-baffa397-948d-4325-b26d-c8bf5d18f351 req-844ce7eb-181c-443f-aa4a-9e5e9bab2563 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.771 187212 DEBUG nova.network.neutron [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.860 187212 INFO nova.compute.manager [None req-9fad865d-ac67-4d6f-bac1-4a33b3431b76 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Get console output
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.866 213424 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 12:13:54 compute-0 nova_compute[187208]: 2025-12-05 12:13:54.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:55 compute-0 podman[237455]: 2025-12-05 12:13:55.215061918 +0000 UTC m=+0.060399397 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.126 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.127 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.127 187212 INFO nova.compute.manager [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Unshelving
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.230 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.230 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.235 187212 DEBUG nova.objects.instance [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'pci_requests' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.250 187212 DEBUG nova.objects.instance [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'numa_topology' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.266 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.267 187212 INFO nova.compute.claims [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.465 187212 DEBUG nova.scheduler.client.report [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.492 187212 DEBUG nova.scheduler.client.report [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.493 187212 DEBUG nova.compute.provider_tree [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.511 187212 DEBUG nova.scheduler.client.report [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.539 187212 DEBUG nova.scheduler.client.report [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.607 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.688 187212 DEBUG nova.compute.provider_tree [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.708 187212 DEBUG nova.scheduler.client.report [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.719 187212 DEBUG nova.virt.libvirt.driver [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.738 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:56 compute-0 nova_compute[187208]: 2025-12-05 12:13:56.973 187212 INFO nova.network.neutron [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating port e30774db-d3d3-4438-b68a-6f7855f55128 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.146 187212 DEBUG nova.network.neutron [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Updating instance_info_cache with network_info: [{"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.176 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Releasing lock "refresh_cache-67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.177 187212 DEBUG nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Instance network_info: |[{"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.177 187212 DEBUG oslo_concurrency.lockutils [req-baffa397-948d-4325-b26d-c8bf5d18f351 req-844ce7eb-181c-443f-aa4a-9e5e9bab2563 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.178 187212 DEBUG nova.network.neutron [req-baffa397-948d-4325-b26d-c8bf5d18f351 req-844ce7eb-181c-443f-aa4a-9e5e9bab2563 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Refreshing network info cache for port 39c92a24-4461-4692-8f17-0b72bbaff52f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.182 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Start _get_guest_xml network_info=[{"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.187 187212 WARNING nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.192 187212 DEBUG nova.virt.libvirt.host [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.192 187212 DEBUG nova.virt.libvirt.host [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.203 187212 DEBUG nova.virt.libvirt.host [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.203 187212 DEBUG nova.virt.libvirt.host [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.204 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.204 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.204 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.204 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.205 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.205 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.205 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.205 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.205 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.206 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.206 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.206 187212 DEBUG nova.virt.hardware [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.210 187212 DEBUG nova.virt.libvirt.vif [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:13:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-426042225',display_name='tempest-ServerAddressesTestJSON-server-426042225',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-426042225',id=95,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='edb8f9390e454b10b9cc67dd88ba920b',ramdisk_id='',reservation_id='r-kbmw8qwa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1789644686',owner_user_name='tempest-ServerAddressesTestJSON-1789644686-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:47Z,user_data=None,user_id='536970a3e3b745ee970e691d562540eb',uuid=67b4beef-63ef-4afd-8a2b-35c28d4f1e0b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.210 187212 DEBUG nova.network.os_vif_util [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Converting VIF {"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.211 187212 DEBUG nova.network.os_vif_util [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:c4:84,bridge_name='br-int',has_traffic_filtering=True,id=39c92a24-4461-4692-8f17-0b72bbaff52f,network=Network(932c7aba-dc47-4543-928a-a0b2cdf62766),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c92a24-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.212 187212 DEBUG nova.objects.instance [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lazy-loading 'pci_devices' on Instance uuid 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.226 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <uuid>67b4beef-63ef-4afd-8a2b-35c28d4f1e0b</uuid>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <name>instance-0000005f</name>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerAddressesTestJSON-server-426042225</nova:name>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:13:57</nova:creationTime>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:13:57 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:13:57 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:13:57 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:13:57 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:13:57 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:13:57 compute-0 nova_compute[187208]:         <nova:user uuid="536970a3e3b745ee970e691d562540eb">tempest-ServerAddressesTestJSON-1789644686-project-member</nova:user>
Dec 05 12:13:57 compute-0 nova_compute[187208]:         <nova:project uuid="edb8f9390e454b10b9cc67dd88ba920b">tempest-ServerAddressesTestJSON-1789644686</nova:project>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:13:57 compute-0 nova_compute[187208]:         <nova:port uuid="39c92a24-4461-4692-8f17-0b72bbaff52f">
Dec 05 12:13:57 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <system>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <entry name="serial">67b4beef-63ef-4afd-8a2b-35c28d4f1e0b</entry>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <entry name="uuid">67b4beef-63ef-4afd-8a2b-35c28d4f1e0b</entry>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     </system>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <os>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   </os>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <features>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   </features>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk.config"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:10:c4:84"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <target dev="tap39c92a24-44"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/console.log" append="off"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <video>
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     </video>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:13:57 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:13:57 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:13:57 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:13:57 compute-0 nova_compute[187208]: </domain>
Dec 05 12:13:57 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.227 187212 DEBUG nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Preparing to wait for external event network-vif-plugged-39c92a24-4461-4692-8f17-0b72bbaff52f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.227 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.227 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.227 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.228 187212 DEBUG nova.virt.libvirt.vif [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:13:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-426042225',display_name='tempest-ServerAddressesTestJSON-server-426042225',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-426042225',id=95,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='edb8f9390e454b10b9cc67dd88ba920b',ramdisk_id='',reservation_id='r-kbmw8qwa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1789644686',owner_user_name='tempest-ServerAddressesTestJSON-1789644686-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:47Z,user_data=None,user_id='536970a3e3b745ee970e691d562540eb',uuid=67b4beef-63ef-4afd-8a2b-35c28d4f1e0b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.228 187212 DEBUG nova.network.os_vif_util [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Converting VIF {"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.229 187212 DEBUG nova.network.os_vif_util [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:c4:84,bridge_name='br-int',has_traffic_filtering=True,id=39c92a24-4461-4692-8f17-0b72bbaff52f,network=Network(932c7aba-dc47-4543-928a-a0b2cdf62766),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c92a24-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.229 187212 DEBUG os_vif [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:c4:84,bridge_name='br-int',has_traffic_filtering=True,id=39c92a24-4461-4692-8f17-0b72bbaff52f,network=Network(932c7aba-dc47-4543-928a-a0b2cdf62766),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c92a24-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.230 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.230 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.230 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.235 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.235 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39c92a24-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.236 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap39c92a24-44, col_values=(('external_ids', {'iface-id': '39c92a24-4461-4692-8f17-0b72bbaff52f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:c4:84', 'vm-uuid': '67b4beef-63ef-4afd-8a2b-35c28d4f1e0b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.237 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:57 compute-0 NetworkManager[55691]: <info>  [1764936837.2385] manager: (tap39c92a24-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/375)
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.238 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.245 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.245 187212 INFO os_vif [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:c4:84,bridge_name='br-int',has_traffic_filtering=True,id=39c92a24-4461-4692-8f17-0b72bbaff52f,network=Network(932c7aba-dc47-4543-928a-a0b2cdf62766),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c92a24-44')
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.309 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.309 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.309 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] No VIF found with MAC fa:16:3e:10:c4:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:13:57 compute-0 nova_compute[187208]: 2025-12-05 12:13:57.310 187212 INFO nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Using config drive
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.099 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.290 187212 INFO nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Creating config drive at /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk.config
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.296 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpggvyemsv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.320 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.321 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquired lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.321 187212 DEBUG nova.network.neutron [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.422 187212 DEBUG oslo_concurrency.processutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpggvyemsv" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:13:58 compute-0 kernel: tap39c92a24-44: entered promiscuous mode
Dec 05 12:13:58 compute-0 NetworkManager[55691]: <info>  [1764936838.4964] manager: (tap39c92a24-44): new Tun device (/org/freedesktop/NetworkManager/Devices/376)
Dec 05 12:13:58 compute-0 ovn_controller[95610]: 2025-12-05T12:13:58Z|00979|binding|INFO|Claiming lport 39c92a24-4461-4692-8f17-0b72bbaff52f for this chassis.
Dec 05 12:13:58 compute-0 ovn_controller[95610]: 2025-12-05T12:13:58Z|00980|binding|INFO|39c92a24-4461-4692-8f17-0b72bbaff52f: Claiming fa:16:3e:10:c4:84 10.100.0.9
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.496 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.514 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:c4:84 10.100.0.9'], port_security=['fa:16:3e:10:c4:84 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '67b4beef-63ef-4afd-8a2b-35c28d4f1e0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-932c7aba-dc47-4543-928a-a0b2cdf62766', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'edb8f9390e454b10b9cc67dd88ba920b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '037bf34d-aae3-4661-a7d3-ff4e792f2db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c82e8883-5380-4e89-b64c-1fe3f5c34dee, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=39c92a24-4461-4692-8f17-0b72bbaff52f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.517 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 39c92a24-4461-4692-8f17-0b72bbaff52f in datapath 932c7aba-dc47-4543-928a-a0b2cdf62766 bound to our chassis
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.520 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 932c7aba-dc47-4543-928a-a0b2cdf62766
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.521 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:58 compute-0 ovn_controller[95610]: 2025-12-05T12:13:58Z|00981|binding|INFO|Setting lport 39c92a24-4461-4692-8f17-0b72bbaff52f ovn-installed in OVS
Dec 05 12:13:58 compute-0 ovn_controller[95610]: 2025-12-05T12:13:58Z|00982|binding|INFO|Setting lport 39c92a24-4461-4692-8f17-0b72bbaff52f up in Southbound
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.524 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.537 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb8b13b-005d-4d88-9f15-9377dae51b0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.539 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap932c7aba-d1 in ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.543 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.542 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap932c7aba-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.542 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa385a4b-995a-4452-bff8-f47273a88375]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.547 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c1180ea1-5ea9-4d76-8b63-f0e276de9bfa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 systemd-udevd[237500]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.560 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b0af136c-a948-40b8-8ec1-4abba668f7cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 NetworkManager[55691]: <info>  [1764936838.5663] device (tap39c92a24-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:13:58 compute-0 systemd-machined[153543]: New machine qemu-111-instance-0000005f.
Dec 05 12:13:58 compute-0 NetworkManager[55691]: <info>  [1764936838.5686] device (tap39c92a24-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.581 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78ccee1a-66fd-4ca2-af0c-53d532441344]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 systemd[1]: Started Virtual Machine qemu-111-instance-0000005f.
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.620 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f3b62755-79a5-40e4-9e24-9f29e588c3e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 NetworkManager[55691]: <info>  [1764936838.6326] manager: (tap932c7aba-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/377)
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.631 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4d1e7332-1b54-42e3-b650-fac46d8ce4c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.678 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1f737fa9-434a-4ac0-bbc6-00506b199ed4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.682 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ea1299b0-f1f2-4c6b-a7fd-ffe3beebd2ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.694 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:58 compute-0 NetworkManager[55691]: <info>  [1764936838.7135] device (tap932c7aba-d0): carrier: link connected
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.720 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4412623d-ff79-4f21-a88e-5d8447abbd92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.744 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[67197d1a-19de-4d3e-893a-3babe0b7f2f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap932c7aba-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:ad:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 269], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421927, 'reachable_time': 29695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237533, 'error': None, 'target': 'ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.761 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2189f2-6183-48d5-a3ab-e874c4398608]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:ade0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 421927, 'tstamp': 421927}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237534, 'error': None, 'target': 'ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.781 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[185d71d1-bd74-421b-a8f0-5b7b6f1bfd38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap932c7aba-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:ad:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 269], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421927, 'reachable_time': 29695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237535, 'error': None, 'target': 'ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.817 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[646d4b95-84fb-4740-bc64-1fa57fff5368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.883 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[94c77eeb-a93a-4040-98da-9c8dd0df5e0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.885 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap932c7aba-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.885 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.886 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap932c7aba-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.888 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:58 compute-0 NetworkManager[55691]: <info>  [1764936838.8893] manager: (tap932c7aba-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/378)
Dec 05 12:13:58 compute-0 kernel: tap932c7aba-d0: entered promiscuous mode
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.893 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.894 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap932c7aba-d0, col_values=(('external_ids', {'iface-id': 'bd19a90d-5e66-43dc-8ccc-9be76ee50afe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.895 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:58 compute-0 ovn_controller[95610]: 2025-12-05T12:13:58Z|00983|binding|INFO|Releasing lport bd19a90d-5e66-43dc-8ccc-9be76ee50afe from this chassis (sb_readonly=0)
Dec 05 12:13:58 compute-0 nova_compute[187208]: 2025-12-05 12:13:58.908 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.910 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/932c7aba-dc47-4543-928a-a0b2cdf62766.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/932c7aba-dc47-4543-928a-a0b2cdf62766.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.911 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1912f1ca-0d52-4de6-a650-23b11d671e0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.912 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-932c7aba-dc47-4543-928a-a0b2cdf62766
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/932c7aba-dc47-4543-928a-a0b2cdf62766.pid.haproxy
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 932c7aba-dc47-4543-928a-a0b2cdf62766
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:13:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:58.913 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766', 'env', 'PROCESS_TAG=haproxy-932c7aba-dc47-4543-928a-a0b2cdf62766', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/932c7aba-dc47-4543-928a-a0b2cdf62766.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:13:59 compute-0 kernel: tapfcb6e165-bc (unregistering): left promiscuous mode
Dec 05 12:13:59 compute-0 NetworkManager[55691]: <info>  [1764936839.0861] device (tapfcb6e165-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.095 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:59 compute-0 ovn_controller[95610]: 2025-12-05T12:13:59Z|00984|binding|INFO|Releasing lport fcb6e165-bcf0-439d-849c-dc8819a32db9 from this chassis (sb_readonly=0)
Dec 05 12:13:59 compute-0 ovn_controller[95610]: 2025-12-05T12:13:59Z|00985|binding|INFO|Setting lport fcb6e165-bcf0-439d-849c-dc8819a32db9 down in Southbound
Dec 05 12:13:59 compute-0 ovn_controller[95610]: 2025-12-05T12:13:59Z|00986|binding|INFO|Removing iface tapfcb6e165-bc ovn-installed in OVS
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.099 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.111 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.115 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:d5:72 10.100.0.12'], port_security=['fa:16:3e:39:d5:72 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a9c0d69f-7894-4e3f-a056-4225da882a38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=fcb6e165-bcf0-439d-849c-dc8819a32db9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:13:59 compute-0 systemd[1]: machine-qemu\x2d109\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Dec 05 12:13:59 compute-0 systemd[1]: machine-qemu\x2d109\x2dinstance\x2d0000005e.scope: Consumed 13.147s CPU time.
Dec 05 12:13:59 compute-0 systemd-machined[153543]: Machine qemu-109-instance-0000005e terminated.
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.225 187212 DEBUG nova.compute.manager [req-c1cc832b-9167-43a2-9b1c-a9bf4fd1d737 req-809ebcd8-28af-4640-bac0-f6e063de3f4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-changed-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.226 187212 DEBUG nova.compute.manager [req-c1cc832b-9167-43a2-9b1c-a9bf4fd1d737 req-809ebcd8-28af-4640-bac0-f6e063de3f4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Refreshing instance network info cache due to event network-changed-e30774db-d3d3-4438-b68a-6f7855f55128. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.226 187212 DEBUG oslo_concurrency.lockutils [req-c1cc832b-9167-43a2-9b1c-a9bf4fd1d737 req-809ebcd8-28af-4640-bac0-f6e063de3f4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:13:59 compute-0 NetworkManager[55691]: <info>  [1764936839.3172] manager: (tapfcb6e165-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/379)
Dec 05 12:13:59 compute-0 podman[237578]: 2025-12-05 12:13:59.371809988 +0000 UTC m=+0.081738934 container create 053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.385 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936839.3851273, 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.386 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] VM Started (Lifecycle Event)
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.415 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:59 compute-0 systemd[1]: Started libpod-conmon-053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8.scope.
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.419 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936839.3851695, 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.419 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] VM Paused (Lifecycle Event)
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.422 187212 DEBUG nova.compute.manager [req-af6485dd-dc3b-4fbe-8e9f-19bc93bd14e2 req-dfb43af0-52b4-4683-9949-09845fa8f82c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.422 187212 DEBUG oslo_concurrency.lockutils [req-af6485dd-dc3b-4fbe-8e9f-19bc93bd14e2 req-dfb43af0-52b4-4683-9949-09845fa8f82c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.422 187212 DEBUG oslo_concurrency.lockutils [req-af6485dd-dc3b-4fbe-8e9f-19bc93bd14e2 req-dfb43af0-52b4-4683-9949-09845fa8f82c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.423 187212 DEBUG oslo_concurrency.lockutils [req-af6485dd-dc3b-4fbe-8e9f-19bc93bd14e2 req-dfb43af0-52b4-4683-9949-09845fa8f82c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.423 187212 DEBUG nova.compute.manager [req-af6485dd-dc3b-4fbe-8e9f-19bc93bd14e2 req-dfb43af0-52b4-4683-9949-09845fa8f82c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.423 187212 WARNING nova.compute.manager [req-af6485dd-dc3b-4fbe-8e9f-19bc93bd14e2 req-dfb43af0-52b4-4683-9949-09845fa8f82c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:13:59 compute-0 podman[237578]: 2025-12-05 12:13:59.333195641 +0000 UTC m=+0.043124597 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.442 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.447 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:13:59 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:13:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc8262d2f54083b31d291a2dbe31e430191bd54a85bf51d494d995c7b9c8bc74/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.469 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:13:59 compute-0 podman[237578]: 2025-12-05 12:13:59.474179587 +0000 UTC m=+0.184108553 container init 053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 12:13:59 compute-0 podman[237578]: 2025-12-05 12:13:59.480621914 +0000 UTC m=+0.190550850 container start 053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 12:13:59 compute-0 neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766[237611]: [NOTICE]   (237615) : New worker (237617) forked
Dec 05 12:13:59 compute-0 neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766[237611]: [NOTICE]   (237615) : Loading success.
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.682 104471 INFO neutron.agent.ovn.metadata.agent [-] Port fcb6e165-bcf0-439d-849c-dc8819a32db9 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 unbound from our chassis
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.686 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c025e40-a124-4810-9d75-2a59e91db1b3
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.706 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aae77b06-650f-4f49-ac61-a5cd6c9b8746]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.737 187212 INFO nova.virt.libvirt.driver [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Instance shutdown successfully after 13 seconds.
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.741 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[370c86ff-2479-4b32-b9e3-e00488446562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.745 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[46bef1a9-2e93-4159-bc36-b77cf77cd608]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.750 187212 INFO nova.virt.libvirt.driver [-] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Instance destroyed successfully.
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.751 187212 DEBUG nova.objects.instance [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'numa_topology' on Instance uuid a9c0d69f-7894-4e3f-a056-4225da882a38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.764 187212 DEBUG nova.compute.manager [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.771 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca8faf9-e97c-45c5-9176-f5eee70d5fd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.788 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4aba60b2-930e-4f9e-80c0-d1224589ddaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c025e40-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:a4:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 27, 'rx_bytes': 700, 'tx_bytes': 1278, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 27, 'rx_bytes': 700, 'tx_bytes': 1278, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402470, 'reachable_time': 26223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237631, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.806 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2de19379-4f56-4ac5-9eb2-e80e29e00ef1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402483, 'tstamp': 402483}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237632, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0c025e40-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402485, 'tstamp': 402485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237632, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.808 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.810 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.816 187212 DEBUG oslo_concurrency.lockutils [None req-eea621a0-24fa-4344-8292-a288e35e1430 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.817 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c025e40-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.817 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:13:59 compute-0 nova_compute[187208]: 2025-12-05 12:13:59.817 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.817 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c025e40-a0, col_values=(('external_ids', {'iface-id': 'c15f026e-161e-4d8d-81ec-2dd0eb1e85f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:13:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:13:59.818 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.572 187212 DEBUG nova.network.neutron [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.598 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Releasing lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.600 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.601 187212 INFO nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Creating image(s)
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.602 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.603 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.605 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.605 187212 DEBUG nova.objects.instance [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.606 187212 DEBUG oslo_concurrency.lockutils [req-c1cc832b-9167-43a2-9b1c-a9bf4fd1d737 req-809ebcd8-28af-4640-bac0-f6e063de3f4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.607 187212 DEBUG nova.network.neutron [req-c1cc832b-9167-43a2-9b1c-a9bf4fd1d737 req-809ebcd8-28af-4640-bac0-f6e063de3f4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Refreshing network info cache for port e30774db-d3d3-4438-b68a-6f7855f55128 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.630 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "f6efb356df4fb897f7bcc70928d045e44798ba61" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.631 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "f6efb356df4fb897f7bcc70928d045e44798ba61" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.954 187212 DEBUG nova.network.neutron [req-baffa397-948d-4325-b26d-c8bf5d18f351 req-844ce7eb-181c-443f-aa4a-9e5e9bab2563 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Updated VIF entry in instance network info cache for port 39c92a24-4461-4692-8f17-0b72bbaff52f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.955 187212 DEBUG nova.network.neutron [req-baffa397-948d-4325-b26d-c8bf5d18f351 req-844ce7eb-181c-443f-aa4a-9e5e9bab2563 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Updating instance_info_cache with network_info: [{"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:00 compute-0 nova_compute[187208]: 2025-12-05 12:14:00.974 187212 DEBUG oslo_concurrency.lockutils [req-baffa397-948d-4325-b26d-c8bf5d18f351 req-844ce7eb-181c-443f-aa4a-9e5e9bab2563 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:14:01 compute-0 podman[237634]: 2025-12-05 12:14:01.294956229 +0000 UTC m=+0.144282813 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3)
Dec 05 12:14:02 compute-0 nova_compute[187208]: 2025-12-05 12:14:02.395 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:02 compute-0 nova_compute[187208]: 2025-12-05 12:14:02.633 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:02 compute-0 nova_compute[187208]: 2025-12-05 12:14:02.712 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61.part --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:02 compute-0 nova_compute[187208]: 2025-12-05 12:14:02.714 187212 DEBUG nova.virt.images [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] e12053fb-5eb2-4850-82fb-a7e9b54de98a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 05 12:14:02 compute-0 nova_compute[187208]: 2025-12-05 12:14:02.715 187212 DEBUG nova.privsep.utils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 12:14:02 compute-0 nova_compute[187208]: 2025-12-05 12:14:02.715 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61.part /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:03.018 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:03.020 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:03.021 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.195 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61.part /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61.converted" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.206 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.276 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61.converted --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.278 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "f6efb356df4fb897f7bcc70928d045e44798ba61" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.296 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.364 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.366 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "f6efb356df4fb897f7bcc70928d045e44798ba61" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.366 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "f6efb356df4fb897f7bcc70928d045e44798ba61" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.378 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.462 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.463 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61,backing_fmt=raw /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.501 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61,backing_fmt=raw /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.502 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "f6efb356df4fb897f7bcc70928d045e44798ba61" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.503 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.582 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.583 187212 DEBUG nova.objects.instance [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'migration_context' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.601 187212 INFO nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Rebasing disk image.
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.602 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.664 187212 DEBUG nova.network.neutron [req-c1cc832b-9167-43a2-9b1c-a9bf4fd1d737 req-809ebcd8-28af-4640-bac0-f6e063de3f4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updated VIF entry in instance network info cache for port e30774db-d3d3-4438-b68a-6f7855f55128. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.665 187212 DEBUG nova.network.neutron [req-c1cc832b-9167-43a2-9b1c-a9bf4fd1d737 req-809ebcd8-28af-4640-bac0-f6e063de3f4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.669 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.670 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.693 187212 DEBUG oslo_concurrency.lockutils [req-c1cc832b-9167-43a2-9b1c-a9bf4fd1d737 req-809ebcd8-28af-4640-bac0-f6e063de3f4f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.694 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.727 187212 DEBUG nova.compute.manager [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Received event network-vif-unplugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.728 187212 DEBUG oslo_concurrency.lockutils [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.728 187212 DEBUG oslo_concurrency.lockutils [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.728 187212 DEBUG oslo_concurrency.lockutils [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.729 187212 DEBUG nova.compute.manager [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] No waiting events found dispatching network-vif-unplugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.729 187212 WARNING nova.compute.manager [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Received unexpected event network-vif-unplugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 for instance with vm_state stopped and task_state None.
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.729 187212 DEBUG nova.compute.manager [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Received event network-vif-plugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.729 187212 DEBUG oslo_concurrency.lockutils [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.729 187212 DEBUG oslo_concurrency.lockutils [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.730 187212 DEBUG oslo_concurrency.lockutils [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.730 187212 DEBUG nova.compute.manager [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] No waiting events found dispatching network-vif-plugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.730 187212 WARNING nova.compute.manager [req-4bc0678e-cb0e-4cad-8bb1-e268ecbdf99c req-bf6bb253-38c3-4723-9d41-735d3c8560c0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Received unexpected event network-vif-plugged-fcb6e165-bcf0-439d-849c-dc8819a32db9 for instance with vm_state stopped and task_state None.
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.809 187212 DEBUG nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.809 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.809 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.810 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.810 187212 DEBUG nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.810 187212 WARNING nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.810 187212 DEBUG nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.810 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.811 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.811 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.811 187212 DEBUG nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.811 187212 WARNING nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.811 187212 DEBUG nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.811 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.812 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.812 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.812 187212 DEBUG nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.812 187212 WARNING nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.812 187212 DEBUG nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Received event network-vif-plugged-39c92a24-4461-4692-8f17-0b72bbaff52f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.812 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.813 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.813 187212 DEBUG oslo_concurrency.lockutils [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.813 187212 DEBUG nova.compute.manager [req-fea426a8-5b6f-4f59-8664-0f25c3d5da29 req-b61448ba-cf18-456a-8e46-51b4942a4436 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Processing event network-vif-plugged-39c92a24-4461-4692-8f17-0b72bbaff52f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.814 187212 DEBUG nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.817 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936843.8170645, 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.817 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] VM Resumed (Lifecycle Event)
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.819 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.848 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.852 187212 INFO nova.virt.libvirt.driver [-] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Instance spawned successfully.
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.852 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.857 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.874 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.878 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.879 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.879 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.880 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.880 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.881 187212 DEBUG nova.virt.libvirt.driver [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.934 187212 INFO nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Took 15.92 seconds to spawn the instance on the hypervisor.
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.935 187212 DEBUG nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:03 compute-0 nova_compute[187208]: 2025-12-05 12:14:03.998 187212 INFO nova.compute.manager [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Took 17.05 seconds to build instance.
Dec 05 12:14:04 compute-0 nova_compute[187208]: 2025-12-05 12:14:04.017 187212 DEBUG oslo_concurrency.lockutils [None req-51aadd24-733d-4662-b4a2-b26264c10a1c 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:04 compute-0 ovn_controller[95610]: 2025-12-05T12:14:04Z|00117|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:14:04 compute-0 nova_compute[187208]: 2025-12-05 12:14:04.509 187212 DEBUG oslo_concurrency.lockutils [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:04 compute-0 nova_compute[187208]: 2025-12-05 12:14:04.510 187212 DEBUG oslo_concurrency.lockutils [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:04 compute-0 nova_compute[187208]: 2025-12-05 12:14:04.511 187212 DEBUG nova.compute.manager [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:04 compute-0 nova_compute[187208]: 2025-12-05 12:14:04.514 187212 DEBUG nova.compute.manager [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Dec 05 12:14:04 compute-0 nova_compute[187208]: 2025-12-05 12:14:04.516 187212 DEBUG nova.objects.instance [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:04 compute-0 nova_compute[187208]: 2025-12-05 12:14:04.546 187212 DEBUG nova.virt.libvirt.driver [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.115 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk" returned: 0 in 1.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.115 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.115 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Ensure instance console log exists: /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.116 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.116 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.116 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.118 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Start _get_guest_xml network_info=[{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c5e2407a973c16cad34cbd551c25b4c1',container_format='bare',created_at=2025-12-05T12:13:33Z,direct_url=<?>,disk_format='qcow2',id=e12053fb-5eb2-4850-82fb-a7e9b54de98a,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-826937421-shelved',owner='c5b34686513f4abc8165113eb8c6831e',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-12-05T12:13:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.123 187212 WARNING nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.127 187212 DEBUG nova.virt.libvirt.host [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.128 187212 DEBUG nova.virt.libvirt.host [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.130 187212 DEBUG nova.virt.libvirt.host [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.130 187212 DEBUG nova.virt.libvirt.host [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.131 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.131 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c5e2407a973c16cad34cbd551c25b4c1',container_format='bare',created_at=2025-12-05T12:13:33Z,direct_url=<?>,disk_format='qcow2',id=e12053fb-5eb2-4850-82fb-a7e9b54de98a,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-826937421-shelved',owner='c5b34686513f4abc8165113eb8c6831e',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-12-05T12:13:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.131 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.131 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.132 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.132 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.132 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.132 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.132 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.133 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.133 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.133 187212 DEBUG nova.virt.hardware [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.133 187212 DEBUG nova.objects.instance [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.154 187212 DEBUG nova.virt.libvirt.vif [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-826937421',display_name='tempest-ServersNegativeTestJSON-server-826937421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-826937421',id=80,image_ref='e12053fb-5eb2-4850-82fb-a7e9b54de98a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-snx0qylv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-ServersNegativeTestJSON-1063007033-project-member',shelved_at='2025-12-05T12:13:40.769586',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='e12053fb-5eb2-4850-82fb-a7e9b54de98a'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:56Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=28e48516-8665-4d98-a92d-c84b7da9a284,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.155 187212 DEBUG nova.network.os_vif_util [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.156 187212 DEBUG nova.network.os_vif_util [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.156 187212 DEBUG nova.objects.instance [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'pci_devices' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.168 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <uuid>28e48516-8665-4d98-a92d-c84b7da9a284</uuid>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <name>instance-00000050</name>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersNegativeTestJSON-server-826937421</nova:name>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:14:05</nova:creationTime>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:14:05 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:14:05 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:14:05 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:14:05 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:14:05 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:14:05 compute-0 nova_compute[187208]:         <nova:user uuid="e90fa3a379b4494c84626bb6a761cd30">tempest-ServersNegativeTestJSON-1063007033-project-member</nova:user>
Dec 05 12:14:05 compute-0 nova_compute[187208]:         <nova:project uuid="c5b34686513f4abc8165113eb8c6831e">tempest-ServersNegativeTestJSON-1063007033</nova:project>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="e12053fb-5eb2-4850-82fb-a7e9b54de98a"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:14:05 compute-0 nova_compute[187208]:         <nova:port uuid="e30774db-d3d3-4438-b68a-6f7855f55128">
Dec 05 12:14:05 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <system>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <entry name="serial">28e48516-8665-4d98-a92d-c84b7da9a284</entry>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <entry name="uuid">28e48516-8665-4d98-a92d-c84b7da9a284</entry>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     </system>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <os>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   </os>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <features>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   </features>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.config"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:50:8e:78"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <target dev="tape30774db-d3"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/console.log" append="off"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <video>
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     </video>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:14:05 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:14:05 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:14:05 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:14:05 compute-0 nova_compute[187208]: </domain>
Dec 05 12:14:05 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.169 187212 DEBUG nova.compute.manager [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Preparing to wait for external event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.169 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.170 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.170 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.171 187212 DEBUG nova.virt.libvirt.vif [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-826937421',display_name='tempest-ServersNegativeTestJSON-server-826937421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-826937421',id=80,image_ref='e12053fb-5eb2-4850-82fb-a7e9b54de98a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-snx0qylv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_mode
l='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-ServersNegativeTestJSON-1063007033-project-member',shelved_at='2025-12-05T12:13:40.769586',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='e12053fb-5eb2-4850-82fb-a7e9b54de98a'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:13:56Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=28e48516-8665-4d98-a92d-c84b7da9a284,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.171 187212 DEBUG nova.network.os_vif_util [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.171 187212 DEBUG nova.network.os_vif_util [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.172 187212 DEBUG os_vif [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.173 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.173 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.176 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.176 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape30774db-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.177 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape30774db-d3, col_values=(('external_ids', {'iface-id': 'e30774db-d3d3-4438-b68a-6f7855f55128', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:8e:78', 'vm-uuid': '28e48516-8665-4d98-a92d-c84b7da9a284'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:05 compute-0 NetworkManager[55691]: <info>  [1764936845.1797] manager: (tape30774db-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/380)
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.187 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.189 187212 INFO os_vif [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3')
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.300 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.301 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.302 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] No VIF found with MAC fa:16:3e:50:8e:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.302 187212 INFO nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Using config drive
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.319 187212 DEBUG nova.objects.instance [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.357 187212 DEBUG nova.objects.instance [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'keypairs' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.449 187212 DEBUG oslo_concurrency.lockutils [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.450 187212 DEBUG oslo_concurrency.lockutils [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.450 187212 DEBUG oslo_concurrency.lockutils [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.451 187212 DEBUG oslo_concurrency.lockutils [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.451 187212 DEBUG oslo_concurrency.lockutils [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.452 187212 INFO nova.compute.manager [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Terminating instance
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.453 187212 DEBUG nova.compute.manager [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.459 187212 INFO nova.virt.libvirt.driver [-] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Instance destroyed successfully.
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.460 187212 DEBUG nova.objects.instance [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'resources' on Instance uuid a9c0d69f-7894-4e3f-a056-4225da882a38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.470 187212 DEBUG nova.virt.libvirt.vif [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:13:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1799618215',display_name='tempest-Íñstáñcé-865479247',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1799618215',id=94,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-1w6sm2sq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1492365581',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:14:00Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=a9c0d69f-7894-4e3f-a056-4225da882a38,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.471 187212 DEBUG nova.network.os_vif_util [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "address": "fa:16:3e:39:d5:72", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcb6e165-bc", "ovs_interfaceid": "fcb6e165-bcf0-439d-849c-dc8819a32db9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.471 187212 DEBUG nova.network.os_vif_util [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:d5:72,bridge_name='br-int',has_traffic_filtering=True,id=fcb6e165-bcf0-439d-849c-dc8819a32db9,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcb6e165-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.472 187212 DEBUG os_vif [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:d5:72,bridge_name='br-int',has_traffic_filtering=True,id=fcb6e165-bcf0-439d-849c-dc8819a32db9,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcb6e165-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.474 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfcb6e165-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.475 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.477 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.483 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.486 187212 INFO os_vif [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:d5:72,bridge_name='br-int',has_traffic_filtering=True,id=fcb6e165-bcf0-439d-849c-dc8819a32db9,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcb6e165-bc')
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.486 187212 INFO nova.virt.libvirt.driver [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Deleting instance files /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38_del
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.487 187212 INFO nova.virt.libvirt.driver [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Deletion of /var/lib/nova/instances/a9c0d69f-7894-4e3f-a056-4225da882a38_del complete
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.546 187212 INFO nova.compute.manager [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Took 0.09 seconds to destroy the instance on the hypervisor.
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.546 187212 DEBUG oslo.service.loopingcall [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.548 187212 DEBUG nova.compute.manager [-] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:14:05 compute-0 nova_compute[187208]: 2025-12-05 12:14:05.549 187212 DEBUG nova.network.neutron [-] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.100 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.132 187212 WARNING nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] While synchronizing instance power states, found 5 instances in the database and 4 instances on the hypervisor.
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.133 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 30cb83d4-3a34-4420-bc83-099b266da48c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.133 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 28e48516-8665-4d98-a92d-c84b7da9a284 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.133 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.133 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid a9c0d69f-7894-4e3f-a056-4225da882a38 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.134 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.134 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.134 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "30cb83d4-3a34-4420-bc83-099b266da48c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.135 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.135 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.135 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "a9c0d69f-7894-4e3f-a056-4225da882a38" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.135 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.135 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.190 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.191 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "30cb83d4-3a34-4420-bc83-099b266da48c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.256 187212 DEBUG nova.compute.manager [req-58fe6739-2dce-49f5-abf4-23db37101b67 req-c07e992e-9e89-45e9-a994-c7340dfcce41 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Received event network-vif-plugged-39c92a24-4461-4692-8f17-0b72bbaff52f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.256 187212 DEBUG oslo_concurrency.lockutils [req-58fe6739-2dce-49f5-abf4-23db37101b67 req-c07e992e-9e89-45e9-a994-c7340dfcce41 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.256 187212 DEBUG oslo_concurrency.lockutils [req-58fe6739-2dce-49f5-abf4-23db37101b67 req-c07e992e-9e89-45e9-a994-c7340dfcce41 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.257 187212 DEBUG oslo_concurrency.lockutils [req-58fe6739-2dce-49f5-abf4-23db37101b67 req-c07e992e-9e89-45e9-a994-c7340dfcce41 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.257 187212 DEBUG nova.compute.manager [req-58fe6739-2dce-49f5-abf4-23db37101b67 req-c07e992e-9e89-45e9-a994-c7340dfcce41 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] No waiting events found dispatching network-vif-plugged-39c92a24-4461-4692-8f17-0b72bbaff52f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.257 187212 WARNING nova.compute.manager [req-58fe6739-2dce-49f5-abf4-23db37101b67 req-c07e992e-9e89-45e9-a994-c7340dfcce41 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Received unexpected event network-vif-plugged-39c92a24-4461-4692-8f17-0b72bbaff52f for instance with vm_state active and task_state None.
Dec 05 12:14:06 compute-0 podman[237704]: 2025-12-05 12:14:06.27321694 +0000 UTC m=+0.063086604 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:14:06 compute-0 podman[237703]: 2025-12-05 12:14:06.292997552 +0000 UTC m=+0.082519796 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.575 187212 INFO nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Creating config drive at /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.config
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.579 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv5joepb5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.709 187212 DEBUG oslo_concurrency.processutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv5joepb5" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:06 compute-0 kernel: tape30774db-d3: entered promiscuous mode
Dec 05 12:14:06 compute-0 NetworkManager[55691]: <info>  [1764936846.7857] manager: (tape30774db-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/381)
Dec 05 12:14:06 compute-0 ovn_controller[95610]: 2025-12-05T12:14:06Z|00987|binding|INFO|Claiming lport e30774db-d3d3-4438-b68a-6f7855f55128 for this chassis.
Dec 05 12:14:06 compute-0 ovn_controller[95610]: 2025-12-05T12:14:06Z|00988|binding|INFO|e30774db-d3d3-4438-b68a-6f7855f55128: Claiming fa:16:3e:50:8e:78 10.100.0.9
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.786 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.801 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8e:78 10.100.0.9'], port_security=['fa:16:3e:50:8e:78 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '7', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e30774db-d3d3-4438-b68a-6f7855f55128) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.802 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e30774db-d3d3-4438-b68a-6f7855f55128 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be bound to our chassis
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.803 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:06 compute-0 ovn_controller[95610]: 2025-12-05T12:14:06Z|00989|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 ovn-installed in OVS
Dec 05 12:14:06 compute-0 ovn_controller[95610]: 2025-12-05T12:14:06Z|00990|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 up in Southbound
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.805 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.805 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.822 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd19460-bf5d-48d8-b068-367e858c270d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.823 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82130d25-f1 in ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:14:06 compute-0 systemd-udevd[237759]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.826 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82130d25-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.827 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[68067698-bdd3-426a-b9d4-a88dd08916ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.829 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[03938ae5-cbb3-498e-9fca-629b7c5ab5df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:06 compute-0 systemd-machined[153543]: New machine qemu-112-instance-00000050.
Dec 05 12:14:06 compute-0 NetworkManager[55691]: <info>  [1764936846.8409] device (tape30774db-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:14:06 compute-0 NetworkManager[55691]: <info>  [1764936846.8421] device (tape30774db-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.840 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f53c9a-1cbf-40c9-b746-a4f2cfa13e94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:06 compute-0 systemd[1]: Started Virtual Machine qemu-112-instance-00000050.
Dec 05 12:14:06 compute-0 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec 05 12:14:06 compute-0 NetworkManager[55691]: <info>  [1764936846.8751] device (tap5316adeb-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.872 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[49039c66-5ba1-43d3-a89a-856856fc6d07]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.905 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2bf88935-92a0-4864-9410-c40d3eb6ce7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:06 compute-0 systemd-udevd[237763]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.946 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[75dbccda-4101-4f6f-a831-3d02a29eddb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:06 compute-0 NetworkManager[55691]: <info>  [1764936846.9485] manager: (tap82130d25-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/382)
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:06 compute-0 ovn_controller[95610]: 2025-12-05T12:14:06Z|00991|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec 05 12:14:06 compute-0 ovn_controller[95610]: 2025-12-05T12:14:06Z|00992|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec 05 12:14:06 compute-0 ovn_controller[95610]: 2025-12-05T12:14:06Z|00993|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.951 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.967 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:06 compute-0 nova_compute[187208]: 2025-12-05 12:14:06.975 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.987 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5fafa0e5-18e5-4499-8699-6c2ee8be8e79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:06.990 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3a144b-b1c7-4102-8356-d000d6bb35a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:07 compute-0 systemd[1]: machine-qemu\x2d110\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Dec 05 12:14:07 compute-0 systemd[1]: machine-qemu\x2d110\x2dinstance\x2d0000005c.scope: Consumed 13.682s CPU time.
Dec 05 12:14:07 compute-0 systemd-machined[153543]: Machine qemu-110-instance-0000005c terminated.
Dec 05 12:14:07 compute-0 NetworkManager[55691]: <info>  [1764936847.0140] device (tap82130d25-f0): carrier: link connected
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.019 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e3721263-973e-4cca-ae09-3609ac402f41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.036 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8a893e7f-b5ac-41ec-ab9a-69167bda5657]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82130d25-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:36:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 272], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422757, 'reachable_time': 37563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237799, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.059 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:8b:a2 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-7ef5fe5e-41c6-4a9e-a350-0883b0f491ae', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ef5fe5e-41c6-4a9e-a350-0883b0f491ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ff94c302f4541f9bdb0a79ab2b69a76', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=df7566ba-990c-4b19-8e24-00b407f0dad0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cfb4e190-f9b4-40e4-a648-9909f14d38e4) old=Port_Binding(mac=['fa:16:3e:3a:8b:a2 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7ef5fe5e-41c6-4a9e-a350-0883b0f491ae', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ef5fe5e-41c6-4a9e-a350-0883b0f491ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ff94c302f4541f9bdb0a79ab2b69a76', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.058 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[841b2096-0f03-4661-b0c4-8580eeea1518]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:36e4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422757, 'tstamp': 422757}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237800, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.076 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72df3a03-4501-4f45-b24e-5543aaa7606b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82130d25-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:36:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 272], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422757, 'reachable_time': 37563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237801, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.108 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac5832cd-409b-41e0-a0f7-8ffa4eed422d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:07 compute-0 NetworkManager[55691]: <info>  [1764936847.1635] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/383)
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.170 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.174 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[22307fb4-18f4-4963-8cde-985996c9866c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.176 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82130d25-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.176 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.176 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82130d25-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 NetworkManager[55691]: <info>  [1764936847.1794] manager: (tap82130d25-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/384)
Dec 05 12:14:07 compute-0 kernel: tap82130d25-f0: entered promiscuous mode
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.187 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.191 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82130d25-f0, col_values=(('external_ids', {'iface-id': 'f81c4a80-27d3-4231-a37a-7c231838aca7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.192 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 ovn_controller[95610]: 2025-12-05T12:14:07Z|00994|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.205 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.210 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.211 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82130d25-ff6c-480e-884d-f3d97b6fd9be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82130d25-ff6c-480e-884d-f3d97b6fd9be.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.212 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[389a94f1-ace7-4960-ad61-03260a93fd43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.213 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/82130d25-ff6c-480e-884d-f3d97b6fd9be.pid.haproxy
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.213 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'env', 'PROCESS_TAG=haproxy-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82130d25-ff6c-480e-884d-f3d97b6fd9be.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.307 187212 DEBUG nova.compute.manager [req-ab9af26f-8ac5-4fa7-9ecb-0cae72bc3205 req-7f78fef2-f655-4de9-8509-e3c351448bc2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.308 187212 DEBUG oslo_concurrency.lockutils [req-ab9af26f-8ac5-4fa7-9ecb-0cae72bc3205 req-7f78fef2-f655-4de9-8509-e3c351448bc2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.308 187212 DEBUG oslo_concurrency.lockutils [req-ab9af26f-8ac5-4fa7-9ecb-0cae72bc3205 req-7f78fef2-f655-4de9-8509-e3c351448bc2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.308 187212 DEBUG oslo_concurrency.lockutils [req-ab9af26f-8ac5-4fa7-9ecb-0cae72bc3205 req-7f78fef2-f655-4de9-8509-e3c351448bc2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.309 187212 DEBUG nova.compute.manager [req-ab9af26f-8ac5-4fa7-9ecb-0cae72bc3205 req-7f78fef2-f655-4de9-8509-e3c351448bc2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Processing event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.422 187212 DEBUG nova.network.neutron [-] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.433 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936847.432463, 28e48516-8665-4d98-a92d-c84b7da9a284 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.433 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Started (Lifecycle Event)
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.435 187212 DEBUG nova.compute.manager [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.439 187212 DEBUG nova.virt.libvirt.driver [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.446 187212 INFO nova.virt.libvirt.driver [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance spawned successfully.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.450 187212 INFO nova.compute.manager [-] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Took 1.90 seconds to deallocate network for instance.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.461 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.472 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.519 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.520 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936847.4331663, 28e48516-8665-4d98-a92d-c84b7da9a284 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.520 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Paused (Lifecycle Event)
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.521 187212 DEBUG oslo_concurrency.lockutils [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.522 187212 DEBUG oslo_concurrency.lockutils [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.522 187212 DEBUG oslo_concurrency.lockutils [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.522 187212 DEBUG oslo_concurrency.lockutils [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.522 187212 DEBUG oslo_concurrency.lockutils [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.524 187212 INFO nova.compute.manager [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Terminating instance
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.525 187212 DEBUG nova.compute.manager [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:14:07 compute-0 kernel: tap39c92a24-44 (unregistering): left promiscuous mode
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.544 187212 DEBUG oslo_concurrency.lockutils [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.545 187212 DEBUG oslo_concurrency.lockutils [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.547 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:07 compute-0 NetworkManager[55691]: <info>  [1764936847.5491] device (tap39c92a24-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:14:07 compute-0 ovn_controller[95610]: 2025-12-05T12:14:07Z|00995|binding|INFO|Releasing lport 39c92a24-4461-4692-8f17-0b72bbaff52f from this chassis (sb_readonly=0)
Dec 05 12:14:07 compute-0 ovn_controller[95610]: 2025-12-05T12:14:07Z|00996|binding|INFO|Setting lport 39c92a24-4461-4692-8f17-0b72bbaff52f down in Southbound
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.563 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 ovn_controller[95610]: 2025-12-05T12:14:07Z|00997|binding|INFO|Removing iface tap39c92a24-44 ovn-installed in OVS
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.566 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.569 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936847.439155, 28e48516-8665-4d98-a92d-c84b7da9a284 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.570 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Resumed (Lifecycle Event)
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.573 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:c4:84 10.100.0.9'], port_security=['fa:16:3e:10:c4:84 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '67b4beef-63ef-4afd-8a2b-35c28d4f1e0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-932c7aba-dc47-4543-928a-a0b2cdf62766', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'edb8f9390e454b10b9cc67dd88ba920b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '037bf34d-aae3-4661-a7d3-ff4e792f2db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c82e8883-5380-4e89-b64c-1fe3f5c34dee, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=39c92a24-4461-4692-8f17-0b72bbaff52f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.574 187212 INFO nova.virt.libvirt.driver [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance shutdown successfully after 3 seconds.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.583 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.583 187212 DEBUG nova.objects.instance [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.598 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.599 187212 DEBUG nova.compute.manager [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.602 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:07 compute-0 systemd[1]: machine-qemu\x2d111\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Dec 05 12:14:07 compute-0 systemd[1]: machine-qemu\x2d111\x2dinstance\x2d0000005f.scope: Consumed 4.423s CPU time.
Dec 05 12:14:07 compute-0 systemd-machined[153543]: Machine qemu-111-instance-0000005f terminated.
Dec 05 12:14:07 compute-0 podman[237857]: 2025-12-05 12:14:07.631056278 +0000 UTC m=+0.065238287 container create ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.632 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.653 187212 DEBUG oslo_concurrency.lockutils [None req-92f6dfac-0bf9-4e56-94a6-84ec962fc90f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.654 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.654 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (powering-off). Skip.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.655 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:07 compute-0 systemd[1]: Started libpod-conmon-ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a.scope.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.678 187212 DEBUG nova.compute.provider_tree [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:14:07 compute-0 podman[237857]: 2025-12-05 12:14:07.600933747 +0000 UTC m=+0.035115776 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:14:07 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.697 187212 DEBUG nova.scheduler.client.report [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82c4320461bf4b5fdde7d3f6f24c7e9f8bf85dc2b87ae4144fbfa739d759c5c5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:14:07 compute-0 podman[237857]: 2025-12-05 12:14:07.713599153 +0000 UTC m=+0.147781172 container init ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 12:14:07 compute-0 podman[237857]: 2025-12-05 12:14:07.718495055 +0000 UTC m=+0.152677064 container start ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.721 187212 DEBUG oslo_concurrency.lockutils [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:07 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[237876]: [NOTICE]   (237880) : New worker (237883) forked
Dec 05 12:14:07 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[237876]: [NOTICE]   (237880) : Loading success.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.755 187212 INFO nova.scheduler.client.report [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Deleted allocations for instance a9c0d69f-7894-4e3f-a056-4225da882a38
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.792 187212 INFO nova.virt.libvirt.driver [-] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Instance destroyed successfully.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.793 187212 DEBUG nova.objects.instance [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lazy-loading 'resources' on Instance uuid 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.796 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.798 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.799 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d03c84a-f7b1-4746-8e0a-63797e42e0d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:07.799 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace which is not needed anymore
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.813 187212 DEBUG nova.virt.libvirt.vif [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:13:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-426042225',display_name='tempest-ServerAddressesTestJSON-server-426042225',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-426042225',id=95,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:14:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='edb8f9390e454b10b9cc67dd88ba920b',ramdisk_id='',reservation_id='r-kbmw8qwa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-1789644686',owner_user_name='tempest-ServerAddressesTestJSON-1789644686-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:14:03Z,user_data=None,user_id='536970a3e3b745ee970e691d562540eb',uuid=67b4beef-63ef-4afd-8a2b-35c28d4f1e0b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.813 187212 DEBUG nova.network.os_vif_util [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Converting VIF {"id": "39c92a24-4461-4692-8f17-0b72bbaff52f", "address": "fa:16:3e:10:c4:84", "network": {"id": "932c7aba-dc47-4543-928a-a0b2cdf62766", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1572825884-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edb8f9390e454b10b9cc67dd88ba920b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c92a24-44", "ovs_interfaceid": "39c92a24-4461-4692-8f17-0b72bbaff52f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.814 187212 DEBUG nova.network.os_vif_util [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:c4:84,bridge_name='br-int',has_traffic_filtering=True,id=39c92a24-4461-4692-8f17-0b72bbaff52f,network=Network(932c7aba-dc47-4543-928a-a0b2cdf62766),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c92a24-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.815 187212 DEBUG os_vif [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:c4:84,bridge_name='br-int',has_traffic_filtering=True,id=39c92a24-4461-4692-8f17-0b72bbaff52f,network=Network(932c7aba-dc47-4543-928a-a0b2cdf62766),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c92a24-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.817 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.817 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39c92a24-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.821 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.824 187212 INFO os_vif [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:c4:84,bridge_name='br-int',has_traffic_filtering=True,id=39c92a24-4461-4692-8f17-0b72bbaff52f,network=Network(932c7aba-dc47-4543-928a-a0b2cdf62766),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c92a24-44')
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.825 187212 INFO nova.virt.libvirt.driver [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Deleting instance files /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b_del
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.825 187212 INFO nova.virt.libvirt.driver [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Deletion of /var/lib/nova/instances/67b4beef-63ef-4afd-8a2b-35c28d4f1e0b_del complete
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.848 187212 DEBUG oslo_concurrency.lockutils [None req-f86f6207-de87-409c-ad10-19105fdd9a38 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.850 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.715s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.850 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] During sync_power_state the instance has a pending task (deleting). Skip.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.850 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "a9c0d69f-7894-4e3f-a056-4225da882a38" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.883 187212 INFO nova.compute.manager [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.883 187212 DEBUG oslo.service.loopingcall [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.883 187212 DEBUG nova.compute.manager [-] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:14:07 compute-0 nova_compute[187208]: 2025-12-05 12:14:07.884 187212 DEBUG nova.network.neutron [-] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[237419]: [NOTICE]   (237423) : haproxy version is 2.8.14-c23fe91
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[237419]: [NOTICE]   (237423) : path to executable is /usr/sbin/haproxy
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[237419]: [WARNING]  (237423) : Exiting Master process...
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[237419]: [ALERT]    (237423) : Current worker (237425) exited with code 143 (Terminated)
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[237419]: [WARNING]  (237423) : All workers exited. Exiting... (0)
Dec 05 12:14:08 compute-0 systemd[1]: libpod-3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc.scope: Deactivated successfully.
Dec 05 12:14:08 compute-0 podman[237928]: 2025-12-05 12:14:08.046758886 +0000 UTC m=+0.171579322 container died 3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:14:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc-userdata-shm.mount: Deactivated successfully.
Dec 05 12:14:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-f31c5dbfdf4f785140ff1530ce746992dd132ed0b29b916c458115e2234ee385-merged.mount: Deactivated successfully.
Dec 05 12:14:08 compute-0 podman[237928]: 2025-12-05 12:14:08.392941405 +0000 UTC m=+0.517761851 container cleanup 3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:14:08 compute-0 systemd[1]: libpod-conmon-3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc.scope: Deactivated successfully.
Dec 05 12:14:08 compute-0 podman[237956]: 2025-12-05 12:14:08.528149564 +0000 UTC m=+0.107474108 container remove 3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.534 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d07c9bc6-9cff-48a7-ad07-98694bad26c1]: (4, ('Fri Dec  5 12:14:07 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc)\n3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc\nFri Dec  5 12:14:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc)\n3be351edd4e3e248bd89378743b983627fa03c864882abab6d845b32bf3b7fbc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.536 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[252027dc-4da0-4882-914a-81b24ed254aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.537 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.539 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:08 compute-0 kernel: tapf9ed41c2-b0: left promiscuous mode
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.557 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f1953d0b-9faa-42ee-a74e-3f4864f05faf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.576 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5413cc8f-b0b2-4be8-b5e4-e74273030fc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.580 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a52bdd9-b8ec-449f-ab31-28f2ed739842]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.600 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.600 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a60a0df4-cd54-4c6b-b0ba-1cb1fb5ac151]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421069, 'reachable_time': 39685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237976, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.603 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.603 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[250c3fbe-e4ad-4a57-a671-994889dae674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 systemd[1]: run-netns-ovnmeta\x2df9ed41c2\x2db085\x2d41ff\x2dac71\x2d6256a4e30e85.mount: Deactivated successfully.
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.603 104471 INFO neutron.agent.ovn.metadata.agent [-] Port cfb4e190-f9b4-40e4-a648-9909f14d38e4 in datapath 7ef5fe5e-41c6-4a9e-a350-0883b0f491ae unbound from our chassis
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.606 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7ef5fe5e-41c6-4a9e-a350-0883b0f491ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.607 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a534c6df-78b2-4bce-9be2-157901421bdf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.607 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 39c92a24-4461-4692-8f17-0b72bbaff52f in datapath 932c7aba-dc47-4543-928a-a0b2cdf62766 unbound from our chassis
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.609 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 932c7aba-dc47-4543-928a-a0b2cdf62766, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.610 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bb142a2b-f6d0-4be8-ac35-f93663c7f0a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.610 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766 namespace which is not needed anymore
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.617 187212 DEBUG nova.compute.manager [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.696 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.698 187212 DEBUG oslo_concurrency.lockutils [None req-eea5713f-7f33-4601-97d9-20451497338a e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 12.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.699 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 2.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.699 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.700 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766[237611]: [NOTICE]   (237615) : haproxy version is 2.8.14-c23fe91
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766[237611]: [NOTICE]   (237615) : path to executable is /usr/sbin/haproxy
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766[237611]: [WARNING]  (237615) : Exiting Master process...
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766[237611]: [WARNING]  (237615) : Exiting Master process...
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766[237611]: [ALERT]    (237615) : Current worker (237617) exited with code 143 (Terminated)
Dec 05 12:14:08 compute-0 neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766[237611]: [WARNING]  (237615) : All workers exited. Exiting... (0)
Dec 05 12:14:08 compute-0 systemd[1]: libpod-053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8.scope: Deactivated successfully.
Dec 05 12:14:08 compute-0 podman[237994]: 2025-12-05 12:14:08.831755352 +0000 UTC m=+0.112761561 container died 053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.862 187212 DEBUG nova.compute.manager [req-2a4f8dff-cf2a-4c9e-aea6-4a2ad4ff0dc7 req-7ab9a168-23e3-4aa4-9a9e-0a85eeef4f8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Received event network-vif-deleted-fcb6e165-bcf0-439d-849c-dc8819a32db9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8-userdata-shm.mount: Deactivated successfully.
Dec 05 12:14:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-cc8262d2f54083b31d291a2dbe31e430191bd54a85bf51d494d995c7b9c8bc74-merged.mount: Deactivated successfully.
Dec 05 12:14:08 compute-0 podman[237994]: 2025-12-05 12:14:08.892977782 +0000 UTC m=+0.173983981 container cleanup 053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 12:14:08 compute-0 systemd[1]: libpod-conmon-053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8.scope: Deactivated successfully.
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.929 187212 DEBUG nova.objects.instance [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.950 187212 DEBUG oslo_concurrency.lockutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.950 187212 DEBUG oslo_concurrency.lockutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.951 187212 DEBUG nova.network.neutron [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.951 187212 DEBUG nova.objects.instance [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'info_cache' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:08 compute-0 podman[238024]: 2025-12-05 12:14:08.97935982 +0000 UTC m=+0.063456896 container remove 053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.984 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[341fa5e9-1636-4f2e-8dd4-4a2e123ed90e]: (4, ('Fri Dec  5 12:14:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766 (053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8)\n053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8\nFri Dec  5 12:14:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766 (053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8)\n053f8b13821acd81c7b5c355a4a3275d42188e71c3f08cace4ecaf91dcd51aa8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.986 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7ecd3bcd-f03a-4d82-a136-a19ccf16cbc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:08.987 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap932c7aba-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:08 compute-0 nova_compute[187208]: 2025-12-05 12:14:08.988 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:08 compute-0 kernel: tap932c7aba-d0: left promiscuous mode
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.005 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:09.009 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa406d7f-545e-46c7-befe-1a85569593bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:09.022 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[00b9fca8-7514-44fa-bec8-48b5072939d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:09.025 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7efe6a25-0357-47af-bebd-009e6b3ce02f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:09.045 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[29ca77f3-d7ca-4098-a9a2-0fc261013322]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421917, 'reachable_time': 15004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238044, 'error': None, 'target': 'ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:09.047 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-932c7aba-dc47-4543-928a-a0b2cdf62766 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:14:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:09.047 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[11e258df-365e-415d-b81d-6567b4b5299b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d932c7aba\x2ddc47\x2d4543\x2d928a\x2da0b2cdf62766.mount: Deactivated successfully.
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.513 187212 DEBUG nova.network.neutron [-] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.536 187212 INFO nova.compute.manager [-] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Took 1.65 seconds to deallocate network for instance.
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.653 187212 DEBUG oslo_concurrency.lockutils [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.654 187212 DEBUG oslo_concurrency.lockutils [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.668 187212 DEBUG nova.compute.manager [req-2a34c3a4-32d5-4e83-8cb3-c7208f6c91f2 req-60b5d5d5-f2f2-4ee7-997c-f1cc3fe396c8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.668 187212 DEBUG oslo_concurrency.lockutils [req-2a34c3a4-32d5-4e83-8cb3-c7208f6c91f2 req-60b5d5d5-f2f2-4ee7-997c-f1cc3fe396c8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.668 187212 DEBUG oslo_concurrency.lockutils [req-2a34c3a4-32d5-4e83-8cb3-c7208f6c91f2 req-60b5d5d5-f2f2-4ee7-997c-f1cc3fe396c8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.669 187212 DEBUG oslo_concurrency.lockutils [req-2a34c3a4-32d5-4e83-8cb3-c7208f6c91f2 req-60b5d5d5-f2f2-4ee7-997c-f1cc3fe396c8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.669 187212 DEBUG nova.compute.manager [req-2a34c3a4-32d5-4e83-8cb3-c7208f6c91f2 req-60b5d5d5-f2f2-4ee7-997c-f1cc3fe396c8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.669 187212 WARNING nova.compute.manager [req-2a34c3a4-32d5-4e83-8cb3-c7208f6c91f2 req-60b5d5d5-f2f2-4ee7-997c-f1cc3fe396c8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state active and task_state None.
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.773 187212 DEBUG nova.compute.provider_tree [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.791 187212 DEBUG nova.scheduler.client.report [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.830 187212 DEBUG oslo_concurrency.lockutils [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.857 187212 INFO nova.scheduler.client.report [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Deleted allocations for instance 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b
Dec 05 12:14:09 compute-0 nova_compute[187208]: 2025-12-05 12:14:09.920 187212 DEBUG oslo_concurrency.lockutils [None req-cd8d15b4-2b52-4be2-9270-052065367950 536970a3e3b745ee970e691d562540eb edb8f9390e454b10b9cc67dd88ba920b - - default default] Lock "67b4beef-63ef-4afd-8a2b-35c28d4f1e0b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.317 187212 DEBUG nova.network.neutron [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.335 187212 DEBUG oslo_concurrency.lockutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.369 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.370 187212 DEBUG nova.objects.instance [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.382 187212 DEBUG nova.objects.instance [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.397 187212 DEBUG nova.virt.libvirt.vif [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:14:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.398 187212 DEBUG nova.network.os_vif_util [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.399 187212 DEBUG nova.network.os_vif_util [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.400 187212 DEBUG os_vif [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.401 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.401 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5316adeb-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.403 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.404 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.408 187212 INFO os_vif [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.414 187212 DEBUG nova.virt.libvirt.driver [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start _get_guest_xml network_info=[{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.418 187212 WARNING nova.virt.libvirt.driver [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.424 187212 DEBUG nova.virt.libvirt.host [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.425 187212 DEBUG nova.virt.libvirt.host [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.428 187212 DEBUG nova.virt.libvirt.host [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.429 187212 DEBUG nova.virt.libvirt.host [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.429 187212 DEBUG nova.virt.libvirt.driver [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.429 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.430 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.430 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.431 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.431 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.431 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.432 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.432 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.432 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.432 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.433 187212 DEBUG nova.virt.hardware [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.433 187212 DEBUG nova.objects.instance [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.450 187212 DEBUG nova.virt.libvirt.vif [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:14:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.451 187212 DEBUG nova.network.os_vif_util [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.452 187212 DEBUG nova.network.os_vif_util [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.453 187212 DEBUG nova.objects.instance [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.469 187212 DEBUG nova.virt.libvirt.driver [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <uuid>2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</uuid>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <name>instance-0000005c</name>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestJSON-server-954339420</nova:name>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:14:11</nova:creationTime>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:14:11 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:14:11 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:14:11 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:14:11 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:14:11 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:14:11 compute-0 nova_compute[187208]:         <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec 05 12:14:11 compute-0 nova_compute[187208]:         <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:14:11 compute-0 nova_compute[187208]:         <nova:port uuid="5316adeb-5a49-4a58-b997-f132a083ff13">
Dec 05 12:14:11 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <system>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <entry name="serial">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <entry name="uuid">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     </system>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <os>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   </os>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <features>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   </features>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:9a:d0:34"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <target dev="tap5316adeb-5a"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/console.log" append="off"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <video>
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     </video>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:14:11 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:14:11 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:14:11 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:14:11 compute-0 nova_compute[187208]: </domain>
Dec 05 12:14:11 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.475 187212 DEBUG oslo_concurrency.processutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.543 187212 DEBUG oslo_concurrency.processutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.545 187212 DEBUG oslo_concurrency.processutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.613 187212 DEBUG oslo_concurrency.processutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.615 187212 DEBUG nova.objects.instance [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.630 187212 DEBUG oslo_concurrency.processutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.699 187212 DEBUG oslo_concurrency.processutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.700 187212 DEBUG nova.virt.disk.api [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.701 187212 DEBUG oslo_concurrency.processutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.776 187212 DEBUG oslo_concurrency.processutils [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.777 187212 DEBUG nova.virt.disk.api [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.778 187212 DEBUG nova.objects.instance [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.799 187212 DEBUG nova.virt.libvirt.vif [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:14:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.799 187212 DEBUG nova.network.os_vif_util [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.801 187212 DEBUG nova.network.os_vif_util [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.801 187212 DEBUG os_vif [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.802 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.803 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.803 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.806 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5316adeb-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.807 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5316adeb-5a, col_values=(('external_ids', {'iface-id': '5316adeb-5a49-4a58-b997-f132a083ff13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:d0:34', 'vm-uuid': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 NetworkManager[55691]: <info>  [1764936851.8103] manager: (tap5316adeb-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.811 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.814 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.815 187212 INFO os_vif [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:14:11 compute-0 kernel: tap5316adeb-5a: entered promiscuous mode
Dec 05 12:14:11 compute-0 NetworkManager[55691]: <info>  [1764936851.8964] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/386)
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.897 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 ovn_controller[95610]: 2025-12-05T12:14:11Z|00998|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec 05 12:14:11 compute-0 ovn_controller[95610]: 2025-12-05T12:14:11Z|00999|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.910 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.912 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.916 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:14:11 compute-0 ovn_controller[95610]: 2025-12-05T12:14:11Z|01000|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec 05 12:14:11 compute-0 ovn_controller[95610]: 2025-12-05T12:14:11Z|01001|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec 05 12:14:11 compute-0 nova_compute[187208]: 2025-12-05 12:14:11.920 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.929 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a74621f4-2b4d-4528-9790-2a691f616d27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.930 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9ed41c2-b1 in ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.932 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9ed41c2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.933 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7002b03a-eb91-4d40-a875-e9bd207d4925]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.933 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f6918e2e-9b26-4223-950c-63c0c5a59967]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:11 compute-0 systemd-udevd[238076]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.944 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[53222100-afa0-45f4-9e6a-a97d31135922]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:11 compute-0 systemd-machined[153543]: New machine qemu-113-instance-0000005c.
Dec 05 12:14:11 compute-0 NetworkManager[55691]: <info>  [1764936851.9547] device (tap5316adeb-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:14:11 compute-0 NetworkManager[55691]: <info>  [1764936851.9558] device (tap5316adeb-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:14:11 compute-0 systemd[1]: Started Virtual Machine qemu-113-instance-0000005c.
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.969 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e9c8c97e-20b6-4bbe-b41d-5fff4290b937]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:11 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:11.998 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[caf66c3d-d744-4208-b83f-661830050c94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.005 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9f505ea0-43f6-4db3-a678-f34893b59bef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 NetworkManager[55691]: <info>  [1764936852.0069] manager: (tapf9ed41c2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/387)
Dec 05 12:14:12 compute-0 systemd-udevd[238079]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.056 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a297b6e4-57e1-4d81-a83f-46fa6fe88a12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.060 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f87b7c82-1648-46a2-8216-e606b1a0858c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.077 187212 DEBUG nova.compute.manager [req-bbe9352d-426e-4517-960c-cca5e0e4e0dd req-07c7360e-31a2-4d39-a064-fbe8e8dc8fad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Received event network-vif-deleted-39c92a24-4461-4692-8f17-0b72bbaff52f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:12 compute-0 NetworkManager[55691]: <info>  [1764936852.0845] device (tapf9ed41c2-b0): carrier: link connected
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.089 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5e2f73-0f2e-48be-8810-c6054e92bbd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.108 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3b72bdf1-e7a9-4d10-84a6-503b7e39e496]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 276], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423264, 'reachable_time': 17167, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238109, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.122 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b59af8dd-cadc-4097-854b-1a20e7fddfb7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:9111'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423264, 'tstamp': 423264}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238110, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.141 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c6f886-e486-4d71-8975-15257d606ea3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 276], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423264, 'reachable_time': 17167, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238111, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.173 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af6e9ed2-6ac0-47c5-a99e-99f346388536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.231 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[74f0b438-89c8-4690-82d9-445a7c101e85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.233 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.234 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.234 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.237 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:12 compute-0 NetworkManager[55691]: <info>  [1764936852.2377] manager: (tapf9ed41c2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/388)
Dec 05 12:14:12 compute-0 kernel: tapf9ed41c2-b0: entered promiscuous mode
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.241 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:12 compute-0 ovn_controller[95610]: 2025-12-05T12:14:12Z|01002|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.243 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.244 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.245 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd9cbbf-3c18-495f-a4f6-03aede002659]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.246 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:14:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:12.247 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'env', 'PROCESS_TAG=haproxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9ed41c2-b085-41ff-ac71-6256a4e30e85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.254 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.390 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.391 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936852.389854, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.392 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Resumed (Lifecycle Event)
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.394 187212 DEBUG nova.compute.manager [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.397 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance rebooted successfully.
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.398 187212 DEBUG nova.compute.manager [None req-44d07662-f3db-417f-a7a1-665d92a36a0d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.412 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.415 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.438 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (powering-on). Skip.
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.439 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936852.3910913, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.439 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Started (Lifecycle Event)
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.462 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:12 compute-0 nova_compute[187208]: 2025-12-05 12:14:12.466 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:12 compute-0 podman[238150]: 2025-12-05 12:14:12.617920607 +0000 UTC m=+0.056641649 container create dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 05 12:14:12 compute-0 systemd[1]: Started libpod-conmon-dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588.scope.
Dec 05 12:14:12 compute-0 podman[238150]: 2025-12-05 12:14:12.587874688 +0000 UTC m=+0.026595750 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:14:12 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:14:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4f61021eca0a148ad38618e27fbc51c8fc72b7b15ec883b20606746789f0aea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:14:12 compute-0 podman[238150]: 2025-12-05 12:14:12.716114076 +0000 UTC m=+0.154835138 container init dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:14:12 compute-0 podman[238150]: 2025-12-05 12:14:12.725567689 +0000 UTC m=+0.164288731 container start dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:14:12 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[238165]: [NOTICE]   (238169) : New worker (238171) forked
Dec 05 12:14:12 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[238165]: [NOTICE]   (238169) : Loading success.
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.603 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "6926ef53-01fc-476e-a6af-82edff2ead1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.603 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.624 187212 DEBUG nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.689 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.690 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.701 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.701 187212 INFO nova.compute.claims [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:13 compute-0 ovn_controller[95610]: 2025-12-05T12:14:13Z|01003|binding|INFO|Releasing lport c15f026e-161e-4d8d-81ec-2dd0eb1e85f6 from this chassis (sb_readonly=0)
Dec 05 12:14:13 compute-0 ovn_controller[95610]: 2025-12-05T12:14:13Z|01004|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:14:13 compute-0 ovn_controller[95610]: 2025-12-05T12:14:13Z|01005|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.893 187212 DEBUG nova.compute.provider_tree [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.897 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.920 187212 DEBUG nova.scheduler.client.report [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.946 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:13 compute-0 nova_compute[187208]: 2025-12-05 12:14:13.947 187212 DEBUG nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.003 187212 DEBUG nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.004 187212 DEBUG nova.network.neutron [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.024 187212 INFO nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.051 187212 DEBUG nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.143 187212 DEBUG nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.145 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.146 187212 INFO nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Creating image(s)
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.146 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "/var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.147 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "/var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.148 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "/var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.166 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.246 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.247 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.248 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.262 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.324 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.325 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.384 187212 DEBUG nova.policy [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '24c692cffbb647a3978075eca36d4254', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '911e6d79fd1248ed827eff507ac9b603', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.387 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936839.38496, a9c0d69f-7894-4e3f-a056-4225da882a38 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.388 187212 INFO nova.compute.manager [-] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] VM Stopped (Lifecycle Event)
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.411 187212 DEBUG nova.compute.manager [None req-50872df1-d44b-4670-ba72-1ea74d2dadea - - - - - -] [instance: a9c0d69f-7894-4e3f-a056-4225da882a38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.653 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk 1073741824" returned: 0 in 0.327s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.655 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.407s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.656 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.735 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.736 187212 DEBUG nova.virt.disk.api [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Checking if we can resize image /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.737 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.764 187212 DEBUG nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.765 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.766 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.767 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.767 187212 DEBUG nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.768 187212 WARNING nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.768 187212 DEBUG nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.768 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.769 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.769 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.770 187212 DEBUG nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.770 187212 WARNING nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.771 187212 DEBUG nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.771 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.772 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.772 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.772 187212 DEBUG nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.773 187212 WARNING nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.773 187212 DEBUG nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.774 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.774 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.774 187212 DEBUG oslo_concurrency.lockutils [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.775 187212 DEBUG nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.775 187212 WARNING nova.compute.manager [req-c49e45b3-2f73-42d7-8875-d6e42240e0bf req-2a8a27ef-ae4a-4ed7-8262-774ef97f7d8a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.798 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.799 187212 DEBUG nova.virt.disk.api [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Cannot resize image /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.800 187212 DEBUG nova.objects.instance [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lazy-loading 'migration_context' on Instance uuid 6926ef53-01fc-476e-a6af-82edff2ead1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.819 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.820 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Ensure instance console log exists: /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.820 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.821 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:14 compute-0 nova_compute[187208]: 2025-12-05 12:14:14.821 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:15 compute-0 nova_compute[187208]: 2025-12-05 12:14:15.000 187212 DEBUG nova.network.neutron [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Successfully created port: 7b34ea0e-cc76-42fc-bf46-afb764b183a3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:14:15 compute-0 podman[238196]: 2025-12-05 12:14:15.222294534 +0000 UTC m=+0.070938302 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:14:15 compute-0 podman[238197]: 2025-12-05 12:14:15.248503591 +0000 UTC m=+0.098195540 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:14:15 compute-0 podman[238241]: 2025-12-05 12:14:15.323943042 +0000 UTC m=+0.081638681 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:14:15 compute-0 nova_compute[187208]: 2025-12-05 12:14:15.794 187212 DEBUG nova.network.neutron [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Successfully updated port: 7b34ea0e-cc76-42fc-bf46-afb764b183a3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:14:15 compute-0 nova_compute[187208]: 2025-12-05 12:14:15.811 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "refresh_cache-6926ef53-01fc-476e-a6af-82edff2ead1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:14:15 compute-0 nova_compute[187208]: 2025-12-05 12:14:15.811 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquired lock "refresh_cache-6926ef53-01fc-476e-a6af-82edff2ead1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:14:15 compute-0 nova_compute[187208]: 2025-12-05 12:14:15.811 187212 DEBUG nova.network.neutron [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:14:16 compute-0 nova_compute[187208]: 2025-12-05 12:14:16.163 187212 DEBUG nova.network.neutron [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:14:16 compute-0 nova_compute[187208]: 2025-12-05 12:14:16.809 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.216 187212 DEBUG nova.compute.manager [req-d46f3650-92da-4eda-96ee-9f5ade7dd8d4 req-08c8cd71-0e9a-4029-9d13-9edc6254e19a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Received event network-changed-7b34ea0e-cc76-42fc-bf46-afb764b183a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.216 187212 DEBUG nova.compute.manager [req-d46f3650-92da-4eda-96ee-9f5ade7dd8d4 req-08c8cd71-0e9a-4029-9d13-9edc6254e19a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Refreshing instance network info cache due to event network-changed-7b34ea0e-cc76-42fc-bf46-afb764b183a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.217 187212 DEBUG oslo_concurrency.lockutils [req-d46f3650-92da-4eda-96ee-9f5ade7dd8d4 req-08c8cd71-0e9a-4029-9d13-9edc6254e19a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-6926ef53-01fc-476e-a6af-82edff2ead1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.228 187212 DEBUG oslo_concurrency.lockutils [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.228 187212 DEBUG oslo_concurrency.lockutils [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.229 187212 DEBUG oslo_concurrency.lockutils [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.229 187212 DEBUG oslo_concurrency.lockutils [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.230 187212 DEBUG oslo_concurrency.lockutils [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.231 187212 INFO nova.compute.manager [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Terminating instance
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.231 187212 DEBUG nova.compute.manager [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:14:17 compute-0 kernel: tap96dab709-f4 (unregistering): left promiscuous mode
Dec 05 12:14:17 compute-0 NetworkManager[55691]: <info>  [1764936857.2780] device (tap96dab709-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.279 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:17 compute-0 ovn_controller[95610]: 2025-12-05T12:14:17Z|01006|binding|INFO|Releasing lport 96dab709-f4e0-48a6-ab76-0b13fdf97017 from this chassis (sb_readonly=0)
Dec 05 12:14:17 compute-0 ovn_controller[95610]: 2025-12-05T12:14:17Z|01007|binding|INFO|Setting lport 96dab709-f4e0-48a6-ab76-0b13fdf97017 down in Southbound
Dec 05 12:14:17 compute-0 ovn_controller[95610]: 2025-12-05T12:14:17Z|01008|binding|INFO|Removing iface tap96dab709-f4 ovn-installed in OVS
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.288 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:a5:e5 10.100.0.9'], port_security=['fa:16:3e:82:a5:e5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '30cb83d4-3a34-4420-bc83-099b266da48c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c025e40-a124-4810-9d75-2a59e91db1b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb6153ad-93a7-415e-b3e6-b8e71463232b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a67b06f-79a0-439a-99ee-b21f00b866a6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=96dab709-f4e0-48a6-ab76-0b13fdf97017) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.301 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.304 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 96dab709-f4e0-48a6-ab76-0b13fdf97017 in datapath 0c025e40-a124-4810-9d75-2a59e91db1b3 unbound from our chassis
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.306 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0c025e40-a124-4810-9d75-2a59e91db1b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.308 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b61fe566-28b1-4c0d-afc5-df562066c761]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.308 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3 namespace which is not needed anymore
Dec 05 12:14:17 compute-0 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Dec 05 12:14:17 compute-0 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d0000004f.scope: Consumed 20.004s CPU time.
Dec 05 12:14:17 compute-0 systemd-machined[153543]: Machine qemu-89-instance-0000004f terminated.
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.514 187212 INFO nova.virt.libvirt.driver [-] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Instance destroyed successfully.
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.515 187212 DEBUG nova.objects.instance [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'resources' on Instance uuid 30cb83d4-3a34-4420-bc83-099b266da48c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:17 compute-0 neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3[232643]: [NOTICE]   (232647) : haproxy version is 2.8.14-c23fe91
Dec 05 12:14:17 compute-0 neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3[232643]: [NOTICE]   (232647) : path to executable is /usr/sbin/haproxy
Dec 05 12:14:17 compute-0 neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3[232643]: [WARNING]  (232647) : Exiting Master process...
Dec 05 12:14:17 compute-0 neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3[232643]: [ALERT]    (232647) : Current worker (232649) exited with code 143 (Terminated)
Dec 05 12:14:17 compute-0 neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3[232643]: [WARNING]  (232647) : All workers exited. Exiting... (0)
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.529 187212 DEBUG nova.virt.libvirt.vif [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:10:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-₡-1444488967',display_name='tempest-₡-1444488967',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest--1444488967',id=79,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c982a61e3fc4c8da9248076bb0361ac',ramdisk_id='',reservation_id='r-o6d8hryc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1492365581
',owner_user_name='tempest-ServersTestJSON-1492365581-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:47Z,user_data=None,user_id='62153b585ecc4e6fa2ad567851d49081',uuid=30cb83d4-3a34-4420-bc83-099b266da48c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:14:17 compute-0 systemd[1]: libpod-9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33.scope: Deactivated successfully.
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.530 187212 DEBUG nova.network.os_vif_util [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converting VIF {"id": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "address": "fa:16:3e:82:a5:e5", "network": {"id": "0c025e40-a124-4810-9d75-2a59e91db1b3", "bridge": "br-int", "label": "tempest-ServersTestJSON-754247120-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c982a61e3fc4c8da9248076bb0361ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96dab709-f4", "ovs_interfaceid": "96dab709-f4e0-48a6-ab76-0b13fdf97017", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.533 187212 DEBUG nova.network.os_vif_util [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:a5:e5,bridge_name='br-int',has_traffic_filtering=True,id=96dab709-f4e0-48a6-ab76-0b13fdf97017,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96dab709-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.534 187212 DEBUG os_vif [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:a5:e5,bridge_name='br-int',has_traffic_filtering=True,id=96dab709-f4e0-48a6-ab76-0b13fdf97017,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96dab709-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:17 compute-0 podman[238290]: 2025-12-05 12:14:17.539177679 +0000 UTC m=+0.107071196 container died 9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.541 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96dab709-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.548 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.552 187212 INFO os_vif [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:a5:e5,bridge_name='br-int',has_traffic_filtering=True,id=96dab709-f4e0-48a6-ab76-0b13fdf97017,network=Network(0c025e40-a124-4810-9d75-2a59e91db1b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96dab709-f4')
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.553 187212 INFO nova.virt.libvirt.driver [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Deleting instance files /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c_del
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.554 187212 INFO nova.virt.libvirt.driver [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Deletion of /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c_del complete
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.610 187212 INFO nova.compute.manager [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.610 187212 DEBUG oslo.service.loopingcall [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.611 187212 DEBUG nova.compute.manager [-] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.611 187212 DEBUG nova.network.neutron [-] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:14:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33-userdata-shm.mount: Deactivated successfully.
Dec 05 12:14:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-1fff199011b698c5a7285fa6b201980fd6fde3394f58b1348881bea980a3c7d7-merged.mount: Deactivated successfully.
Dec 05 12:14:17 compute-0 podman[238290]: 2025-12-05 12:14:17.713547451 +0000 UTC m=+0.281440968 container cleanup 9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:14:17 compute-0 systemd[1]: libpod-conmon-9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33.scope: Deactivated successfully.
Dec 05 12:14:17 compute-0 podman[238331]: 2025-12-05 12:14:17.800363891 +0000 UTC m=+0.060218952 container remove 9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.808 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a0bfa3-d844-4db4-80d9-1a64e6905c00]: (4, ('Fri Dec  5 12:14:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3 (9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33)\n9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33\nFri Dec  5 12:14:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3 (9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33)\n9c06506e836ae6ce5b010196648a9faba9e9fbb716c99a2a096517fcee642c33\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.813 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4e92b812-be96-4fa7-b981-2a9ad83354b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.816 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c025e40-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:17 compute-0 kernel: tap0c025e40-a0: left promiscuous mode
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.839 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[acf4e18e-29d0-4956-8a6d-398d944e9fb1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.858 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1e3999-4e90-48f0-991d-1f70b262bdb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.860 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[038cd524-feb5-4ab0-a003-3b2c01d5a53a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.879 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f0746ead-bf07-49c1-864e-7f2346c483fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402463, 'reachable_time': 21371, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238348, 'error': None, 'target': 'ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.882 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0c025e40-a124-4810-9d75-2a59e91db1b3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:14:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:17.882 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[4b5bc13a-a39d-42a1-b3f5-2988eac7c680]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d0c025e40\x2da124\x2d4810\x2d9d75\x2d2a59e91db1b3.mount: Deactivated successfully.
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.968 187212 DEBUG nova.network.neutron [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Updating instance_info_cache with network_info: [{"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.989 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Releasing lock "refresh_cache-6926ef53-01fc-476e-a6af-82edff2ead1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.990 187212 DEBUG nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Instance network_info: |[{"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.991 187212 DEBUG oslo_concurrency.lockutils [req-d46f3650-92da-4eda-96ee-9f5ade7dd8d4 req-08c8cd71-0e9a-4029-9d13-9edc6254e19a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-6926ef53-01fc-476e-a6af-82edff2ead1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.991 187212 DEBUG nova.network.neutron [req-d46f3650-92da-4eda-96ee-9f5ade7dd8d4 req-08c8cd71-0e9a-4029-9d13-9edc6254e19a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Refreshing network info cache for port 7b34ea0e-cc76-42fc-bf46-afb764b183a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:14:17 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.995 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Start _get_guest_xml network_info=[{"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:17.999 187212 WARNING nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.012 187212 DEBUG nova.virt.libvirt.host [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.013 187212 DEBUG nova.virt.libvirt.host [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.017 187212 DEBUG nova.virt.libvirt.host [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.018 187212 DEBUG nova.virt.libvirt.host [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.019 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.019 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.020 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.020 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.020 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.020 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.020 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.021 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.021 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.021 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.021 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.022 187212 DEBUG nova.virt.hardware [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.027 187212 DEBUG nova.virt.libvirt.vif [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1009711222',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1009711222',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1009711222',id=96,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='911e6d79fd1248ed827eff507ac9b603',ramdisk_id='',reservation_id='r-1n8kh764',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1664213
96',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-166421396-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:14:14Z,user_data=None,user_id='24c692cffbb647a3978075eca36d4254',uuid=6926ef53-01fc-476e-a6af-82edff2ead1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.028 187212 DEBUG nova.network.os_vif_util [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Converting VIF {"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.029 187212 DEBUG nova.network.os_vif_util [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:26:b9,bridge_name='br-int',has_traffic_filtering=True,id=7b34ea0e-cc76-42fc-bf46-afb764b183a3,network=Network(523b0e7b-0762-4f8e-862f-6106fd936b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b34ea0e-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.030 187212 DEBUG nova.objects.instance [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6926ef53-01fc-476e-a6af-82edff2ead1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.049 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <uuid>6926ef53-01fc-476e-a6af-82edff2ead1c</uuid>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <name>instance-00000060</name>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <nova:name>tempest-ServersNegativeTestMultiTenantJSON-server-1009711222</nova:name>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:14:18</nova:creationTime>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:14:18 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:14:18 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:14:18 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:14:18 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:14:18 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:14:18 compute-0 nova_compute[187208]:         <nova:user uuid="24c692cffbb647a3978075eca36d4254">tempest-ServersNegativeTestMultiTenantJSON-166421396-project-member</nova:user>
Dec 05 12:14:18 compute-0 nova_compute[187208]:         <nova:project uuid="911e6d79fd1248ed827eff507ac9b603">tempest-ServersNegativeTestMultiTenantJSON-166421396</nova:project>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:14:18 compute-0 nova_compute[187208]:         <nova:port uuid="7b34ea0e-cc76-42fc-bf46-afb764b183a3">
Dec 05 12:14:18 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <system>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <entry name="serial">6926ef53-01fc-476e-a6af-82edff2ead1c</entry>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <entry name="uuid">6926ef53-01fc-476e-a6af-82edff2ead1c</entry>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     </system>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <os>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   </os>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <features>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   </features>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk.config"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:ec:26:b9"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <target dev="tap7b34ea0e-cc"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/console.log" append="off"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <video>
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     </video>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:14:18 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:14:18 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:14:18 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:14:18 compute-0 nova_compute[187208]: </domain>
Dec 05 12:14:18 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.049 187212 DEBUG nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Preparing to wait for external event network-vif-plugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.049 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.050 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.050 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.051 187212 DEBUG nova.virt.libvirt.vif [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1009711222',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1009711222',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1009711222',id=96,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='911e6d79fd1248ed827eff507ac9b603',ramdisk_id='',reservation_id='r-1n8kh764',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJS
ON-166421396',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-166421396-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:14:14Z,user_data=None,user_id='24c692cffbb647a3978075eca36d4254',uuid=6926ef53-01fc-476e-a6af-82edff2ead1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.051 187212 DEBUG nova.network.os_vif_util [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Converting VIF {"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.051 187212 DEBUG nova.network.os_vif_util [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:26:b9,bridge_name='br-int',has_traffic_filtering=True,id=7b34ea0e-cc76-42fc-bf46-afb764b183a3,network=Network(523b0e7b-0762-4f8e-862f-6106fd936b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b34ea0e-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.052 187212 DEBUG os_vif [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:26:b9,bridge_name='br-int',has_traffic_filtering=True,id=7b34ea0e-cc76-42fc-bf46-afb764b183a3,network=Network(523b0e7b-0762-4f8e-862f-6106fd936b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b34ea0e-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.053 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.053 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.070 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.071 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b34ea0e-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.071 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b34ea0e-cc, col_values=(('external_ids', {'iface-id': '7b34ea0e-cc76-42fc-bf46-afb764b183a3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:26:b9', 'vm-uuid': '6926ef53-01fc-476e-a6af-82edff2ead1c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.073 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:18 compute-0 NetworkManager[55691]: <info>  [1764936858.0744] manager: (tap7b34ea0e-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/389)
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.076 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.079 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.080 187212 INFO os_vif [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:26:b9,bridge_name='br-int',has_traffic_filtering=True,id=7b34ea0e-cc76-42fc-bf46-afb764b183a3,network=Network(523b0e7b-0762-4f8e-862f-6106fd936b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b34ea0e-cc')
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.136 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.136 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.136 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] No VIF found with MAC fa:16:3e:ec:26:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.137 187212 INFO nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Using config drive
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.701 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.758 187212 DEBUG nova.compute.manager [req-9f751178-b9fb-49ab-943a-3e9f688f9e71 req-666d5d30-cad1-42fb-8afb-dac3cbdf086e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Received event network-vif-unplugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.758 187212 DEBUG oslo_concurrency.lockutils [req-9f751178-b9fb-49ab-943a-3e9f688f9e71 req-666d5d30-cad1-42fb-8afb-dac3cbdf086e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.759 187212 DEBUG oslo_concurrency.lockutils [req-9f751178-b9fb-49ab-943a-3e9f688f9e71 req-666d5d30-cad1-42fb-8afb-dac3cbdf086e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.759 187212 DEBUG oslo_concurrency.lockutils [req-9f751178-b9fb-49ab-943a-3e9f688f9e71 req-666d5d30-cad1-42fb-8afb-dac3cbdf086e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.759 187212 DEBUG nova.compute.manager [req-9f751178-b9fb-49ab-943a-3e9f688f9e71 req-666d5d30-cad1-42fb-8afb-dac3cbdf086e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] No waiting events found dispatching network-vif-unplugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:18 compute-0 nova_compute[187208]: 2025-12-05 12:14:18.759 187212 DEBUG nova.compute.manager [req-9f751178-b9fb-49ab-943a-3e9f688f9e71 req-666d5d30-cad1-42fb-8afb-dac3cbdf086e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Received event network-vif-unplugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.042 187212 INFO nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Creating config drive at /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk.config
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.049 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpokp155qt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.141 187212 DEBUG nova.network.neutron [-] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.168 187212 INFO nova.compute.manager [-] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Took 1.56 seconds to deallocate network for instance.
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.188 187212 DEBUG oslo_concurrency.processutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpokp155qt" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.226 187212 DEBUG oslo_concurrency.lockutils [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.227 187212 DEBUG oslo_concurrency.lockutils [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:19 compute-0 kernel: tap7b34ea0e-cc: entered promiscuous mode
Dec 05 12:14:19 compute-0 NetworkManager[55691]: <info>  [1764936859.2749] manager: (tap7b34ea0e-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/390)
Dec 05 12:14:19 compute-0 ovn_controller[95610]: 2025-12-05T12:14:19Z|01009|binding|INFO|Claiming lport 7b34ea0e-cc76-42fc-bf46-afb764b183a3 for this chassis.
Dec 05 12:14:19 compute-0 ovn_controller[95610]: 2025-12-05T12:14:19Z|01010|binding|INFO|7b34ea0e-cc76-42fc-bf46-afb764b183a3: Claiming fa:16:3e:ec:26:b9 10.100.0.5
Dec 05 12:14:19 compute-0 systemd-udevd[238271]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.284 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:26:b9 10.100.0.5'], port_security=['fa:16:3e:ec:26:b9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6926ef53-01fc-476e-a6af-82edff2ead1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-523b0e7b-0762-4f8e-862f-6106fd936b93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '911e6d79fd1248ed827eff507ac9b603', 'neutron:revision_number': '2', 'neutron:security_group_ids': '918631d7-a624-4f54-ad5e-2edce0704980', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f96be3e-e12b-443d-bfc0-ddb84ec79d39, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7b34ea0e-cc76-42fc-bf46-afb764b183a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.285 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7b34ea0e-cc76-42fc-bf46-afb764b183a3 in datapath 523b0e7b-0762-4f8e-862f-6106fd936b93 bound to our chassis
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.288 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 523b0e7b-0762-4f8e-862f-6106fd936b93
Dec 05 12:14:19 compute-0 ovn_controller[95610]: 2025-12-05T12:14:19Z|01011|binding|INFO|Setting lport 7b34ea0e-cc76-42fc-bf46-afb764b183a3 ovn-installed in OVS
Dec 05 12:14:19 compute-0 ovn_controller[95610]: 2025-12-05T12:14:19Z|01012|binding|INFO|Setting lport 7b34ea0e-cc76-42fc-bf46-afb764b183a3 up in Southbound
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.289 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:19 compute-0 NetworkManager[55691]: <info>  [1764936859.3018] device (tap7b34ea0e-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:14:19 compute-0 NetworkManager[55691]: <info>  [1764936859.3032] device (tap7b34ea0e-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.312 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee03cd9-d731-4edd-b945-ff358558b53e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.313 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap523b0e7b-01 in ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.315 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap523b0e7b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.315 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dce46a30-547a-4936-919e-8d27c1fbbdfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.317 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a9610909-a8e0-4419-9ec3-e9478ab601b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 systemd-machined[153543]: New machine qemu-114-instance-00000060.
Dec 05 12:14:19 compute-0 systemd[1]: Started Virtual Machine qemu-114-instance-00000060.
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.337 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[36c06038-7885-4d5f-a22e-c3c6629db6e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.358 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[67e3a477-21df-42ac-ad3a-995c6cab1156]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.399 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[778e1e68-c596-4e79-8af2-b1c0a3bc072f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 NetworkManager[55691]: <info>  [1764936859.4073] manager: (tap523b0e7b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/391)
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.409 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb49b34-781d-45fe-bd66-3b565d72e6fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.415 187212 DEBUG nova.compute.provider_tree [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.437 187212 DEBUG nova.scheduler.client.report [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.451 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a0bb415a-1e32-495a-ab3b-569bade8815f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.454 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d08d39e1-b7a5-4d85-890a-722265922df1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 NetworkManager[55691]: <info>  [1764936859.4837] device (tap523b0e7b-00): carrier: link connected
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.488 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9e21ca26-f583-4531-955b-1aceaac8ffa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.505 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b225f6-5e58-42d5-91e1-014b458776fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap523b0e7b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:c9:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 279], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424004, 'reachable_time': 34605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238404, 'error': None, 'target': 'ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.524 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a3325e-ac1b-48c4-94e9-c7bf9c8f5235]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec9:c9f9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424004, 'tstamp': 424004}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238405, 'error': None, 'target': 'ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.540 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9885b177-2659-4950-b6d3-2779c5119c3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap523b0e7b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:c9:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 279], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424004, 'reachable_time': 34605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238406, 'error': None, 'target': 'ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.578 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b74e6255-d563-4af7-ac59-5c0ab894e561]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.605 187212 DEBUG oslo_concurrency.lockutils [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.378s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.650 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2ac141-2861-4921-84c5-40f79bdb3a5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.651 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap523b0e7b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.651 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.652 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap523b0e7b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:19 compute-0 NetworkManager[55691]: <info>  [1764936859.6543] manager: (tap523b0e7b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/392)
Dec 05 12:14:19 compute-0 kernel: tap523b0e7b-00: entered promiscuous mode
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.658 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap523b0e7b-00, col_values=(('external_ids', {'iface-id': '7727e77e-1213-44ea-8a4c-fc20ad3df080'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:19 compute-0 ovn_controller[95610]: 2025-12-05T12:14:19Z|01013|binding|INFO|Releasing lport 7727e77e-1213-44ea-8a4c-fc20ad3df080 from this chassis (sb_readonly=0)
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.662 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/523b0e7b-0762-4f8e-862f-6106fd936b93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/523b0e7b-0762-4f8e-862f-6106fd936b93.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.663 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f86afe90-55e7-425d-a98f-32fda76fd86b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.664 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-523b0e7b-0762-4f8e-862f-6106fd936b93
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/523b0e7b-0762-4f8e-862f-6106fd936b93.pid.haproxy
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 523b0e7b-0762-4f8e-862f-6106fd936b93
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:14:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:19.664 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93', 'env', 'PROCESS_TAG=haproxy-523b0e7b-0762-4f8e-862f-6106fd936b93', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/523b0e7b-0762-4f8e-862f-6106fd936b93.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.674 187212 INFO nova.scheduler.client.report [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Deleted allocations for instance 30cb83d4-3a34-4420-bc83-099b266da48c
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.727 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936859.726822, 6926ef53-01fc-476e-a6af-82edff2ead1c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.727 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] VM Started (Lifecycle Event)
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.751 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.757 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936859.7291079, 6926ef53-01fc-476e-a6af-82edff2ead1c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.757 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] VM Paused (Lifecycle Event)
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.761 187212 DEBUG oslo_concurrency.lockutils [None req-3e28edfb-7819-4729-a50a-98994bf85e46 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.785 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.788 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:19 compute-0 nova_compute[187208]: 2025-12-05 12:14:19.808 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:14:20 compute-0 podman[238445]: 2025-12-05 12:14:20.01709114 +0000 UTC m=+0.023497300 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:14:20 compute-0 podman[238445]: 2025-12-05 12:14:20.11499132 +0000 UTC m=+0.121397460 container create 376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 12:14:20 compute-0 systemd[1]: Started libpod-conmon-376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7.scope.
Dec 05 12:14:20 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:14:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df66501b2d0845d8c3252f40c59edfed532cea70ab994ada636319c8fb762bf3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:14:20 compute-0 ovn_controller[95610]: 2025-12-05T12:14:20Z|00118|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:50:8e:78 10.100.0.9
Dec 05 12:14:20 compute-0 nova_compute[187208]: 2025-12-05 12:14:20.313 187212 DEBUG nova.compute.manager [req-4a8c051c-613b-4c6d-b59a-fd2fe3a6c6d4 req-32ba792a-223e-49a1-b658-428e46eda02e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Received event network-vif-deleted-96dab709-f4e0-48a6-ab76-0b13fdf97017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:20 compute-0 podman[238445]: 2025-12-05 12:14:20.458477871 +0000 UTC m=+0.464884021 container init 376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 12:14:20 compute-0 podman[238445]: 2025-12-05 12:14:20.464855856 +0000 UTC m=+0.471261996 container start 376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:14:20 compute-0 neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93[238460]: [NOTICE]   (238464) : New worker (238466) forked
Dec 05 12:14:20 compute-0 neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93[238460]: [NOTICE]   (238464) : Loading success.
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.275 187212 DEBUG nova.network.neutron [req-d46f3650-92da-4eda-96ee-9f5ade7dd8d4 req-08c8cd71-0e9a-4029-9d13-9edc6254e19a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Updated VIF entry in instance network info cache for port 7b34ea0e-cc76-42fc-bf46-afb764b183a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.276 187212 DEBUG nova.network.neutron [req-d46f3650-92da-4eda-96ee-9f5ade7dd8d4 req-08c8cd71-0e9a-4029-9d13-9edc6254e19a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Updating instance_info_cache with network_info: [{"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.292 187212 DEBUG oslo_concurrency.lockutils [req-d46f3650-92da-4eda-96ee-9f5ade7dd8d4 req-08c8cd71-0e9a-4029-9d13-9edc6254e19a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-6926ef53-01fc-476e-a6af-82edff2ead1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.550 187212 DEBUG nova.compute.manager [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Received event network-vif-plugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.551 187212 DEBUG oslo_concurrency.lockutils [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.551 187212 DEBUG oslo_concurrency.lockutils [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.552 187212 DEBUG oslo_concurrency.lockutils [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.552 187212 DEBUG nova.compute.manager [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] No waiting events found dispatching network-vif-plugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.552 187212 WARNING nova.compute.manager [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Received unexpected event network-vif-plugged-96dab709-f4e0-48a6-ab76-0b13fdf97017 for instance with vm_state deleted and task_state None.
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.553 187212 DEBUG nova.compute.manager [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Received event network-vif-plugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.553 187212 DEBUG oslo_concurrency.lockutils [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.554 187212 DEBUG oslo_concurrency.lockutils [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.554 187212 DEBUG oslo_concurrency.lockutils [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.555 187212 DEBUG nova.compute.manager [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Processing event network-vif-plugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.555 187212 DEBUG nova.compute.manager [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Received event network-vif-plugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.556 187212 DEBUG oslo_concurrency.lockutils [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.556 187212 DEBUG oslo_concurrency.lockutils [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.557 187212 DEBUG oslo_concurrency.lockutils [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.557 187212 DEBUG nova.compute.manager [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] No waiting events found dispatching network-vif-plugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.557 187212 WARNING nova.compute.manager [req-d7b7f5ee-bcce-4f89-ab54-55def02ec51f req-1d308f5f-5620-48aa-ae5c-dc43409c31de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Received unexpected event network-vif-plugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 for instance with vm_state building and task_state spawning.
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.559 187212 DEBUG nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.563 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936861.5635886, 6926ef53-01fc-476e-a6af-82edff2ead1c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.564 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] VM Resumed (Lifecycle Event)
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.583 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.584 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.588 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.591 187212 INFO nova.virt.libvirt.driver [-] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Instance spawned successfully.
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.592 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.614 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.618 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.619 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.619 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.619 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.620 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.620 187212 DEBUG nova.virt.libvirt.driver [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.701 187212 INFO nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Took 7.56 seconds to spawn the instance on the hypervisor.
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.701 187212 DEBUG nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.767 187212 INFO nova.compute.manager [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Took 8.10 seconds to build instance.
Dec 05 12:14:21 compute-0 nova_compute[187208]: 2025-12-05 12:14:21.791 187212 DEBUG oslo_concurrency.lockutils [None req-3ed5d757-a2b2-487b-aee4-6f8cc3dd4b54 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.009 187212 INFO nova.compute.manager [None req-6c7cebad-3292-4fc1-ba45-c97e850b33dd 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Pausing
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.010 187212 DEBUG nova.objects.instance [None req-6c7cebad-3292-4fc1-ba45-c97e850b33dd 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.048 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936862.0481753, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.049 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Paused (Lifecycle Event)
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.053 187212 DEBUG nova.compute.manager [None req-6c7cebad-3292-4fc1-ba45-c97e850b33dd 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.072 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.077 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.110 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (pausing). Skip.
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.250 187212 DEBUG nova.objects.instance [None req-bff0ddcf-35e7-40f1-93dd-51a378d02b59 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'pci_devices' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.279 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936862.2785325, 28e48516-8665-4d98-a92d-c84b7da9a284 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.279 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Paused (Lifecycle Event)
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.297 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.301 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.323 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] During sync_power_state the instance has a pending task (suspending). Skip.
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.792 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936847.7908332, 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.792 187212 INFO nova.compute.manager [-] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] VM Stopped (Lifecycle Event)
Dec 05 12:14:22 compute-0 nova_compute[187208]: 2025-12-05 12:14:22.813 187212 DEBUG nova.compute.manager [None req-b1c16422-c042-48c7-ba0e-9182c2c46012 - - - - - -] [instance: 67b4beef-63ef-4afd-8a2b-35c28d4f1e0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.075 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:23 compute-0 kernel: tape30774db-d3 (unregistering): left promiscuous mode
Dec 05 12:14:23 compute-0 NetworkManager[55691]: <info>  [1764936863.5577] device (tape30774db-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.569 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01014|binding|INFO|Releasing lport e30774db-d3d3-4438-b68a-6f7855f55128 from this chassis (sb_readonly=0)
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01015|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 down in Southbound
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01016|binding|INFO|Removing iface tape30774db-d3 ovn-installed in OVS
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.572 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.584 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:23.580 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8e:78 10.100.0.9'], port_security=['fa:16:3e:50:8e:78 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '9', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e30774db-d3d3-4438-b68a-6f7855f55128) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:23.581 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e30774db-d3d3-4438-b68a-6f7855f55128 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be unbound from our chassis
Dec 05 12:14:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:23.583 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82130d25-ff6c-480e-884d-f3d97b6fd9be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:14:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:23.584 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[82c1d4f5-34c4-407d-869c-702ddabef173]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:23.585 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be namespace which is not needed anymore
Dec 05 12:14:23 compute-0 systemd[1]: machine-qemu\x2d112\x2dinstance\x2d00000050.scope: Deactivated successfully.
Dec 05 12:14:23 compute-0 systemd[1]: machine-qemu\x2d112\x2dinstance\x2d00000050.scope: Consumed 13.203s CPU time.
Dec 05 12:14:23 compute-0 systemd-machined[153543]: Machine qemu-112-instance-00000050 terminated.
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.741 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:23 compute-0 kernel: tape30774db-d3: entered promiscuous mode
Dec 05 12:14:23 compute-0 NetworkManager[55691]: <info>  [1764936863.7627] manager: (tape30774db-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/393)
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.762 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01017|binding|INFO|Claiming lport e30774db-d3d3-4438-b68a-6f7855f55128 for this chassis.
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01018|binding|INFO|e30774db-d3d3-4438-b68a-6f7855f55128: Claiming fa:16:3e:50:8e:78 10.100.0.9
Dec 05 12:14:23 compute-0 systemd-udevd[238484]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:14:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:23.771 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8e:78 10.100.0.9'], port_security=['fa:16:3e:50:8e:78 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '9', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e30774db-d3d3-4438-b68a-6f7855f55128) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:23 compute-0 kernel: tape30774db-d3 (unregistering): left promiscuous mode
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01019|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 ovn-installed in OVS
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01020|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 up in Southbound
Dec 05 12:14:23 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[237876]: [NOTICE]   (237880) : haproxy version is 2.8.14-c23fe91
Dec 05 12:14:23 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[237876]: [NOTICE]   (237880) : path to executable is /usr/sbin/haproxy
Dec 05 12:14:23 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[237876]: [WARNING]  (237880) : Exiting Master process...
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:23 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[237876]: [ALERT]    (237880) : Current worker (237883) exited with code 143 (Terminated)
Dec 05 12:14:23 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[237876]: [WARNING]  (237880) : All workers exited. Exiting... (0)
Dec 05 12:14:23 compute-0 systemd[1]: libpod-ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a.scope: Deactivated successfully.
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.788 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01021|binding|INFO|Releasing lport e30774db-d3d3-4438-b68a-6f7855f55128 from this chassis (sb_readonly=0)
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01022|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 down in Southbound
Dec 05 12:14:23 compute-0 ovn_controller[95610]: 2025-12-05T12:14:23Z|01023|binding|INFO|Removing iface tape30774db-d3 ovn-installed in OVS
Dec 05 12:14:23 compute-0 podman[238503]: 2025-12-05 12:14:23.795793669 +0000 UTC m=+0.110489046 container died ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:14:23 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:23.802 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8e:78 10.100.0.9'], port_security=['fa:16:3e:50:8e:78 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '9', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e30774db-d3d3-4438-b68a-6f7855f55128) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:23 compute-0 nova_compute[187208]: 2025-12-05 12:14:23.840 187212 DEBUG nova.compute.manager [None req-bff0ddcf-35e7-40f1-93dd-51a378d02b59 e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a-userdata-shm.mount: Deactivated successfully.
Dec 05 12:14:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-82c4320461bf4b5fdde7d3f6f24c7e9f8bf85dc2b87ae4144fbfa739d759c5c5-merged.mount: Deactivated successfully.
Dec 05 12:14:23 compute-0 podman[238503]: 2025-12-05 12:14:23.906881061 +0000 UTC m=+0.221576418 container cleanup ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 12:14:23 compute-0 systemd[1]: libpod-conmon-ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a.scope: Deactivated successfully.
Dec 05 12:14:24 compute-0 podman[238539]: 2025-12-05 12:14:24.10849333 +0000 UTC m=+0.171989414 container remove ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.113 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f4750530-a6db-4f3f-8287-50196fe82d12]: (4, ('Fri Dec  5 12:14:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be (ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a)\nad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a\nFri Dec  5 12:14:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be (ad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a)\nad10e40fb707b3fb22c2c2827f05a06196ebd742c8fc6d27e3337ea69c4b632a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.115 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c37c0df8-5680-4bae-a576-1fcd96993840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.116 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82130d25-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:24 compute-0 kernel: tap82130d25-f0: left promiscuous mode
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.136 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4aef0a19-173d-44ef-bb00-a24d7307369a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.151 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf85b827-992f-4df7-9639-261afda24e50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.154 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db8651e3-51bf-4316-80a4-1959fb81acf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.171 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[15980950-fa2f-409a-956d-c824d6663544]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422746, 'reachable_time': 25541, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238557, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.174 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.174 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab57dc2-50ba-4b4b-bbef-18748226a500]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.174 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e30774db-d3d3-4438-b68a-6f7855f55128 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be unbound from our chassis
Dec 05 12:14:24 compute-0 systemd[1]: run-netns-ovnmeta\x2d82130d25\x2dff6c\x2d480e\x2d884d\x2df3d97b6fd9be.mount: Deactivated successfully.
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.176 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82130d25-ff6c-480e-884d-f3d97b6fd9be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.177 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a744cbd1-47bf-4b5e-b23b-032d1f035699]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.178 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e30774db-d3d3-4438-b68a-6f7855f55128 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be unbound from our chassis
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.179 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82130d25-ff6c-480e-884d-f3d97b6fd9be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:14:24 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:24.179 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea55668-9f64-49ad-920e-cb5b09ac6a67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:24 compute-0 ovn_controller[95610]: 2025-12-05T12:14:24Z|01024|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:14:24 compute-0 ovn_controller[95610]: 2025-12-05T12:14:24Z|01025|binding|INFO|Releasing lport 7727e77e-1213-44ea-8a4c-fc20ad3df080 from this chassis (sb_readonly=0)
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.556 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.683 187212 DEBUG nova.compute.manager [req-b5143ab8-433c-4f0b-8144-a64e8d3d59bd req-7bb0f2d4-a37d-4e12-af4c-8ae34acea527 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.684 187212 DEBUG oslo_concurrency.lockutils [req-b5143ab8-433c-4f0b-8144-a64e8d3d59bd req-7bb0f2d4-a37d-4e12-af4c-8ae34acea527 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.684 187212 DEBUG oslo_concurrency.lockutils [req-b5143ab8-433c-4f0b-8144-a64e8d3d59bd req-7bb0f2d4-a37d-4e12-af4c-8ae34acea527 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.684 187212 DEBUG oslo_concurrency.lockutils [req-b5143ab8-433c-4f0b-8144-a64e8d3d59bd req-7bb0f2d4-a37d-4e12-af4c-8ae34acea527 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.684 187212 DEBUG nova.compute.manager [req-b5143ab8-433c-4f0b-8144-a64e8d3d59bd req-7bb0f2d4-a37d-4e12-af4c-8ae34acea527 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.685 187212 WARNING nova.compute.manager [req-b5143ab8-433c-4f0b-8144-a64e8d3d59bd req-7bb0f2d4-a37d-4e12-af4c-8ae34acea527 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state suspended and task_state None.
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.897 187212 INFO nova.compute.manager [None req-397b4978-ad10-4b1c-aa02-cf380eb33d05 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Unpausing
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.898 187212 DEBUG nova.objects.instance [None req-397b4978-ad10-4b1c-aa02-cf380eb33d05 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.925 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936864.924147, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.925 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Resumed (Lifecycle Event)
Dec 05 12:14:24 compute-0 virtqemud[186841]: argument unsupported: QEMU guest agent is not configured
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.932 187212 DEBUG nova.virt.libvirt.guest [None req-397b4978-ad10-4b1c-aa02-cf380eb33d05 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.932 187212 DEBUG nova.compute.manager [None req-397b4978-ad10-4b1c-aa02-cf380eb33d05 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.960 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.964 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:24 compute-0 nova_compute[187208]: 2025-12-05 12:14:24.995 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (unpausing). Skip.
Dec 05 12:14:26 compute-0 podman[238561]: 2025-12-05 12:14:26.21217766 +0000 UTC m=+0.057551075 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.198 187212 DEBUG oslo_concurrency.lockutils [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "6926ef53-01fc-476e-a6af-82edff2ead1c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.199 187212 DEBUG oslo_concurrency.lockutils [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.199 187212 DEBUG oslo_concurrency.lockutils [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.199 187212 DEBUG oslo_concurrency.lockutils [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.199 187212 DEBUG oslo_concurrency.lockutils [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.200 187212 INFO nova.compute.manager [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Terminating instance
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.201 187212 DEBUG nova.compute.manager [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:14:27 compute-0 kernel: tap7b34ea0e-cc (unregistering): left promiscuous mode
Dec 05 12:14:27 compute-0 NetworkManager[55691]: <info>  [1764936867.2212] device (tap7b34ea0e-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.228 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:27 compute-0 ovn_controller[95610]: 2025-12-05T12:14:27Z|01026|binding|INFO|Releasing lport 7b34ea0e-cc76-42fc-bf46-afb764b183a3 from this chassis (sb_readonly=0)
Dec 05 12:14:27 compute-0 ovn_controller[95610]: 2025-12-05T12:14:27Z|01027|binding|INFO|Setting lport 7b34ea0e-cc76-42fc-bf46-afb764b183a3 down in Southbound
Dec 05 12:14:27 compute-0 ovn_controller[95610]: 2025-12-05T12:14:27Z|01028|binding|INFO|Removing iface tap7b34ea0e-cc ovn-installed in OVS
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.241 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:26:b9 10.100.0.5'], port_security=['fa:16:3e:ec:26:b9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6926ef53-01fc-476e-a6af-82edff2ead1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-523b0e7b-0762-4f8e-862f-6106fd936b93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '911e6d79fd1248ed827eff507ac9b603', 'neutron:revision_number': '4', 'neutron:security_group_ids': '918631d7-a624-4f54-ad5e-2edce0704980', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f96be3e-e12b-443d-bfc0-ddb84ec79d39, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7b34ea0e-cc76-42fc-bf46-afb764b183a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.243 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.244 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7b34ea0e-cc76-42fc-bf46-afb764b183a3 in datapath 523b0e7b-0762-4f8e-862f-6106fd936b93 unbound from our chassis
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.246 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 523b0e7b-0762-4f8e-862f-6106fd936b93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.247 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7095cfe9-d7fd-48dd-940d-b3ba0a77d1f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.248 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93 namespace which is not needed anymore
Dec 05 12:14:27 compute-0 systemd[1]: machine-qemu\x2d114\x2dinstance\x2d00000060.scope: Deactivated successfully.
Dec 05 12:14:27 compute-0 systemd[1]: machine-qemu\x2d114\x2dinstance\x2d00000060.scope: Consumed 6.012s CPU time.
Dec 05 12:14:27 compute-0 systemd-machined[153543]: Machine qemu-114-instance-00000060 terminated.
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.298 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.299 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.300 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.300 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.300 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.301 187212 WARNING nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state suspended and task_state None.
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.301 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.301 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.301 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.301 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.302 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.302 187212 WARNING nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state suspended and task_state None.
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.302 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.302 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.303 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.303 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.303 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.303 187212 WARNING nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state suspended and task_state None.
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.303 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.304 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.304 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.304 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.304 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.305 187212 WARNING nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state suspended and task_state None.
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.305 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.305 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.305 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.305 187212 DEBUG oslo_concurrency.lockutils [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.306 187212 DEBUG nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.306 187212 WARNING nova.compute.manager [req-123d38b3-3095-4ed5-b029-bab7e28718c8 req-ad672771-ee7f-43f0-9c43-1b319104323e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state suspended and task_state None.
Dec 05 12:14:27 compute-0 neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93[238460]: [NOTICE]   (238464) : haproxy version is 2.8.14-c23fe91
Dec 05 12:14:27 compute-0 neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93[238460]: [NOTICE]   (238464) : path to executable is /usr/sbin/haproxy
Dec 05 12:14:27 compute-0 neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93[238460]: [WARNING]  (238464) : Exiting Master process...
Dec 05 12:14:27 compute-0 neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93[238460]: [ALERT]    (238464) : Current worker (238466) exited with code 143 (Terminated)
Dec 05 12:14:27 compute-0 neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93[238460]: [WARNING]  (238464) : All workers exited. Exiting... (0)
Dec 05 12:14:27 compute-0 systemd[1]: libpod-376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7.scope: Deactivated successfully.
Dec 05 12:14:27 compute-0 podman[238613]: 2025-12-05 12:14:27.381703534 +0000 UTC m=+0.048143813 container died 376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.429 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.438 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-df66501b2d0845d8c3252f40c59edfed532cea70ab994ada636319c8fb762bf3-merged.mount: Deactivated successfully.
Dec 05 12:14:27 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7-userdata-shm.mount: Deactivated successfully.
Dec 05 12:14:27 compute-0 podman[238613]: 2025-12-05 12:14:27.454582791 +0000 UTC m=+0.121023040 container cleanup 376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:14:27 compute-0 systemd[1]: libpod-conmon-376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7.scope: Deactivated successfully.
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.472 187212 INFO nova.virt.libvirt.driver [-] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Instance destroyed successfully.
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.473 187212 DEBUG nova.objects.instance [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lazy-loading 'resources' on Instance uuid 6926ef53-01fc-476e-a6af-82edff2ead1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.489 187212 DEBUG nova.virt.libvirt.vif [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1009711222',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1009711222',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1009711222',id=96,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:14:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='911e6d79fd1248ed827eff507ac9b603',ramdisk_id='',reservation_id='r-1n8kh764',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio
',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-166421396',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-166421396-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:14:21Z,user_data=None,user_id='24c692cffbb647a3978075eca36d4254',uuid=6926ef53-01fc-476e-a6af-82edff2ead1c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.490 187212 DEBUG nova.network.os_vif_util [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Converting VIF {"id": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "address": "fa:16:3e:ec:26:b9", "network": {"id": "523b0e7b-0762-4f8e-862f-6106fd936b93", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-596016788-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "911e6d79fd1248ed827eff507ac9b603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b34ea0e-cc", "ovs_interfaceid": "7b34ea0e-cc76-42fc-bf46-afb764b183a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.490 187212 DEBUG nova.network.os_vif_util [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:26:b9,bridge_name='br-int',has_traffic_filtering=True,id=7b34ea0e-cc76-42fc-bf46-afb764b183a3,network=Network(523b0e7b-0762-4f8e-862f-6106fd936b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b34ea0e-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.491 187212 DEBUG os_vif [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:26:b9,bridge_name='br-int',has_traffic_filtering=True,id=7b34ea0e-cc76-42fc-bf46-afb764b183a3,network=Network(523b0e7b-0762-4f8e-862f-6106fd936b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b34ea0e-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.492 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.493 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b34ea0e-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.500 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.503 187212 INFO os_vif [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:26:b9,bridge_name='br-int',has_traffic_filtering=True,id=7b34ea0e-cc76-42fc-bf46-afb764b183a3,network=Network(523b0e7b-0762-4f8e-862f-6106fd936b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b34ea0e-cc')
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.504 187212 INFO nova.virt.libvirt.driver [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Deleting instance files /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c_del
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.505 187212 INFO nova.virt.libvirt.driver [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Deletion of /var/lib/nova/instances/6926ef53-01fc-476e-a6af-82edff2ead1c_del complete
Dec 05 12:14:27 compute-0 podman[238658]: 2025-12-05 12:14:27.542170643 +0000 UTC m=+0.066073111 container remove 376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.550 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[652a0f97-d68a-4b39-9644-40efd61046fc]: (4, ('Fri Dec  5 12:14:27 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93 (376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7)\n376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7\nFri Dec  5 12:14:27 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93 (376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7)\n376bd6e48e91024d64afcc0f2895c4a3950a652820a9c8f640c25755c8162fc7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.552 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3ec4dc-f5f3-4963-a4df-573325dd541b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.553 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap523b0e7b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:27 compute-0 kernel: tap523b0e7b-00: left promiscuous mode
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.555 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.577 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[31a0499c-6464-4fe4-b944-60d32681056f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.580 187212 INFO nova.compute.manager [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.582 187212 DEBUG oslo.service.loopingcall [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.582 187212 DEBUG nova.compute.manager [-] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.583 187212 DEBUG nova.network.neutron [-] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.598 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[abdabf6c-83f4-4fad-9632-c7112f42bc31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.600 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8cd57d21-1108-444c-a6b5-455c30d07984]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.620 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f4116f86-c067-42f1-97a7-0095e4caf89a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423995, 'reachable_time': 41065, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238680, 'error': None, 'target': 'ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.622 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-523b0e7b-0762-4f8e-862f-6106fd936b93 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:14:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:27.623 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[085c5530-a813-4478-8d71-7417d6b167b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:27 compute-0 systemd[1]: run-netns-ovnmeta\x2d523b0e7b\x2d0762\x2d4f8e\x2d862f\x2d6106fd936b93.mount: Deactivated successfully.
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.730 187212 DEBUG nova.compute.manager [req-66d4a8a8-282d-4080-a28b-6d6b2ccb028f req-1b6c2a5a-bbf9-4037-858a-2964b66e7bec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Received event network-vif-unplugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.731 187212 DEBUG oslo_concurrency.lockutils [req-66d4a8a8-282d-4080-a28b-6d6b2ccb028f req-1b6c2a5a-bbf9-4037-858a-2964b66e7bec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.731 187212 DEBUG oslo_concurrency.lockutils [req-66d4a8a8-282d-4080-a28b-6d6b2ccb028f req-1b6c2a5a-bbf9-4037-858a-2964b66e7bec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.731 187212 DEBUG oslo_concurrency.lockutils [req-66d4a8a8-282d-4080-a28b-6d6b2ccb028f req-1b6c2a5a-bbf9-4037-858a-2964b66e7bec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.731 187212 DEBUG nova.compute.manager [req-66d4a8a8-282d-4080-a28b-6d6b2ccb028f req-1b6c2a5a-bbf9-4037-858a-2964b66e7bec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] No waiting events found dispatching network-vif-unplugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:27 compute-0 nova_compute[187208]: 2025-12-05 12:14:27.732 187212 DEBUG nova.compute.manager [req-66d4a8a8-282d-4080-a28b-6d6b2ccb028f req-1b6c2a5a-bbf9-4037-858a-2964b66e7bec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Received event network-vif-unplugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.062 187212 INFO nova.compute.manager [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Resuming
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.063 187212 DEBUG nova.objects.instance [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'flavor' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.098 187212 DEBUG oslo_concurrency.lockutils [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.099 187212 DEBUG oslo_concurrency.lockutils [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquired lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.099 187212 DEBUG nova.network.neutron [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:14:28 compute-0 ovn_controller[95610]: 2025-12-05T12:14:28Z|00119|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.724 187212 DEBUG nova.network.neutron [-] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.744 187212 INFO nova.compute.manager [-] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Took 1.16 seconds to deallocate network for instance.
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.806 187212 DEBUG oslo_concurrency.lockutils [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.807 187212 DEBUG oslo_concurrency.lockutils [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.910 187212 DEBUG nova.compute.provider_tree [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.927 187212 DEBUG nova.scheduler.client.report [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.950 187212 DEBUG oslo_concurrency.lockutils [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:28 compute-0 nova_compute[187208]: 2025-12-05 12:14:28.973 187212 INFO nova.scheduler.client.report [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Deleted allocations for instance 6926ef53-01fc-476e-a6af-82edff2ead1c
Dec 05 12:14:29 compute-0 nova_compute[187208]: 2025-12-05 12:14:29.043 187212 DEBUG oslo_concurrency.lockutils [None req-f8ead4f8-741d-4386-a47d-cce288e57c4d 24c692cffbb647a3978075eca36d4254 911e6d79fd1248ed827eff507ac9b603 - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:30 compute-0 nova_compute[187208]: 2025-12-05 12:14:30.299 187212 DEBUG nova.compute.manager [req-cb096023-58c1-4500-af6c-c3a08bbf1ba1 req-19e18157-a5e1-4fa8-b716-c9d70949e1de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Received event network-vif-plugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:30 compute-0 nova_compute[187208]: 2025-12-05 12:14:30.300 187212 DEBUG oslo_concurrency.lockutils [req-cb096023-58c1-4500-af6c-c3a08bbf1ba1 req-19e18157-a5e1-4fa8-b716-c9d70949e1de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:30 compute-0 nova_compute[187208]: 2025-12-05 12:14:30.300 187212 DEBUG oslo_concurrency.lockutils [req-cb096023-58c1-4500-af6c-c3a08bbf1ba1 req-19e18157-a5e1-4fa8-b716-c9d70949e1de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:30 compute-0 nova_compute[187208]: 2025-12-05 12:14:30.300 187212 DEBUG oslo_concurrency.lockutils [req-cb096023-58c1-4500-af6c-c3a08bbf1ba1 req-19e18157-a5e1-4fa8-b716-c9d70949e1de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "6926ef53-01fc-476e-a6af-82edff2ead1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:30 compute-0 nova_compute[187208]: 2025-12-05 12:14:30.300 187212 DEBUG nova.compute.manager [req-cb096023-58c1-4500-af6c-c3a08bbf1ba1 req-19e18157-a5e1-4fa8-b716-c9d70949e1de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] No waiting events found dispatching network-vif-plugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:30 compute-0 nova_compute[187208]: 2025-12-05 12:14:30.300 187212 WARNING nova.compute.manager [req-cb096023-58c1-4500-af6c-c3a08bbf1ba1 req-19e18157-a5e1-4fa8-b716-c9d70949e1de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Received unexpected event network-vif-plugged-7b34ea0e-cc76-42fc-bf46-afb764b183a3 for instance with vm_state deleted and task_state None.
Dec 05 12:14:30 compute-0 nova_compute[187208]: 2025-12-05 12:14:30.301 187212 DEBUG nova.compute.manager [req-cb096023-58c1-4500-af6c-c3a08bbf1ba1 req-19e18157-a5e1-4fa8-b716-c9d70949e1de 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Received event network-vif-deleted-7b34ea0e-cc76-42fc-bf46-afb764b183a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:32 compute-0 podman[238681]: 2025-12-05 12:14:32.219504992 +0000 UTC m=+0.071955471 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.512 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936857.5089269, 30cb83d4-3a34-4420-bc83-099b266da48c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.512 187212 INFO nova.compute.manager [-] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] VM Stopped (Lifecycle Event)
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.530 187212 DEBUG nova.compute.manager [None req-a9599498-05c2-47d2-a6e5-22910ce3795c - - - - - -] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.910 187212 DEBUG nova.network.neutron [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [{"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.931 187212 DEBUG oslo_concurrency.lockutils [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Releasing lock "refresh_cache-28e48516-8665-4d98-a92d-c84b7da9a284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.937 187212 DEBUG nova.virt.libvirt.vif [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-826937421',display_name='tempest-ServersNegativeTestJSON-server-826937421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-826937421',id=80,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:14:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-snx0qylv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-ServersNegativeTestJSON-1063007033-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:14:23Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=28e48516-8665-4d98-a92d-c84b7da9a284,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.938 187212 DEBUG nova.network.os_vif_util [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.939 187212 DEBUG nova.network.os_vif_util [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.940 187212 DEBUG os_vif [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.940 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.941 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.942 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.944 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.945 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape30774db-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.945 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape30774db-d3, col_values=(('external_ids', {'iface-id': 'e30774db-d3d3-4438-b68a-6f7855f55128', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:8e:78', 'vm-uuid': '28e48516-8665-4d98-a92d-c84b7da9a284'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.946 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.947 187212 INFO os_vif [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3')
Dec 05 12:14:32 compute-0 nova_compute[187208]: 2025-12-05 12:14:32.968 187212 DEBUG nova.objects.instance [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'numa_topology' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:33 compute-0 kernel: tape30774db-d3: entered promiscuous mode
Dec 05 12:14:33 compute-0 NetworkManager[55691]: <info>  [1764936873.0579] manager: (tape30774db-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/394)
Dec 05 12:14:33 compute-0 ovn_controller[95610]: 2025-12-05T12:14:33Z|01029|binding|INFO|Claiming lport e30774db-d3d3-4438-b68a-6f7855f55128 for this chassis.
Dec 05 12:14:33 compute-0 ovn_controller[95610]: 2025-12-05T12:14:33Z|01030|binding|INFO|e30774db-d3d3-4438-b68a-6f7855f55128: Claiming fa:16:3e:50:8e:78 10.100.0.9
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.183 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8e:78 10.100.0.9'], port_security=['fa:16:3e:50:8e:78 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '12', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e30774db-d3d3-4438-b68a-6f7855f55128) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.184 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e30774db-d3d3-4438-b68a-6f7855f55128 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be bound to our chassis
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.186 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:14:33 compute-0 ovn_controller[95610]: 2025-12-05T12:14:33Z|01031|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 ovn-installed in OVS
Dec 05 12:14:33 compute-0 ovn_controller[95610]: 2025-12-05T12:14:33Z|01032|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 up in Southbound
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.192 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.199 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62100588-20d7-4f31-8f13-8fdd8c671d38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.200 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82130d25-f1 in ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.202 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82130d25-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.202 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[98afb098-2bcf-46de-acb8-9abd8b88d37c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.203 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[606b8e7a-f08e-4da9-bcf3-f5a6f7d7f95b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 systemd-udevd[238717]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.213 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8e028c-7a01-469a-8237-c7ace1674465]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 systemd-machined[153543]: New machine qemu-115-instance-00000050.
Dec 05 12:14:33 compute-0 NetworkManager[55691]: <info>  [1764936873.2225] device (tape30774db-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:14:33 compute-0 NetworkManager[55691]: <info>  [1764936873.2238] device (tape30774db-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:14:33 compute-0 systemd[1]: Started Virtual Machine qemu-115-instance-00000050.
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.239 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cc6efa00-fd46-40eb-a322-ed409dd55d7f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.268 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a2924b4b-b5da-4628-af8c-2eb807205ae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 NetworkManager[55691]: <info>  [1764936873.2785] manager: (tap82130d25-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/395)
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.275 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6cff26cd-48f9-4408-b160-cb270d1dcd7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.362 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[af6b059d-3a16-441f-8867-609349cb9220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.365 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[20a95372-1bca-4ac6-b7a7-1f46e3707cbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 NetworkManager[55691]: <info>  [1764936873.3923] device (tap82130d25-f0): carrier: link connected
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.400 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d18d90b7-8cd3-4beb-a5d8-3e17c49b744a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.421 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5bdf379d-73bb-43c3-9fda-2a30ae6aaf3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82130d25-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:36:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425395, 'reachable_time': 32193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238750, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.442 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bea84003-4d1f-45a1-b535-1cf166c596f8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:36e4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425395, 'tstamp': 425395}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238751, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.466 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[45703a91-227a-4eef-826b-e7df54b8d359]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82130d25-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:36:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425395, 'reachable_time': 32193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238752, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.505 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2bcc5309-bc15-4e92-bec1-dbf0c9a94c62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.567 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c84c8f8c-357e-43e9-a4a1-c7013853d839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.568 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82130d25-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.569 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.569 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82130d25-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:33 compute-0 NetworkManager[55691]: <info>  [1764936873.5725] manager: (tap82130d25-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/396)
Dec 05 12:14:33 compute-0 kernel: tap82130d25-f0: entered promiscuous mode
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.572 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.576 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82130d25-f0, col_values=(('external_ids', {'iface-id': 'f81c4a80-27d3-4231-a37a-7c231838aca7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:33 compute-0 ovn_controller[95610]: 2025-12-05T12:14:33Z|01033|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.582 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82130d25-ff6c-480e-884d-f3d97b6fd9be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82130d25-ff6c-480e-884d-f3d97b6fd9be.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.586 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e13b6959-15e9-484a-9eab-24a594e5183a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.588 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/82130d25-ff6c-480e-884d-f3d97b6fd9be.pid.haproxy
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 82130d25-ff6c-480e-884d-f3d97b6fd9be
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:14:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:33.588 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'env', 'PROCESS_TAG=haproxy-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82130d25-ff6c-480e-884d-f3d97b6fd9be.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.591 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.724 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 28e48516-8665-4d98-a92d-c84b7da9a284 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.732 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936873.7221391, 28e48516-8665-4d98-a92d-c84b7da9a284 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.733 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Started (Lifecycle Event)
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.756 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.759 187212 DEBUG nova.compute.manager [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.759 187212 DEBUG nova.objects.instance [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'pci_devices' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.764 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.784 187212 INFO nova.virt.libvirt.driver [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance running successfully.
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.785 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] During sync_power_state the instance has a pending task (resuming). Skip.
Dec 05 12:14:33 compute-0 virtqemud[186841]: argument unsupported: QEMU guest agent is not configured
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.786 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936873.7373376, 28e48516-8665-4d98-a92d-c84b7da9a284 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.788 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Resumed (Lifecycle Event)
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.790 187212 DEBUG nova.virt.libvirt.guest [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.791 187212 DEBUG nova.compute.manager [None req-24418931-c3eb-4d11-8aea-81d72596e13b e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.806 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.810 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.834 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] During sync_power_state the instance has a pending task (resuming). Skip.
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.906 187212 DEBUG nova.compute.manager [req-53257aa9-1839-4f48-b521-f6def46ecccc req-79687e77-2d6a-49f4-af32-8c2dbae175da 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.908 187212 DEBUG oslo_concurrency.lockutils [req-53257aa9-1839-4f48-b521-f6def46ecccc req-79687e77-2d6a-49f4-af32-8c2dbae175da 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.908 187212 DEBUG oslo_concurrency.lockutils [req-53257aa9-1839-4f48-b521-f6def46ecccc req-79687e77-2d6a-49f4-af32-8c2dbae175da 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.908 187212 DEBUG oslo_concurrency.lockutils [req-53257aa9-1839-4f48-b521-f6def46ecccc req-79687e77-2d6a-49f4-af32-8c2dbae175da 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.908 187212 DEBUG nova.compute.manager [req-53257aa9-1839-4f48-b521-f6def46ecccc req-79687e77-2d6a-49f4-af32-8c2dbae175da 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:33 compute-0 nova_compute[187208]: 2025-12-05 12:14:33.909 187212 WARNING nova.compute.manager [req-53257aa9-1839-4f48-b521-f6def46ecccc req-79687e77-2d6a-49f4-af32-8c2dbae175da 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state active and task_state None.
Dec 05 12:14:33 compute-0 podman[238791]: 2025-12-05 12:14:33.98652814 +0000 UTC m=+0.052838589 container create f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:14:34 compute-0 systemd[1]: Started libpod-conmon-f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879.scope.
Dec 05 12:14:34 compute-0 podman[238791]: 2025-12-05 12:14:33.955824372 +0000 UTC m=+0.022134851 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:14:34 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:14:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad5b6c5ff20a25239187ca9c26da1c7d82cdf3dd0052137420ca1cf549f4eef5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:14:34 compute-0 podman[238791]: 2025-12-05 12:14:34.081760413 +0000 UTC m=+0.148070892 container init f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec 05 12:14:34 compute-0 podman[238791]: 2025-12-05 12:14:34.087473259 +0000 UTC m=+0.153783708 container start f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:14:34 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[238806]: [NOTICE]   (238810) : New worker (238812) forked
Dec 05 12:14:34 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[238806]: [NOTICE]   (238810) : Loading success.
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.061 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "108114b5-8832-494c-b436-40ffa2ffb7c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.061 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "108114b5-8832-494c-b436-40ffa2ffb7c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.083 187212 DEBUG nova.compute.manager [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.177 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.177 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.186 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.187 187212 INFO nova.compute.claims [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.365 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'name': 'tempest-ServerActionsTestJSON-server-954339420', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '75752a4cc8f7487e8dc4440201f894c8', 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'hostId': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.368 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'name': 'tempest-ServersNegativeTestJSON-server-826937421', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000050', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c5b34686513f4abc8165113eb8c6831e', 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'hostId': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.369 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.369 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.369 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-954339420>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-954339420>]
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.369 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.391 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/cpu volume: 11520000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.411 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/cpu volume: 10000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fafa15d-3478-4bee-83f2-019889f51815', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11520000000, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'timestamp': '2025-12-05T12:14:35.370105', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f6979f3a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.012476309, 'message_signature': 'dda00e3f333e17723a846d0738695b5e78e42fe0971d10b68c0d24f7a638da54'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10000000, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'timestamp': '2025-12-05T12:14:35.370105', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f69a9cd0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.032364264, 'message_signature': '96615cb6549c1100e98179e67a07841998e338310720c5533d4e61ef62ecf516'}]}, 'timestamp': '2025-12-05 12:14:35.411951', '_unique_id': 'f74c3994baf84949889d0ea58c6bed4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.413 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.454 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.read.latency volume: 528106324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.455 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.read.latency volume: 72284046 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.495 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.495 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d44ae72-2fbf-4c6d-bb07-80212db17550', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 528106324, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-vda', 'timestamp': '2025-12-05T12:14:35.414707', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6a13fe0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': 'd0811283a5536453733e6bd200904f64e03cdae941cb5c2e0f518ba5d13bf25a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 72284046, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': 
None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-sda', 'timestamp': '2025-12-05T12:14:35.414707', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6a14af8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': 'a121021660f75276415520114d3de32b74abbf5e66849f2ef257902ae6f635dd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:14:35.414707', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6a764a6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': '6b2289aedb378655117fc4994569b2727feb309c7280b05010a9910663e21290'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:14:35.414707', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6a77400-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': '2b2f14802ef5369a706d9964ba2362344a3bc466cdf483bb3566544d6f2e8410'}]}, 'timestamp': '2025-12-05 12:14:35.496084', '_unique_id': 'fcd4aa2f063e4e6ca5dd9b7bbbea573a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.497 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.498 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.501 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c / tap5316adeb-5a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.501 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.503 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5df1888b-c630-4b05-847d-815c7543e039', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.498598', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6a85b22-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': 'cf6c73458e8e70a31d53c37c7f2a5505c5c35bc1279bcbfc819a116a0fb1cf2f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.498598', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6a8b9dc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': 'edbc134f15382f80c3b1e4ad0a524d94241460139489014553a31ea27a59cb34'}]}, 'timestamp': '2025-12-05 12:14:35.504429', '_unique_id': 'ad5f9a0fce614996bd56ccb750ff8b23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.505 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.507 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.507 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.507 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.508 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.509 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b74eb054-c830-4e48-b885-4917f77ebfca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-vda', 'timestamp': '2025-12-05T12:14:35.507326', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6a93b0a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': '3006d82be3f3bd63535d1d9622cb41ebbe7b3083d1e33b58f7be0791c2c207b9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 
'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-sda', 'timestamp': '2025-12-05T12:14:35.507326', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6a9629c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': 'f086947607292a826ec31389edf1968b0fbcd620d34be4caa0c509c2ab22c775'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:14:35.507326', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6a97a98-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': 'c23ca263606851f947fa53f833f15deebf99ed36a7ff0b15345b56136b727adb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:14:35.507326', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6a982ae-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': 'def83c64fc59fbb2fb7d45c4a2a52d611ac64bb64a33e0b7969249d2912887a9'}]}, 'timestamp': '2025-12-05 12:14:35.509576', '_unique_id': 'b2955b1200c14bbfaf5e9b0f4b296d3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.510 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.511 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.511 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.511 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.bytes.delta volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b7c59c3-2b3c-446e-a493-5cc8df06c0f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.511383', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6a9d84e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': 'b3deb2e24108da53242a819df7b7ffe11f05c9af98ff656e9d30eaca4a5a487e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': 
'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.511383', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6a9e136-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': '5165c122ebb548169a3d55b10331dc4ddb1255768a7458e0e526a52b41477fd0'}]}, 'timestamp': '2025-12-05 12:14:35.511940', '_unique_id': '368ef483667d41bcb5082b3c95182d2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.512 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.513 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.513 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.513 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cdad545-a6a9-4049-8ab2-c6d4e2b26812', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.513402', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6aa2402-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': 'd3a119171275949fcbd50bab8ce92c9693bbc5fc01840de6d3740a20cb4dd458'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.513402', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6aa2cd6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': '5ebaf5bef295f6c0987f221e600fea9419b25b10a283ae1b4d835adb7dbd76e3'}]}, 'timestamp': '2025-12-05 12:14:35.513876', '_unique_id': 'ce794643f82c4ecc9c4d402ac7c64aae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.514 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.515 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.515 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.write.latency volume: 38321408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.515 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.515 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.515 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e28b64d1-4252-45f3-8a01-82b01d319b4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38321408, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-vda', 'timestamp': '2025-12-05T12:14:35.515135', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6aa66ec-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': '461e421156ec82e2a8e65bf818a3c00ddf663c6ad81e1bd5c2d782177f5fb287'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 
'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-sda', 'timestamp': '2025-12-05T12:14:35.515135', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6aa6f16-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': 'c125dfcd3f1880ab0098094499a0d6511db486b12cb33aa8f41ddf26705e26f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:14:35.515135', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6aa76a0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': '5e24063b3e79d36ddb3adf5d430c685a3c301ab8b8911953a06f4b356e835201'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:14:35.515135', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6aa7e0c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': 'df42b5d34d9c5d82b925a0c7d2371e4517fd935c2a835eee35f27f331aa35f23'}]}, 'timestamp': '2025-12-05 12:14:35.515947', '_unique_id': 'a5a95baccf8b4d85a764e65700b0e6f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.516 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.517 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.517 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.517 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-954339420>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-954339420>]
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.517 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.517 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.517 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcacbd90-3164-40f1-b51d-38d640398449', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.517659', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6aac97a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': 'b887c49d67855344b6fad8ea1baf6cc7aefc154d66c71468eee86daecdc1f7a2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.517659', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6aad1cc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': '0f64804b60def718df7f6da1e1f4909209b56dfcc12f3089450717174f57f2c1'}]}, 'timestamp': '2025-12-05 12:14:35.518114', '_unique_id': 'e81bcb0e82cc4ddb96743b3134739f19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.518 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.519 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.519 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.write.requests volume: 44 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.519 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.519 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.519 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '179187da-10a9-469f-96b9-5a49287d7446', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 44, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-vda', 'timestamp': '2025-12-05T12:14:35.519235', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6ab06a6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': '2b41553b773d2fa422742e7baed9746120e01b7a7e3f351580d9fb3830056974'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': 
None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-sda', 'timestamp': '2025-12-05T12:14:35.519235', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6ab0e58-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': 'fd88dbb11068b5f41b3f97d67a9a81d2e44f7285a1ffab2186ea86e574518a44'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:14:35.519235', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6ab15ec-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': '6a7aee95da16868bde828c027309fa9aa6ca0b047a7dc5dcff1f283c1d6fac13'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:14:35.519235', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6ab1d44-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': 'ee01fec63da968e1a59fa6e406fc3bb9ee0b4475ea03bee70a0cd4103eab6d1c'}]}, 'timestamp': '2025-12-05 12:14:35.520035', '_unique_id': 'f78a8a457bf843a990648e1c34dea7ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.520 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.521 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.521 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.incoming.bytes volume: 7043 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.521 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.bytes volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1a75e46-0024-4041-9c4d-eec3b334fdb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7043, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.521172', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6ab526e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': '90b8ae70facfe97fe1e043c38f294eff4261bd23cdb3721de84068a62e46c8b3'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': 
'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.521172', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6ab5b6a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': '3bcf77e87fdab00a3027884267d519b4472ec907c040bd74c32536f309f86018'}]}, 'timestamp': '2025-12-05 12:14:35.521646', '_unique_id': '1f69fb0e82c149809e1c7385e5ac1a73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.522 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.read.bytes volume: 32081920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.523 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.523 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.523 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a54738d-ceeb-48e0-9fd0-448c15ff6dc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32081920, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-vda', 'timestamp': '2025-12-05T12:14:35.522789', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6ab9166-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': '823db1f711cc555f0a44ce028f2259261e514d33a6c09d8203dc0731cddfbc69'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 
'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-sda', 'timestamp': '2025-12-05T12:14:35.522789', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6ab99e0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': 'e0e36d6ccf39bd1aa3d6ce1c75b64a6136cc6ac3148adfd9646df7997c3b01d1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:14:35.522789', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6aba16a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': 'e53a294da791db1385830de7aec1e7552e7ae24f27f43e8b4d53c3a8e821443a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:14:35.522789', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6aba8ae-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': '62f70c7dcbbd04c52f4b7997b6597eeaafe9bfa60c6e3421c21bc90e7c4ec8cf'}]}, 'timestamp': '2025-12-05 12:14:35.523588', '_unique_id': '7e25f49e7303455684e9c250f228416a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.524 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.525 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.525 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaafc155-eb7f-4c4f-9b65-8021f0355348', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.525036', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6abe9ea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': 'f91d1fd6782da73a31fd3ccc52292d831c00adcfed883d78eacd2b5736677fea'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.525036', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6abf1e2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': '091e387355fd05f22c1c6d893e0eba0f60e9da9ff6a6c9609d06fa72069183a0'}]}, 'timestamp': '2025-12-05 12:14:35.525470', '_unique_id': '88ae86c9906e4f1081fbc25aa3897b7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.543 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.usage volume: 30277632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.543 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.560 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.561 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96313c24-5439-4fef-85ea-dfdb007db149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30277632, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-vda', 'timestamp': '2025-12-05T12:14:35.526621', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6aebd5a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.147672408, 'message_signature': '102f55b1a9f22a073d7a53ee53c95f1b29d29bfa83e49ee3343ab893e2bfcd0a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 
'2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-sda', 'timestamp': '2025-12-05T12:14:35.526621', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6aed416-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.147672408, 'message_signature': '6c67a06e87707bb3ab0b57753d8bfa325325f89555f847ec11de65e46ae07fa5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:14:35.526621', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6b15916-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.165522944, 'message_signature': '95c1c690339175885ffa6d376516c97f3f69d0c11d5ab739b7ca5457b299f525'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:14:35.526621', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6b17252-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.165522944, 'message_signature': '222a361fb619be4739c81e3e26cb384a4ce2dbd58d977fac91935817282f08a5'}]}, 'timestamp': '2025-12-05 12:14:35.561687', '_unique_id': '0d1aae4b35704f61a1c6e84f1964d826'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.563 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.564 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.565 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.outgoing.bytes volume: 5828 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.565 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.bytes volume: 1222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.568 187212 DEBUG oslo_concurrency.lockutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.569 187212 DEBUG oslo_concurrency.lockutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.569 187212 INFO nova.compute.manager [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Rebooting instance
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a6e7365-7c90-403c-8167-4e75d66b4bd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5828, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.565147', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6b2146e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': '3d799179dad0db5dfda0f9cc3a6ed47bc9f0ccbfd507be29facf66b626fff2d4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1222, 'user_id': 
'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.565147', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6b229c2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': '2ecee955754c1b06e9f184795f930d55470dd4f4321f04ce43e904f86ad0a044'}]}, 'timestamp': '2025-12-05 12:14:35.566473', '_unique_id': '144786ec260240af9e5af0b202392838'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.567 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.569 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.569 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.570 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-954339420>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-954339420>]
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.570 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.570 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.read.requests volume: 1216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.571 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.571 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.572 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc375704-7032-4288-97c8-be5e7eeb8711', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1216, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-vda', 'timestamp': '2025-12-05T12:14:35.570655', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6b2e65a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': '3d28685429797570fdac83e8882256e8cfc5600398931c322c9def2806b4bf51'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': 
None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-sda', 'timestamp': '2025-12-05T12:14:35.570655', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6b2f938-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.035769283, 'message_signature': '6f123c902608bc55bd0d08d1d6d4b014850474e72a4c579d9341f60a3c806383'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:14:35.570655', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6b30de2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': 'dc2873daeeadd66efdcd8322a4792e76ae3bdfe8d9c446f205182bd41169c5ea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:14:35.570655', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6b32174-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.076711236, 'message_signature': 'b102eef9d6d618158cb63e2444e3b8c7715d7f1470ef38b7be4cc6b1a098a7cb'}]}, 'timestamp': '2025-12-05 12:14:35.572822', '_unique_id': '5e9883710c5b4df3b614a092428b6b89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.574 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.576 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.576 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.577 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.577 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.578 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de1a828e-9e60-42e9-aac4-9144cacb7c5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-vda', 'timestamp': '2025-12-05T12:14:35.576583', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6b3d0ec-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.147672408, 'message_signature': 'b17bc451adbc47958c7e1d53379f865b9f09fa14c5ad0c1dca99a1014a609c5f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 
'2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-sda', 'timestamp': '2025-12-05T12:14:35.576583', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6b3e622-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.147672408, 'message_signature': '2e383f3ef9db5b653fd6f353cc8a666e69bff8254d0ab778de7de4b5f8f24f55'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:14:35.576583', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6b3fb3a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.165522944, 'message_signature': 'ee148811c26fcf526452baeec57f470ace766fa43814674fb0ecc74f576557f3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:14:35.576583', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6b40c10-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.165522944, 'message_signature': '7827f46c98b497323985b3d26b9eefd6a9dbb4ed69344bc3d7f282b3e9167944'}]}, 'timestamp': '2025-12-05 12:14:35.578679', '_unique_id': 'dfae0dcc553d4f88b57438d25512948b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.580 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.581 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.582 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/memory.usage volume: 42.48046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.582 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.584 187212 DEBUG oslo_concurrency.lockutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.584 187212 DEBUG oslo_concurrency.lockutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.584 187212 DEBUG nova.network.neutron [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8fd5212-b979-4235-9b7c-fe3cd1e65d28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.48046875, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'timestamp': '2025-12-05T12:14:35.582299', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f6b4acf6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.012476309, 'message_signature': '2204c0b6c36537bdf2f3b2bda60a570a539eb1aad5b1187c8ae7ce5fe906dd59'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'timestamp': '2025-12-05T12:14:35.582299', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f6b4b8b8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.032364264, 'message_signature': 'd47570e9bbc59e59bd9c1f05c8d09d1c7570165061c90ec705ab33c3074a4651'}]}, 'timestamp': '2025-12-05 12:14:35.583078', '_unique_id': 'd5777e2596e24b238bd541ed76d29c80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.584 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.585 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.585 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.bytes.delta volume: 1222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63edfd29-68dd-42ee-a41d-4db2c38d1598', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.585176', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6b51c90-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': '3e609c0c9076eb2aef7af4a34b69f006572e675be27f63cd408049e3ca10f61c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1222, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.585176', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6b528f2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': '1dddd7ce9241db4a96fee05fabe6ab847223f4656d53f2e2c576807002337d3d'}]}, 'timestamp': '2025-12-05 12:14:35.585910', '_unique_id': 'e2decd36f0154e7b8bbf2f203992232c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.586 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.587 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.587 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-954339420>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-954339420>]
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.588 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.outgoing.packets volume: 44 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.588 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08bec55f-e161-484c-990c-496d1cfc6673', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 44, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.588259', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6b59422-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': '28f8b388c162b77a8146b062a8972d09c68a106fb48fe7594e77535a5cfdcbee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.588259', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6b5a156-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': '6e11f70e424a217b50e2474a05653c5a7051caf08afeeee6972c3b0e16c07b2a'}]}, 'timestamp': '2025-12-05 12:14:35.588996', '_unique_id': '0965fa1929c444b3ba0ad35c2d67540c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.589 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.590 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.590 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.591 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.591 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.592 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f58e15cd-6873-4555-a83a-34e1ae4322bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-vda', 'timestamp': '2025-12-05T12:14:35.590947', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6b5fb10-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.147672408, 'message_signature': '84ac580fb435360873011dc94f377c7ce21ce452ad020cc7aeb408345d565218'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 
'2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-sda', 'timestamp': '2025-12-05T12:14:35.590947', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'instance-0000005c', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6b608e4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.147672408, 'message_signature': 'b37c280e42d5410a6e5b48aca70440dd23b045f36a5fb3307e71dec9884a068c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-vda', 'timestamp': '2025-12-05T12:14:35.590947', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6b61bfe-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.165522944, 'message_signature': '8e2090a87e9166e2c8743b3ba3a47d11f5293fcd277228f8ec1ae8d47c88797c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': '28e48516-8665-4d98-a92d-c84b7da9a284-sda', 'timestamp': '2025-12-05T12:14:35.590947', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'instance-00000050', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6b62892-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.165522944, 'message_signature': '5ca64c7a763154037ac00484b6b9c168acdc8095fd9c2587cbed36c4ff5999f1'}]}, 'timestamp': '2025-12-05 12:14:35.592442', '_unique_id': 'b5cca7a9aade45d6a5b90bd64b4e9b3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.593 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.594 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.594 12 DEBUG ceilometer.compute.pollsters [-] 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/network.incoming.packets volume: 39 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.594 12 DEBUG ceilometer.compute.pollsters [-] 28e48516-8665-4d98-a92d-c84b7da9a284/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18a0ae0e-c7d2-4abc-963f-a44937ec8fc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 39, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_name': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_name': None, 'resource_id': 'instance-0000005c-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-tap5316adeb-5a', 'timestamp': '2025-12-05T12:14:35.594415', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-954339420', 'name': 'tap5316adeb-5a', 'instance_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'instance_type': 'm1.nano', 'host': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:d0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5316adeb-5a'}, 'message_id': 'f6b6872e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.119694869, 'message_signature': 'ef50a9a8d2263b2acf5f656a97f2be77e1971970858a92195fc746404487470d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': 
'e90fa3a379b4494c84626bb6a761cd30', 'user_name': None, 'project_id': 'c5b34686513f4abc8165113eb8c6831e', 'project_name': None, 'resource_id': 'instance-00000050-28e48516-8665-4d98-a92d-c84b7da9a284-tape30774db-d3', 'timestamp': '2025-12-05T12:14:35.594415', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-826937421', 'name': 'tape30774db-d3', 'instance_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'instance_type': 'm1.nano', 'host': '6ab4e5a8a99585f308397969b8be199b4baf0a0fc18d9ed93ea0460c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a'}, 'image_ref': 'e12053fb-5eb2-4850-82fb-a7e9b54de98a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:8e:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape30774db-d3'}, 'message_id': 'f6b699b2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4256.123131719, 'message_signature': '6e9c9d23943e9495029207f8d59c699a80671cbfcc38e1d1511a01cfecde2bba'}]}, 'timestamp': '2025-12-05 12:14:35.595428', '_unique_id': '610da4f9ee704dd492b54c671214093f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 12:14:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:14:35.596 12 ERROR oslo_messaging.notify.messaging 
Dec 05 12:14:35 compute-0 ovn_controller[95610]: 2025-12-05T12:14:35Z|01034|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:14:35 compute-0 ovn_controller[95610]: 2025-12-05T12:14:35Z|01035|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.673 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.688 187212 DEBUG nova.compute.provider_tree [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.711 187212 DEBUG nova.scheduler.client.report [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.742 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.743 187212 DEBUG nova.compute.manager [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.793 187212 DEBUG nova.compute.manager [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.841 187212 INFO nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.858 187212 DEBUG nova.compute.manager [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.949 187212 DEBUG nova.compute.manager [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.951 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.952 187212 INFO nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Creating image(s)
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.953 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.953 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.954 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:35 compute-0 nova_compute[187208]: 2025-12-05 12:14:35.970 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.029 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.031 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.031 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.043 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.105 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.106 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.144 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.146 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.146 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.211 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.212 187212 DEBUG nova.virt.disk.api [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Checking if we can resize image /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.213 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.277 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.279 187212 DEBUG nova.virt.disk.api [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Cannot resize image /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.279 187212 DEBUG nova.objects.instance [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'migration_context' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.294 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.295 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Ensure instance console log exists: /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.296 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.296 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.297 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.299 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.305 187212 WARNING nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.315 187212 DEBUG nova.virt.libvirt.host [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.316 187212 DEBUG nova.virt.libvirt.host [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.322 187212 DEBUG nova.virt.libvirt.host [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.323 187212 DEBUG nova.virt.libvirt.host [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.324 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.324 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.325 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.325 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.325 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.325 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.325 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.326 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.326 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.326 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.326 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.326 187212 DEBUG nova.virt.hardware [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.332 187212 DEBUG nova.objects.instance [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'pci_devices' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.351 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <uuid>108114b5-8832-494c-b436-40ffa2ffb7c1</uuid>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <name>instance-00000061</name>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerShowV254Test-server-1119665361</nova:name>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:14:36</nova:creationTime>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:14:36 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:14:36 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:14:36 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:14:36 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:14:36 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:14:36 compute-0 nova_compute[187208]:         <nova:user uuid="43dfbe2f6638492887b1176c979cc641">tempest-ServerShowV254Test-283894818-project-member</nova:user>
Dec 05 12:14:36 compute-0 nova_compute[187208]:         <nova:project uuid="96143bdab6004f13b4ae4ed16efdbf16">tempest-ServerShowV254Test-283894818</nova:project>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <system>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <entry name="serial">108114b5-8832-494c-b436-40ffa2ffb7c1</entry>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <entry name="uuid">108114b5-8832-494c-b436-40ffa2ffb7c1</entry>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     </system>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <os>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   </os>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <features>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   </features>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.config"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/console.log" append="off"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <video>
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     </video>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:14:36 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:14:36 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:14:36 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:14:36 compute-0 nova_compute[187208]: </domain>
Dec 05 12:14:36 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.412 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.413 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:14:36 compute-0 nova_compute[187208]: 2025-12-05 12:14:36.413 187212 INFO nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Using config drive
Dec 05 12:14:36 compute-0 podman[238842]: 2025-12-05 12:14:36.471984279 +0000 UTC m=+0.073034053 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 12:14:36 compute-0 podman[238841]: 2025-12-05 12:14:36.481836304 +0000 UTC m=+0.091302281 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=)
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.138 187212 INFO nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Creating config drive at /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.config
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.143 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppfhbxjth execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.269 187212 DEBUG oslo_concurrency.processutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppfhbxjth" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.332 187212 DEBUG nova.compute.manager [req-d0f35707-a8a2-4087-8071-f75ed1d6b7f7 req-78b7d5b0-07fb-4291-a590-f66f2f7680f1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.333 187212 DEBUG oslo_concurrency.lockutils [req-d0f35707-a8a2-4087-8071-f75ed1d6b7f7 req-78b7d5b0-07fb-4291-a590-f66f2f7680f1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.333 187212 DEBUG oslo_concurrency.lockutils [req-d0f35707-a8a2-4087-8071-f75ed1d6b7f7 req-78b7d5b0-07fb-4291-a590-f66f2f7680f1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.334 187212 DEBUG oslo_concurrency.lockutils [req-d0f35707-a8a2-4087-8071-f75ed1d6b7f7 req-78b7d5b0-07fb-4291-a590-f66f2f7680f1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.334 187212 DEBUG nova.compute.manager [req-d0f35707-a8a2-4087-8071-f75ed1d6b7f7 req-78b7d5b0-07fb-4291-a590-f66f2f7680f1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.334 187212 WARNING nova.compute.manager [req-d0f35707-a8a2-4087-8071-f75ed1d6b7f7 req-78b7d5b0-07fb-4291-a590-f66f2f7680f1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state active and task_state None.
Dec 05 12:14:37 compute-0 systemd-machined[153543]: New machine qemu-116-instance-00000061.
Dec 05 12:14:37 compute-0 systemd[1]: Started Virtual Machine qemu-116-instance-00000061.
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.497 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.730 187212 DEBUG nova.compute.manager [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.732 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.732 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936877.7292576, 108114b5-8832-494c-b436-40ffa2ffb7c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.733 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] VM Resumed (Lifecycle Event)
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.739 187212 INFO nova.virt.libvirt.driver [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance spawned successfully.
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.740 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.802 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.809 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.810 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.810 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.811 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.812 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.812 187212 DEBUG nova.virt.libvirt.driver [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.817 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.856 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.856 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936877.7310736, 108114b5-8832-494c-b436-40ffa2ffb7c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:37 compute-0 nova_compute[187208]: 2025-12-05 12:14:37.856 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] VM Started (Lifecycle Event)
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.025 187212 INFO nova.compute.manager [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Took 2.08 seconds to spawn the instance on the hypervisor.
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.026 187212 DEBUG nova.compute.manager [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.035 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.037 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.080 187212 DEBUG nova.network.neutron [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.146 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.148 187212 DEBUG oslo_concurrency.lockutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.149 187212 DEBUG nova.compute.manager [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.352 187212 INFO nova.compute.manager [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Took 3.22 seconds to build instance.
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.425 187212 DEBUG oslo_concurrency.lockutils [None req-a240e01e-01a4-4ebb-891e-ee1866007347 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "108114b5-8832-494c-b436-40ffa2ffb7c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.363s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:38 compute-0 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec 05 12:14:38 compute-0 NetworkManager[55691]: <info>  [1764936878.5785] device (tap5316adeb-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.592 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 ovn_controller[95610]: 2025-12-05T12:14:38Z|01036|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec 05 12:14:38 compute-0 ovn_controller[95610]: 2025-12-05T12:14:38Z|01037|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec 05 12:14:38 compute-0 ovn_controller[95610]: 2025-12-05T12:14:38Z|01038|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.611 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.613 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '8', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.614 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.617 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.620 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.622 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c20f97-3a12-4178-83d0-5b8b38fa73e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.623 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace which is not needed anymore
Dec 05 12:14:38 compute-0 systemd[1]: machine-qemu\x2d113\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Dec 05 12:14:38 compute-0 systemd[1]: machine-qemu\x2d113\x2dinstance\x2d0000005c.scope: Consumed 13.490s CPU time.
Dec 05 12:14:38 compute-0 systemd-machined[153543]: Machine qemu-113-instance-0000005c terminated.
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[238165]: [NOTICE]   (238169) : haproxy version is 2.8.14-c23fe91
Dec 05 12:14:38 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[238165]: [NOTICE]   (238169) : path to executable is /usr/sbin/haproxy
Dec 05 12:14:38 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[238165]: [WARNING]  (238169) : Exiting Master process...
Dec 05 12:14:38 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[238165]: [ALERT]    (238169) : Current worker (238171) exited with code 143 (Terminated)
Dec 05 12:14:38 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[238165]: [WARNING]  (238169) : All workers exited. Exiting... (0)
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.776 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 systemd[1]: libpod-dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588.scope: Deactivated successfully.
Dec 05 12:14:38 compute-0 podman[238925]: 2025-12-05 12:14:38.784807487 +0000 UTC m=+0.059285105 container died dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.792 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.826 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.826 187212 DEBUG nova.objects.instance [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588-userdata-shm.mount: Deactivated successfully.
Dec 05 12:14:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-a4f61021eca0a148ad38618e27fbc51c8fc72b7b15ec883b20606746789f0aea-merged.mount: Deactivated successfully.
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.843 187212 DEBUG nova.virt.libvirt.vif [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:14:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.844 187212 DEBUG nova.network.os_vif_util [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.845 187212 DEBUG nova.network.os_vif_util [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.845 187212 DEBUG os_vif [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.848 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.848 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5316adeb-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.850 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 podman[238925]: 2025-12-05 12:14:38.852805763 +0000 UTC m=+0.127283391 container cleanup dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.853 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.855 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 systemd[1]: libpod-conmon-dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588.scope: Deactivated successfully.
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.863 187212 INFO os_vif [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.875 187212 DEBUG nova.virt.libvirt.driver [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start _get_guest_xml network_info=[{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.881 187212 WARNING nova.virt.libvirt.driver [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.886 187212 DEBUG nova.virt.libvirt.host [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.887 187212 DEBUG nova.virt.libvirt.host [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.892 187212 DEBUG nova.virt.libvirt.host [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.893 187212 DEBUG nova.virt.libvirt.host [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.893 187212 DEBUG nova.virt.libvirt.driver [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.894 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.896 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.896 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.896 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.897 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.897 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.897 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.898 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.898 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.898 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.898 187212 DEBUG nova.virt.hardware [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.899 187212 DEBUG nova.objects.instance [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.925 187212 DEBUG nova.virt.libvirt.vif [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:14:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.926 187212 DEBUG nova.network.os_vif_util [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.927 187212 DEBUG nova.network.os_vif_util [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.928 187212 DEBUG nova.objects.instance [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:38 compute-0 podman[238969]: 2025-12-05 12:14:38.942140526 +0000 UTC m=+0.049095391 container remove dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.949 187212 DEBUG nova.virt.libvirt.driver [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <uuid>2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</uuid>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <name>instance-0000005c</name>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestJSON-server-954339420</nova:name>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:14:38</nova:creationTime>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:14:38 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:14:38 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:14:38 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:14:38 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:14:38 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:14:38 compute-0 nova_compute[187208]:         <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec 05 12:14:38 compute-0 nova_compute[187208]:         <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:14:38 compute-0 nova_compute[187208]:         <nova:port uuid="5316adeb-5a49-4a58-b997-f132a083ff13">
Dec 05 12:14:38 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <system>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <entry name="serial">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <entry name="uuid">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     </system>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <os>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   </os>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <features>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   </features>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:9a:d0:34"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <target dev="tap5316adeb-5a"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/console.log" append="off"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <video>
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     </video>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:14:38 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:14:38 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:14:38 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:14:38 compute-0 nova_compute[187208]: </domain>
Dec 05 12:14:38 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.950 187212 DEBUG oslo_concurrency.processutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.954 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c9e28c-e986-43cf-b0a6-d683b9c54158]: (4, ('Fri Dec  5 12:14:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588)\ndc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588\nFri Dec  5 12:14:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (dc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588)\ndc808058b3c0906b3f68f6f20370b7ab1d028b2f666e1542572beb057025f588\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.957 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[316a5cb9-8df7-47b7-b0e4-2ccad5af907d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.958 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:38 compute-0 kernel: tapf9ed41c2-b0: left promiscuous mode
Dec 05 12:14:38 compute-0 nova_compute[187208]: 2025-12-05 12:14:38.980 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.982 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[def22c3f-d533-4fac-8837-d19d2a665934]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:38 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.997 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3035ab-c638-48bf-90ef-9011d4325e05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:38.999 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea9a5cc-aac7-45d9-9701-04dbc13db721]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.019 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[80432620-c158-4a3d-9932-794e34a7a56e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423255, 'reachable_time': 32132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238984, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 systemd[1]: run-netns-ovnmeta\x2df9ed41c2\x2db085\x2d41ff\x2dac71\x2d6256a4e30e85.mount: Deactivated successfully.
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.026 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.026 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[6d82d404-194f-4255-815b-14cd6d6d854c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.039 187212 DEBUG oslo_concurrency.processutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.040 187212 DEBUG oslo_concurrency.processutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.109 187212 DEBUG oslo_concurrency.processutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.112 187212 DEBUG nova.objects.instance [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.126 187212 DEBUG oslo_concurrency.processutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.195 187212 DEBUG oslo_concurrency.processutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.197 187212 DEBUG nova.virt.disk.api [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.198 187212 DEBUG oslo_concurrency.processutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.273 187212 DEBUG oslo_concurrency.processutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.275 187212 DEBUG nova.virt.disk.api [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.275 187212 DEBUG nova.objects.instance [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.294 187212 DEBUG nova.virt.libvirt.vif [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:14:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.295 187212 DEBUG nova.network.os_vif_util [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.296 187212 DEBUG nova.network.os_vif_util [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.296 187212 DEBUG os_vif [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.297 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.297 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.298 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.301 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.302 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5316adeb-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.302 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5316adeb-5a, col_values=(('external_ids', {'iface-id': '5316adeb-5a49-4a58-b997-f132a083ff13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:d0:34', 'vm-uuid': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.304 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 NetworkManager[55691]: <info>  [1764936879.3062] manager: (tap5316adeb-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/397)
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.308 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.310 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.311 187212 INFO os_vif [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:14:39 compute-0 kernel: tap5316adeb-5a: entered promiscuous mode
Dec 05 12:14:39 compute-0 systemd-udevd[238901]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:14:39 compute-0 NetworkManager[55691]: <info>  [1764936879.4046] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/398)
Dec 05 12:14:39 compute-0 ovn_controller[95610]: 2025-12-05T12:14:39Z|01039|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec 05 12:14:39 compute-0 ovn_controller[95610]: 2025-12-05T12:14:39Z|01040|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.411 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 NetworkManager[55691]: <info>  [1764936879.4263] device (tap5316adeb-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:14:39 compute-0 NetworkManager[55691]: <info>  [1764936879.4273] device (tap5316adeb-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.423 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '8', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.425 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.427 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:14:39 compute-0 ovn_controller[95610]: 2025-12-05T12:14:39Z|01041|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec 05 12:14:39 compute-0 ovn_controller[95610]: 2025-12-05T12:14:39Z|01042|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.436 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.438 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.442 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3945222a-b2e5-4588-8f26-8d2ab1bf8c27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.444 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9ed41c2-b1 in ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.447 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9ed41c2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.447 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a9264f-931c-4177-a80f-4e5c5048b87b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2b5583df-2cc6-45fe-9c78-bbdb34dafbe0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 systemd-machined[153543]: New machine qemu-117-instance-0000005c.
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.461 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[6c2c0668-f479-4afb-9cd0-2c43bfa3cf10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 systemd[1]: Started Virtual Machine qemu-117-instance-0000005c.
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.479 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3df36277-9590-4f5a-bc5c-eaa090ac45bd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.511 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8e39f918-cbcf-4a89-a588-effcd78263ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.517 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aa699208-4d7e-4b9e-8d24-f7dd6cbe3d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 NetworkManager[55691]: <info>  [1764936879.5182] manager: (tapf9ed41c2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/399)
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.559 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[42ed1cc8-3b4f-4ece-9faa-bdf5d92b0ff4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.561 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9ff65fd8-ad93-45da-bb81-98ebd09bc8ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 NetworkManager[55691]: <info>  [1764936879.5845] device (tapf9ed41c2-b0): carrier: link connected
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.590 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4e5da77f-e348-4223-a851-35c8943db589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.621 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5faad291-630f-451c-a611-0b53f40e75d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239043, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.641 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[89e1f456-202f-4ec0-a789-315a31d4561a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:9111'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426014, 'tstamp': 426014}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239044, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.662 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[50bb12ba-d640-455e-af0e-6af3c901bbb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239045, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.674 187212 DEBUG nova.compute.manager [req-bb0bd49f-d9ac-45c2-9341-3d49e9506a77 req-27004fd2-dbc2-41c7-be73-e189682a64ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.674 187212 DEBUG oslo_concurrency.lockutils [req-bb0bd49f-d9ac-45c2-9341-3d49e9506a77 req-27004fd2-dbc2-41c7-be73-e189682a64ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.675 187212 DEBUG oslo_concurrency.lockutils [req-bb0bd49f-d9ac-45c2-9341-3d49e9506a77 req-27004fd2-dbc2-41c7-be73-e189682a64ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.675 187212 DEBUG oslo_concurrency.lockutils [req-bb0bd49f-d9ac-45c2-9341-3d49e9506a77 req-27004fd2-dbc2-41c7-be73-e189682a64ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.675 187212 DEBUG nova.compute.manager [req-bb0bd49f-d9ac-45c2-9341-3d49e9506a77 req-27004fd2-dbc2-41c7-be73-e189682a64ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.675 187212 WARNING nova.compute.manager [req-bb0bd49f-d9ac-45c2-9341-3d49e9506a77 req-27004fd2-dbc2-41c7-be73-e189682a64ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state reboot_started_hard.
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.696 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[303539ac-ae8d-402e-9134-71096c674bb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.764 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa499cf-a9b7-4d3f-936e-2e675be82fda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.765 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.766 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.766 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 NetworkManager[55691]: <info>  [1764936879.7690] manager: (tapf9ed41c2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/400)
Dec 05 12:14:39 compute-0 kernel: tapf9ed41c2-b0: entered promiscuous mode
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.772 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.773 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:39 compute-0 ovn_controller[95610]: 2025-12-05T12:14:39Z|01043|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.776 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.776 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.777 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f004d894-71eb-45f9-8a01-95da8475cbcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.778 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:14:39 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:39.779 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'env', 'PROCESS_TAG=haproxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9ed41c2-b085-41ff-ac71-6256a4e30e85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.788 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.882 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.882 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936879.8759904, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.882 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Resumed (Lifecycle Event)
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.884 187212 DEBUG nova.compute.manager [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.890 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance rebooted successfully.
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.891 187212 DEBUG nova.compute.manager [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.920 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.924 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.958 187212 DEBUG oslo_concurrency.lockutils [None req-304fb0fd-4681-4eb7-8aee-10e612dba73f 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.961 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936879.8777168, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.961 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Started (Lifecycle Event)
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.978 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:39 compute-0 nova_compute[187208]: 2025-12-05 12:14:39.981 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:40 compute-0 podman[239084]: 2025-12-05 12:14:40.180086057 +0000 UTC m=+0.062681504 container create f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:14:40 compute-0 systemd[1]: Started libpod-conmon-f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14.scope.
Dec 05 12:14:40 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:14:40 compute-0 podman[239084]: 2025-12-05 12:14:40.146404653 +0000 UTC m=+0.029000150 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:14:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2881d06722a68d1b04f02f9f59234c376caaf5f0e8e911e5a9b452b4d2ddaaf9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:14:40 compute-0 podman[239084]: 2025-12-05 12:14:40.256818305 +0000 UTC m=+0.139413762 container init f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:14:40 compute-0 podman[239084]: 2025-12-05 12:14:40.265692202 +0000 UTC m=+0.148287649 container start f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 12:14:40 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [NOTICE]   (239103) : New worker (239105) forked
Dec 05 12:14:40 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [NOTICE]   (239103) : Loading success.
Dec 05 12:14:41 compute-0 ovn_controller[95610]: 2025-12-05T12:14:41Z|01044|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:14:41 compute-0 ovn_controller[95610]: 2025-12-05T12:14:41Z|01045|binding|INFO|Releasing lport f81c4a80-27d3-4231-a37a-7c231838aca7 from this chassis (sb_readonly=0)
Dec 05 12:14:41 compute-0 nova_compute[187208]: 2025-12-05 12:14:41.088 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:41 compute-0 nova_compute[187208]: 2025-12-05 12:14:41.919 187212 INFO nova.compute.manager [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Rebuilding instance
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.360 187212 DEBUG nova.compute.manager [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.360 187212 DEBUG oslo_concurrency.lockutils [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.361 187212 DEBUG oslo_concurrency.lockutils [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.361 187212 DEBUG oslo_concurrency.lockutils [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.361 187212 DEBUG nova.compute.manager [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.362 187212 WARNING nova.compute.manager [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.362 187212 DEBUG nova.compute.manager [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.362 187212 DEBUG oslo_concurrency.lockutils [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.363 187212 DEBUG oslo_concurrency.lockutils [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.363 187212 DEBUG oslo_concurrency.lockutils [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.363 187212 DEBUG nova.compute.manager [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.364 187212 WARNING nova.compute.manager [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.364 187212 DEBUG nova.compute.manager [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.364 187212 DEBUG oslo_concurrency.lockutils [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.365 187212 DEBUG oslo_concurrency.lockutils [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.365 187212 DEBUG oslo_concurrency.lockutils [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.365 187212 DEBUG nova.compute.manager [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.365 187212 WARNING nova.compute.manager [req-b0d9c65c-e314-4e1c-95cc-82b177e07213 req-cb6dc62c-c807-4c7e-91df-d7dd4ec0a626 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.472 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936867.4713018, 6926ef53-01fc-476e-a6af-82edff2ead1c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.472 187212 INFO nova.compute.manager [-] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] VM Stopped (Lifecycle Event)
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.499 187212 DEBUG nova.compute.manager [None req-f3faad55-a50a-441f-9f40-0483e1ccad29 - - - - - -] [instance: 6926ef53-01fc-476e-a6af-82edff2ead1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.814 187212 DEBUG nova.objects.instance [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.835 187212 DEBUG nova.compute.manager [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.909 187212 DEBUG nova.objects.instance [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'pci_requests' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.933 187212 DEBUG nova.objects.instance [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'pci_devices' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.944 187212 DEBUG nova.objects.instance [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'resources' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.957 187212 DEBUG nova.objects.instance [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'migration_context' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.978 187212 DEBUG nova.objects.instance [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:14:42 compute-0 nova_compute[187208]: 2025-12-05 12:14:42.983 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:14:43 compute-0 nova_compute[187208]: 2025-12-05 12:14:43.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:44 compute-0 nova_compute[187208]: 2025-12-05 12:14:44.095 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:44 compute-0 nova_compute[187208]: 2025-12-05 12:14:44.304 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:45 compute-0 nova_compute[187208]: 2025-12-05 12:14:45.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:45 compute-0 nova_compute[187208]: 2025-12-05 12:14:45.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:14:45 compute-0 nova_compute[187208]: 2025-12-05 12:14:45.094 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:14:46 compute-0 nova_compute[187208]: 2025-12-05 12:14:46.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:46 compute-0 nova_compute[187208]: 2025-12-05 12:14:46.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:14:46 compute-0 podman[239116]: 2025-12-05 12:14:46.209594331 +0000 UTC m=+0.062877068 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:14:46 compute-0 podman[239115]: 2025-12-05 12:14:46.247102316 +0000 UTC m=+0.102254407 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:14:46 compute-0 podman[239117]: 2025-12-05 12:14:46.280169192 +0000 UTC m=+0.126519559 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 05 12:14:48 compute-0 nova_compute[187208]: 2025-12-05 12:14:48.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:48 compute-0 nova_compute[187208]: 2025-12-05 12:14:48.755 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:49 compute-0 nova_compute[187208]: 2025-12-05 12:14:49.306 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:50 compute-0 nova_compute[187208]: 2025-12-05 12:14:50.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:51 compute-0 nova_compute[187208]: 2025-12-05 12:14:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.094 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.094 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.095 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.189 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.256 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.257 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.329 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.338 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.415 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.417 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.485 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.494 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.569 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.570 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:52 compute-0 ovn_controller[95610]: 2025-12-05T12:14:52Z|00120|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.638 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.815 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.817 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5149MB free_disk=72.96176147460938GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.817 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.817 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.991 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.991 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 28e48516-8665-4d98-a92d-c84b7da9a284 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.991 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 108114b5-8832-494c-b436-40ffa2ffb7c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.992 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:14:52 compute-0 nova_compute[187208]: 2025-12-05 12:14:52.992 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:14:53 compute-0 nova_compute[187208]: 2025-12-05 12:14:53.088 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:14:53 compute-0 nova_compute[187208]: 2025-12-05 12:14:53.104 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:14:53 compute-0 nova_compute[187208]: 2025-12-05 12:14:53.134 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:14:53 compute-0 nova_compute[187208]: 2025-12-05 12:14:53.134 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:53 compute-0 nova_compute[187208]: 2025-12-05 12:14:53.241 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:14:53 compute-0 nova_compute[187208]: 2025-12-05 12:14:53.757 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:54 compute-0 nova_compute[187208]: 2025-12-05 12:14:54.309 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:54 compute-0 nova_compute[187208]: 2025-12-05 12:14:54.995 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:54.995 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:14:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:54.998 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:14:55 compute-0 systemd[1]: machine-qemu\x2d116\x2dinstance\x2d00000061.scope: Deactivated successfully.
Dec 05 12:14:55 compute-0 systemd[1]: machine-qemu\x2d116\x2dinstance\x2d00000061.scope: Consumed 12.334s CPU time.
Dec 05 12:14:55 compute-0 systemd-machined[153543]: Machine qemu-116-instance-00000061 terminated.
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.130 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.257 187212 INFO nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance shutdown successfully after 13 seconds.
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.263 187212 INFO nova.virt.libvirt.driver [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance destroyed successfully.
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.268 187212 INFO nova.virt.libvirt.driver [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance destroyed successfully.
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.307 187212 INFO nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Deleting instance files /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1_del
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.308 187212 INFO nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Deletion of /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1_del complete
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.538 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.539 187212 INFO nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Creating image(s)
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.539 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.540 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.540 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.556 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.628 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.629 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.629 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.640 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.706 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.708 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.874 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk 1073741824" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.876 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.877 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.938 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.939 187212 DEBUG nova.virt.disk.api [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Checking if we can resize image /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:14:56 compute-0 nova_compute[187208]: 2025-12-05 12:14:56.939 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:14:57.000 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.003 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.004 187212 DEBUG nova.virt.disk.api [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Cannot resize image /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.005 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.005 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Ensure instance console log exists: /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.005 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.006 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.006 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.008 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.012 187212 WARNING nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.087 187212 DEBUG nova.virt.libvirt.host [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.087 187212 DEBUG nova.virt.libvirt.host [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.092 187212 DEBUG nova.virt.libvirt.host [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.092 187212 DEBUG nova.virt.libvirt.host [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.093 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.093 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.093 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.094 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.094 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.094 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.094 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.094 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.095 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.095 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.095 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.095 187212 DEBUG nova.virt.hardware [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.096 187212 DEBUG nova.objects.instance [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.125 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <uuid>108114b5-8832-494c-b436-40ffa2ffb7c1</uuid>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <name>instance-00000061</name>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerShowV254Test-server-1119665361</nova:name>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:14:57</nova:creationTime>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:14:57 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:14:57 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:14:57 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:14:57 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:14:57 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:14:57 compute-0 nova_compute[187208]:         <nova:user uuid="43dfbe2f6638492887b1176c979cc641">tempest-ServerShowV254Test-283894818-project-member</nova:user>
Dec 05 12:14:57 compute-0 nova_compute[187208]:         <nova:project uuid="96143bdab6004f13b4ae4ed16efdbf16">tempest-ServerShowV254Test-283894818</nova:project>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <system>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <entry name="serial">108114b5-8832-494c-b436-40ffa2ffb7c1</entry>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <entry name="uuid">108114b5-8832-494c-b436-40ffa2ffb7c1</entry>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     </system>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <os>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   </os>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <features>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   </features>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.config"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/console.log" append="off"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <video>
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     </video>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:14:57 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:14:57 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:14:57 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:14:57 compute-0 nova_compute[187208]: </domain>
Dec 05 12:14:57 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:14:57 compute-0 podman[239246]: 2025-12-05 12:14:57.2109767 +0000 UTC m=+0.051816399 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.506 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.507 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.508 187212 INFO nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Using config drive
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.523 187212 DEBUG nova.objects.instance [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.928 187212 INFO nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Creating config drive at /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.config
Dec 05 12:14:57 compute-0 nova_compute[187208]: 2025-12-05 12:14:57.933 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzcudfpkd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.064 187212 DEBUG oslo_concurrency.processutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzcudfpkd" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:14:58 compute-0 systemd-machined[153543]: New machine qemu-118-instance-00000061.
Dec 05 12:14:58 compute-0 systemd[1]: Started Virtual Machine qemu-118-instance-00000061.
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.760 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.843 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 108114b5-8832-494c-b436-40ffa2ffb7c1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.844 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936898.8428326, 108114b5-8832-494c-b436-40ffa2ffb7c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.844 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] VM Resumed (Lifecycle Event)
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.847 187212 DEBUG nova.compute.manager [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.847 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.851 187212 INFO nova.virt.libvirt.driver [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance spawned successfully.
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.852 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.872 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.878 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.882 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.882 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.883 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.885 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.885 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.886 187212 DEBUG nova.virt.libvirt.driver [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.901 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.901 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936898.8437839, 108114b5-8832-494c-b436-40ffa2ffb7c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.901 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] VM Started (Lifecycle Event)
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.925 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.928 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.955 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:14:58 compute-0 nova_compute[187208]: 2025-12-05 12:14:58.966 187212 DEBUG nova.compute.manager [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:14:59 compute-0 nova_compute[187208]: 2025-12-05 12:14:59.019 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:14:59 compute-0 nova_compute[187208]: 2025-12-05 12:14:59.019 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:14:59 compute-0 nova_compute[187208]: 2025-12-05 12:14:59.020 187212 DEBUG nova.objects.instance [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:14:59 compute-0 nova_compute[187208]: 2025-12-05 12:14:59.099 187212 DEBUG oslo_concurrency.lockutils [None req-3f8249e8-72b5-496a-8b30-f56678ff3abb 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:14:59 compute-0 nova_compute[187208]: 2025-12-05 12:14:59.311 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.026 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "108114b5-8832-494c-b436-40ffa2ffb7c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.026 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "108114b5-8832-494c-b436-40ffa2ffb7c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.027 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "108114b5-8832-494c-b436-40ffa2ffb7c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.027 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "108114b5-8832-494c-b436-40ffa2ffb7c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.027 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "108114b5-8832-494c-b436-40ffa2ffb7c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.029 187212 INFO nova.compute.manager [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Terminating instance
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.030 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "refresh_cache-108114b5-8832-494c-b436-40ffa2ffb7c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.030 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquired lock "refresh_cache-108114b5-8832-494c-b436-40ffa2ffb7c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.030 187212 DEBUG nova.network.neutron [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.635 187212 DEBUG nova.network.neutron [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.664 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.665 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.665 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.666 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.666 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.667 187212 INFO nova.compute.manager [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Terminating instance
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.668 187212 DEBUG nova.compute.manager [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:15:02 compute-0 kernel: tape30774db-d3 (unregistering): left promiscuous mode
Dec 05 12:15:02 compute-0 NetworkManager[55691]: <info>  [1764936902.7061] device (tape30774db-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.714 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:02 compute-0 ovn_controller[95610]: 2025-12-05T12:15:02Z|01046|binding|INFO|Releasing lport e30774db-d3d3-4438-b68a-6f7855f55128 from this chassis (sb_readonly=0)
Dec 05 12:15:02 compute-0 ovn_controller[95610]: 2025-12-05T12:15:02Z|01047|binding|INFO|Setting lport e30774db-d3d3-4438-b68a-6f7855f55128 down in Southbound
Dec 05 12:15:02 compute-0 ovn_controller[95610]: 2025-12-05T12:15:02Z|01048|binding|INFO|Removing iface tape30774db-d3 ovn-installed in OVS
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.717 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.730 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:02.733 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8e:78 10.100.0.9'], port_security=['fa:16:3e:50:8e:78 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '28e48516-8665-4d98-a92d-c84b7da9a284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5b34686513f4abc8165113eb8c6831e', 'neutron:revision_number': '13', 'neutron:security_group_ids': '01338859-6837-49f5-8df0-351fa8e007e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5150d7c5-48a1-4791-bdfd-ff83dc63b9cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e30774db-d3d3-4438-b68a-6f7855f55128) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:15:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:02.735 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e30774db-d3d3-4438-b68a-6f7855f55128 in datapath 82130d25-ff6c-480e-884d-f3d97b6fd9be unbound from our chassis
Dec 05 12:15:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:02.736 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82130d25-ff6c-480e-884d-f3d97b6fd9be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:15:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:02.738 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[47f685df-ca04-4b9c-81aa-82ede697d7f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:02.739 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be namespace which is not needed anymore
Dec 05 12:15:02 compute-0 systemd[1]: machine-qemu\x2d115\x2dinstance\x2d00000050.scope: Deactivated successfully.
Dec 05 12:15:02 compute-0 systemd[1]: machine-qemu\x2d115\x2dinstance\x2d00000050.scope: Consumed 1.745s CPU time.
Dec 05 12:15:02 compute-0 systemd-machined[153543]: Machine qemu-115-instance-00000050 terminated.
Dec 05 12:15:02 compute-0 podman[239299]: 2025-12-05 12:15:02.799206205 +0000 UTC m=+0.067964476 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 12:15:02 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[238806]: [NOTICE]   (238810) : haproxy version is 2.8.14-c23fe91
Dec 05 12:15:02 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[238806]: [NOTICE]   (238810) : path to executable is /usr/sbin/haproxy
Dec 05 12:15:02 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[238806]: [WARNING]  (238810) : Exiting Master process...
Dec 05 12:15:02 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[238806]: [ALERT]    (238810) : Current worker (238812) exited with code 143 (Terminated)
Dec 05 12:15:02 compute-0 neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be[238806]: [WARNING]  (238810) : All workers exited. Exiting... (0)
Dec 05 12:15:02 compute-0 systemd[1]: libpod-f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879.scope: Deactivated successfully.
Dec 05 12:15:02 compute-0 podman[239343]: 2025-12-05 12:15:02.892717089 +0000 UTC m=+0.061718666 container died f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.946 187212 INFO nova.virt.libvirt.driver [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Instance destroyed successfully.
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.947 187212 DEBUG nova.objects.instance [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lazy-loading 'resources' on Instance uuid 28e48516-8665-4d98-a92d-c84b7da9a284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.965 187212 DEBUG nova.virt.libvirt.vif [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-826937421',display_name='tempest-ServersNegativeTestJSON-server-826937421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-826937421',id=80,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:14:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c5b34686513f4abc8165113eb8c6831e',ramdisk_id='',reservation_id='r-snx0qylv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1063007033',owner_user_name='tempest-ServersNegativeTestJSON-1063007033-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:14:33Z,user_data=None,user_id='e90fa3a379b4494c84626bb6a761cd30',uuid=28e48516-8665-4d98-a92d-c84b7da9a284,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.965 187212 DEBUG nova.network.os_vif_util [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converting VIF {"id": "e30774db-d3d3-4438-b68a-6f7855f55128", "address": "fa:16:3e:50:8e:78", "network": {"id": "82130d25-ff6c-480e-884d-f3d97b6fd9be", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-112002901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5b34686513f4abc8165113eb8c6831e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape30774db-d3", "ovs_interfaceid": "e30774db-d3d3-4438-b68a-6f7855f55128", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.966 187212 DEBUG nova.network.os_vif_util [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.966 187212 DEBUG os_vif [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.968 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.969 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape30774db-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.974 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.976 187212 INFO os_vif [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8e:78,bridge_name='br-int',has_traffic_filtering=True,id=e30774db-d3d3-4438-b68a-6f7855f55128,network=Network(82130d25-ff6c-480e-884d-f3d97b6fd9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape30774db-d3')
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.976 187212 INFO nova.virt.libvirt.driver [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Deleting instance files /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284_del
Dec 05 12:15:02 compute-0 nova_compute[187208]: 2025-12-05 12:15:02.981 187212 INFO nova.virt.libvirt.driver [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Deletion of /var/lib/nova/instances/28e48516-8665-4d98-a92d-c84b7da9a284_del complete
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.019 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.020 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.020 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.081 187212 INFO nova.compute.manager [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Took 0.41 seconds to destroy the instance on the hypervisor.
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.082 187212 DEBUG oslo.service.loopingcall [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.083 187212 DEBUG nova.compute.manager [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.083 187212 DEBUG nova.network.neutron [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.094 187212 DEBUG nova.compute.manager [req-c38d9897-b02e-4feb-b828-bf4a5bc46a9a req-e9c92056-e918-48a8-b39e-a164568bf164 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.095 187212 DEBUG oslo_concurrency.lockutils [req-c38d9897-b02e-4feb-b828-bf4a5bc46a9a req-e9c92056-e918-48a8-b39e-a164568bf164 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.095 187212 DEBUG oslo_concurrency.lockutils [req-c38d9897-b02e-4feb-b828-bf4a5bc46a9a req-e9c92056-e918-48a8-b39e-a164568bf164 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.095 187212 DEBUG oslo_concurrency.lockutils [req-c38d9897-b02e-4feb-b828-bf4a5bc46a9a req-e9c92056-e918-48a8-b39e-a164568bf164 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.095 187212 DEBUG nova.compute.manager [req-c38d9897-b02e-4feb-b828-bf4a5bc46a9a req-e9c92056-e918-48a8-b39e-a164568bf164 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.096 187212 DEBUG nova.compute.manager [req-c38d9897-b02e-4feb-b828-bf4a5bc46a9a req-e9c92056-e918-48a8-b39e-a164568bf164 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-unplugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:15:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879-userdata-shm.mount: Deactivated successfully.
Dec 05 12:15:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-ad5b6c5ff20a25239187ca9c26da1c7d82cdf3dd0052137420ca1cf549f4eef5-merged.mount: Deactivated successfully.
Dec 05 12:15:03 compute-0 podman[239343]: 2025-12-05 12:15:03.220005091 +0000 UTC m=+0.389006678 container cleanup f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 05 12:15:03 compute-0 systemd[1]: libpod-conmon-f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879.scope: Deactivated successfully.
Dec 05 12:15:03 compute-0 podman[239387]: 2025-12-05 12:15:03.338853138 +0000 UTC m=+0.091787035 container remove f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.349 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb4da73-d867-4b32-b061-a871e80292d4]: (4, ('Fri Dec  5 12:15:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be (f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879)\nf2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879\nFri Dec  5 12:15:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be (f2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879)\nf2539fb519952e8bf816913f6510f9bf72907c8b0198354a6f3f1be9f2ff4879\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.351 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c19280-2fc9-4a48-9bf9-9a4ad3e7ed4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.352 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82130d25-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:03 compute-0 kernel: tap82130d25-f0: left promiscuous mode
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.355 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.370 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.375 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[be842ac5-c308-47f2-bd0f-e39fd616f192]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.392 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[08f23518-986a-485e-a0bc-967d3691d198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.395 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a69c4d6e-3527-490e-91bb-c90a749cf775]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.411 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0333d7b7-fca3-4008-a9fa-7a79d5d4dbbe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425382, 'reachable_time': 19435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239402, 'error': None, 'target': 'ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.415 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82130d25-ff6c-480e-884d-f3d97b6fd9be deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:15:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:03.415 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0e0704-665f-4bd8-8291-3349386251b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d82130d25\x2dff6c\x2d480e\x2d884d\x2df3d97b6fd9be.mount: Deactivated successfully.
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.713 187212 DEBUG nova.network.neutron [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.741 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Releasing lock "refresh_cache-108114b5-8832-494c-b436-40ffa2ffb7c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.742 187212 DEBUG nova.compute.manager [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:15:03 compute-0 nova_compute[187208]: 2025-12-05 12:15:03.760 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:03 compute-0 systemd[1]: machine-qemu\x2d118\x2dinstance\x2d00000061.scope: Deactivated successfully.
Dec 05 12:15:03 compute-0 systemd[1]: machine-qemu\x2d118\x2dinstance\x2d00000061.scope: Consumed 5.648s CPU time.
Dec 05 12:15:03 compute-0 systemd-machined[153543]: Machine qemu-118-instance-00000061 terminated.
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.004 187212 INFO nova.virt.libvirt.driver [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance destroyed successfully.
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.005 187212 DEBUG nova.objects.instance [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'resources' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:04 compute-0 rsyslogd[1004]: imjournal: 17233 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.018 187212 INFO nova.virt.libvirt.driver [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Deleting instance files /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1_del
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.019 187212 INFO nova.virt.libvirt.driver [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Deletion of /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1_del complete
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.084 187212 INFO nova.compute.manager [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.085 187212 DEBUG oslo.service.loopingcall [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.085 187212 DEBUG nova.compute.manager [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.085 187212 DEBUG nova.network.neutron [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.223 187212 DEBUG nova.network.neutron [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.273 187212 INFO nova.compute.manager [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Took 1.19 seconds to deallocate network for instance.
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.341 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.342 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.436 187212 DEBUG nova.compute.provider_tree [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.473 187212 DEBUG nova.scheduler.client.report [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.505 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.509 187212 DEBUG nova.network.neutron [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.530 187212 DEBUG nova.network.neutron [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.550 187212 INFO nova.compute.manager [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Took 0.46 seconds to deallocate network for instance.
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.560 187212 INFO nova.scheduler.client.report [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Deleted allocations for instance 28e48516-8665-4d98-a92d-c84b7da9a284
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.651 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.652 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.708 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.739 187212 DEBUG nova.compute.provider_tree [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.756 187212 DEBUG nova.scheduler.client.report [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.779 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.819 187212 INFO nova.scheduler.client.report [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Deleted allocations for instance 108114b5-8832-494c-b436-40ffa2ffb7c1
Dec 05 12:15:04 compute-0 nova_compute[187208]: 2025-12-05 12:15:04.877 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "108114b5-8832-494c-b436-40ffa2ffb7c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.851s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:05 compute-0 nova_compute[187208]: 2025-12-05 12:15:05.313 187212 DEBUG nova.compute.manager [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:05 compute-0 nova_compute[187208]: 2025-12-05 12:15:05.313 187212 DEBUG oslo_concurrency.lockutils [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:05 compute-0 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 DEBUG oslo_concurrency.lockutils [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:05 compute-0 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 DEBUG oslo_concurrency.lockutils [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:05 compute-0 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 DEBUG nova.compute.manager [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:15:05 compute-0 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 WARNING nova.compute.manager [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state deleted and task_state None.
Dec 05 12:15:05 compute-0 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 DEBUG nova.compute.manager [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-deleted-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:07 compute-0 podman[239411]: 2025-12-05 12:15:07.22276866 +0000 UTC m=+0.069623794 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 12:15:07 compute-0 podman[239412]: 2025-12-05 12:15:07.241054098 +0000 UTC m=+0.085462771 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:15:07 compute-0 nova_compute[187208]: 2025-12-05 12:15:07.973 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:08 compute-0 nova_compute[187208]: 2025-12-05 12:15:08.763 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:10 compute-0 ovn_controller[95610]: 2025-12-05T12:15:10Z|01049|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:15:10 compute-0 nova_compute[187208]: 2025-12-05 12:15:10.415 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:12 compute-0 nova_compute[187208]: 2025-12-05 12:15:12.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.363 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.364 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.387 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.478 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.479 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.486 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.487 187212 INFO nova.compute.claims [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.642 187212 DEBUG nova.compute.provider_tree [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.765 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.801 187212 DEBUG nova.scheduler.client.report [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.823 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.824 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.865 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.884 187212 INFO nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:15:13 compute-0 nova_compute[187208]: 2025-12-05 12:15:13.901 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.007 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.009 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.009 187212 INFO nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating image(s)
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.010 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.010 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.011 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.025 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.095 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.097 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.097 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.109 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.166 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.167 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.210 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.211 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.213 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.283 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.284 187212 DEBUG nova.virt.disk.api [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Checking if we can resize image /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.285 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.352 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.354 187212 DEBUG nova.virt.disk.api [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Cannot resize image /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.354 187212 DEBUG nova.objects.instance [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'migration_context' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.418 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.419 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Ensure instance console log exists: /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.419 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.420 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.420 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.422 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.426 187212 WARNING nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.433 187212 DEBUG nova.virt.libvirt.host [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.435 187212 DEBUG nova.virt.libvirt.host [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.439 187212 DEBUG nova.virt.libvirt.host [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.440 187212 DEBUG nova.virt.libvirt.host [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.444 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.444 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.446 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.446 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.446 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.446 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.447 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.453 187212 DEBUG nova.objects.instance [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.488 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <uuid>f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</uuid>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <name>instance-00000062</name>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerShowV257Test-server-228959241</nova:name>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:15:14</nova:creationTime>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:15:14 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:15:14 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:15:14 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:15:14 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:15:14 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:15:14 compute-0 nova_compute[187208]:         <nova:user uuid="3733ad965c154ae490947ad2a50e221d">tempest-ServerShowV257Test-1797821111-project-member</nova:user>
Dec 05 12:15:14 compute-0 nova_compute[187208]:         <nova:project uuid="239937ac98c24d5198788674713b75a1">tempest-ServerShowV257Test-1797821111</nova:project>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <system>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <entry name="serial">f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</entry>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <entry name="uuid">f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</entry>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     </system>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <os>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   </os>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <features>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   </features>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/console.log" append="off"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <video>
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     </video>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:15:14 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:15:14 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:15:14 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:15:14 compute-0 nova_compute[187208]: </domain>
Dec 05 12:15:14 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.564 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.564 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:15:14 compute-0 nova_compute[187208]: 2025-12-05 12:15:14.565 187212 INFO nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Using config drive
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.192 187212 INFO nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating config drive at /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.197 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ekbyhys execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.330 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ekbyhys" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:15 compute-0 systemd-machined[153543]: New machine qemu-119-instance-00000062.
Dec 05 12:15:15 compute-0 systemd[1]: Started Virtual Machine qemu-119-instance-00000062.
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.876 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936915.8756666, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.876 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Resumed (Lifecycle Event)
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.879 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.879 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.882 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance spawned successfully.
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.882 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.898 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.904 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.908 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.908 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.909 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.909 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.910 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.910 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.937 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.937 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936915.8788347, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.937 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Started (Lifecycle Event)
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.965 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.968 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.981 187212 INFO nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Took 1.97 seconds to spawn the instance on the hypervisor.
Dec 05 12:15:15 compute-0 nova_compute[187208]: 2025-12-05 12:15:15.982 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:16 compute-0 nova_compute[187208]: 2025-12-05 12:15:16.008 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:15:16 compute-0 nova_compute[187208]: 2025-12-05 12:15:16.043 187212 INFO nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Took 2.61 seconds to build instance.
Dec 05 12:15:16 compute-0 nova_compute[187208]: 2025-12-05 12:15:16.064 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:17 compute-0 podman[239493]: 2025-12-05 12:15:17.20796282 +0000 UTC m=+0.055230228 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:15:17 compute-0 podman[239492]: 2025-12-05 12:15:17.209581857 +0000 UTC m=+0.060814710 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:15:17 compute-0 podman[239494]: 2025-12-05 12:15:17.246395771 +0000 UTC m=+0.090154838 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:15:17 compute-0 nova_compute[187208]: 2025-12-05 12:15:17.944 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936902.9433396, 28e48516-8665-4d98-a92d-c84b7da9a284 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:17 compute-0 nova_compute[187208]: 2025-12-05 12:15:17.944 187212 INFO nova.compute.manager [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Stopped (Lifecycle Event)
Dec 05 12:15:17 compute-0 nova_compute[187208]: 2025-12-05 12:15:17.978 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:18 compute-0 nova_compute[187208]: 2025-12-05 12:15:18.107 187212 DEBUG nova.compute.manager [None req-c8491b5f-0679-46e8-8e6d-66a723658f43 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:18 compute-0 nova_compute[187208]: 2025-12-05 12:15:18.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:19 compute-0 nova_compute[187208]: 2025-12-05 12:15:19.002 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936904.0013607, 108114b5-8832-494c-b436-40ffa2ffb7c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:19 compute-0 nova_compute[187208]: 2025-12-05 12:15:19.002 187212 INFO nova.compute.manager [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] VM Stopped (Lifecycle Event)
Dec 05 12:15:19 compute-0 nova_compute[187208]: 2025-12-05 12:15:19.381 187212 DEBUG nova.compute.manager [None req-43d29690-47e5-4748-93bb-24bf9571babf - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:20 compute-0 nova_compute[187208]: 2025-12-05 12:15:20.061 187212 INFO nova.compute.manager [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Rebuilding instance
Dec 05 12:15:20 compute-0 nova_compute[187208]: 2025-12-05 12:15:20.778 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:20 compute-0 nova_compute[187208]: 2025-12-05 12:15:20.792 187212 DEBUG nova.compute.manager [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:20 compute-0 nova_compute[187208]: 2025-12-05 12:15:20.890 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'pci_requests' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:20 compute-0 nova_compute[187208]: 2025-12-05 12:15:20.904 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:20 compute-0 nova_compute[187208]: 2025-12-05 12:15:20.926 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'resources' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:20 compute-0 nova_compute[187208]: 2025-12-05 12:15:20.941 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'migration_context' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:20 compute-0 nova_compute[187208]: 2025-12-05 12:15:20.964 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:15:20 compute-0 nova_compute[187208]: 2025-12-05 12:15:20.968 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:15:22 compute-0 nova_compute[187208]: 2025-12-05 12:15:22.989 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:23 compute-0 nova_compute[187208]: 2025-12-05 12:15:23.770 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.359 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.360 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.391 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.464 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.465 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.473 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.473 187212 INFO nova.compute.claims [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.629 187212 DEBUG nova.compute.provider_tree [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.651 187212 DEBUG nova.scheduler.client.report [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.678 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.679 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.750 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.751 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.787 187212 INFO nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.804 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.895 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.897 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.902 187212 INFO nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating image(s)
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.903 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.903 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.904 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.916 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.983 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.985 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:25 compute-0 nova_compute[187208]: 2025-12-05 12:15:25.986 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.000 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.076 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.078 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.131 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.133 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.133 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.196 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.199 187212 DEBUG nova.virt.disk.api [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.200 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.270 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.272 187212 DEBUG nova.virt.disk.api [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.272 187212 DEBUG nova.objects.instance [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.286 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.288 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Ensure instance console log exists: /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.288 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.289 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.289 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:26 compute-0 nova_compute[187208]: 2025-12-05 12:15:26.394 187212 DEBUG nova.policy [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:15:27 compute-0 nova_compute[187208]: 2025-12-05 12:15:27.993 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:28 compute-0 nova_compute[187208]: 2025-12-05 12:15:28.162 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Successfully created port: b5a9a5df-a95c-46bb-b043-0ff6ae79599e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:15:28 compute-0 podman[239582]: 2025-12-05 12:15:28.195251592 +0000 UTC m=+0.048692249 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:15:28 compute-0 nova_compute[187208]: 2025-12-05 12:15:28.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:30 compute-0 nova_compute[187208]: 2025-12-05 12:15:30.764 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Successfully updated port: b5a9a5df-a95c-46bb-b043-0ff6ae79599e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:15:30 compute-0 nova_compute[187208]: 2025-12-05 12:15:30.788 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:15:30 compute-0 nova_compute[187208]: 2025-12-05 12:15:30.789 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:15:30 compute-0 nova_compute[187208]: 2025-12-05 12:15:30.789 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:15:31 compute-0 nova_compute[187208]: 2025-12-05 12:15:31.051 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:15:31 compute-0 nova_compute[187208]: 2025-12-05 12:15:31.122 187212 DEBUG nova.compute.manager [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-changed-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:31 compute-0 nova_compute[187208]: 2025-12-05 12:15:31.123 187212 DEBUG nova.compute.manager [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Refreshing instance network info cache due to event network-changed-b5a9a5df-a95c-46bb-b043-0ff6ae79599e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:15:31 compute-0 nova_compute[187208]: 2025-12-05 12:15:31.123 187212 DEBUG oslo_concurrency.lockutils [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:15:31 compute-0 nova_compute[187208]: 2025-12-05 12:15:31.236 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.916 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Updating instance_info_cache with network_info: [{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.938 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.938 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance network_info: |[{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.939 187212 DEBUG oslo_concurrency.lockutils [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.939 187212 DEBUG nova.network.neutron [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Refreshing network info cache for port b5a9a5df-a95c-46bb-b043-0ff6ae79599e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.942 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start _get_guest_xml network_info=[{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.946 187212 WARNING nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.950 187212 DEBUG nova.virt.libvirt.host [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.952 187212 DEBUG nova.virt.libvirt.host [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.957 187212 DEBUG nova.virt.libvirt.host [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.958 187212 DEBUG nova.virt.libvirt.host [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.958 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.959 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.959 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.960 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.960 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.960 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.960 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.961 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.961 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.961 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.961 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.962 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.968 187212 DEBUG nova.virt.libvirt.vif [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-tempest.common.compute-instance-58863967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:25Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.969 187212 DEBUG nova.network.os_vif_util [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.970 187212 DEBUG nova.network.os_vif_util [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.972 187212 DEBUG nova.objects.instance [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.989 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <uuid>b9ba9fad-eaef-4c3b-9793-23053fe1ace1</uuid>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <name>instance-00000063</name>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <nova:name>tempest-tempest.common.compute-instance-58863967</nova:name>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:15:32</nova:creationTime>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:15:32 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:15:32 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:15:32 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:15:32 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:15:32 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:15:32 compute-0 nova_compute[187208]:         <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec 05 12:15:32 compute-0 nova_compute[187208]:         <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:15:32 compute-0 nova_compute[187208]:         <nova:port uuid="b5a9a5df-a95c-46bb-b043-0ff6ae79599e">
Dec 05 12:15:32 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <system>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <entry name="serial">b9ba9fad-eaef-4c3b-9793-23053fe1ace1</entry>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <entry name="uuid">b9ba9fad-eaef-4c3b-9793-23053fe1ace1</entry>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     </system>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <os>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   </os>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <features>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   </features>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:6b:1e:ff"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <target dev="tapb5a9a5df-a9"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/console.log" append="off"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <video>
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     </video>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:15:32 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:15:32 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:15:32 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:15:32 compute-0 nova_compute[187208]: </domain>
Dec 05 12:15:32 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.990 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Preparing to wait for external event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.991 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.991 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.991 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.992 187212 DEBUG nova.virt.libvirt.vif [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-tempest.common.compute-instance-58863967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:25Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.993 187212 DEBUG nova.network.os_vif_util [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.993 187212 DEBUG nova.network.os_vif_util [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.994 187212 DEBUG os_vif [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.994 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.995 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.995 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:15:32 compute-0 nova_compute[187208]: 2025-12-05 12:15:32.997 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.003 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5a9a5df-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.003 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5a9a5df-a9, col_values=(('external_ids', {'iface-id': 'b5a9a5df-a95c-46bb-b043-0ff6ae79599e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:1e:ff', 'vm-uuid': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:33 compute-0 NetworkManager[55691]: <info>  [1764936933.0059] manager: (tapb5a9a5df-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.008 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.012 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.014 187212 INFO os_vif [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9')
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.078 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.079 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.079 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No VIF found with MAC fa:16:3e:6b:1e:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.080 187212 INFO nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Using config drive
Dec 05 12:15:33 compute-0 podman[239614]: 2025-12-05 12:15:33.139394976 +0000 UTC m=+0.087705477 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:15:33 compute-0 systemd[1]: machine-qemu\x2d119\x2dinstance\x2d00000062.scope: Deactivated successfully.
Dec 05 12:15:33 compute-0 systemd[1]: machine-qemu\x2d119\x2dinstance\x2d00000062.scope: Consumed 12.629s CPU time.
Dec 05 12:15:33 compute-0 systemd-machined[153543]: Machine qemu-119-instance-00000062 terminated.
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.684 187212 INFO nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating config drive at /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.690 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfl1mp6a_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.774 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.835 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfl1mp6a_" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:33 compute-0 kernel: tapb5a9a5df-a9: entered promiscuous mode
Dec 05 12:15:33 compute-0 NetworkManager[55691]: <info>  [1764936933.9316] manager: (tapb5a9a5df-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/402)
Dec 05 12:15:33 compute-0 systemd-udevd[239634]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:15:33 compute-0 ovn_controller[95610]: 2025-12-05T12:15:33Z|01050|binding|INFO|Claiming lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e for this chassis.
Dec 05 12:15:33 compute-0 ovn_controller[95610]: 2025-12-05T12:15:33Z|01051|binding|INFO|b5a9a5df-a95c-46bb-b043-0ff6ae79599e: Claiming fa:16:3e:6b:1e:ff 10.100.0.8
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.932 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:33.941 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:1e:ff 10.100.0.8'], port_security=['fa:16:3e:6b:1e:ff 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '10cc7c6d-475c-43b4-8ab3-df3294aff9b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5a9a5df-a95c-46bb-b043-0ff6ae79599e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:15:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:33.944 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5a9a5df-a95c-46bb-b043-0ff6ae79599e in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis
Dec 05 12:15:33 compute-0 NetworkManager[55691]: <info>  [1764936933.9468] device (tapb5a9a5df-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:15:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:33.946 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:15:33 compute-0 NetworkManager[55691]: <info>  [1764936933.9480] device (tapb5a9a5df-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:15:33 compute-0 ovn_controller[95610]: 2025-12-05T12:15:33Z|01052|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e ovn-installed in OVS
Dec 05 12:15:33 compute-0 ovn_controller[95610]: 2025-12-05T12:15:33Z|01053|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e up in Southbound
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.950 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:33 compute-0 nova_compute[187208]: 2025-12-05 12:15:33.954 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:33 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:33.971 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ceb78b0-d2b2-4059-902e-275b704344d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:33 compute-0 systemd-machined[153543]: New machine qemu-120-instance-00000063.
Dec 05 12:15:34 compute-0 systemd[1]: Started Virtual Machine qemu-120-instance-00000063.
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.012 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fd422812-7b70-497d-8907-c3d1ce3ac436]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.018 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c962baa6-8cf4-425a-85be-5bfd7b9ab9bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.052 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6c5909f0-4637-4f56-b8a9-b37ac9cfc6c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.070 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance shutdown successfully after 13 seconds.
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.077 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance destroyed successfully.
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.079 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b274f1-c12d-4221-bd3b-cff342916a79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239671, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.085 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance destroyed successfully.
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.087 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deleting instance files /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5_del
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.088 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deletion of /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5_del complete
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.098 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[75cde256-330c-4d1e-b277-35d8c18d7406]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426028, 'tstamp': 426028}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239674, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426031, 'tstamp': 426031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239674, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.100 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.103 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.104 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.104 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:34 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.105 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.297 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.298 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating image(s)
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.299 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.299 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.300 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.319 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.383 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.384 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.385 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.396 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.456 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.457 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.568 187212 DEBUG nova.compute.manager [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.569 187212 DEBUG oslo_concurrency.lockutils [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.571 187212 DEBUG oslo_concurrency.lockutils [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.571 187212 DEBUG oslo_concurrency.lockutils [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.571 187212 DEBUG nova.compute.manager [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Processing event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.584 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936934.5837123, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.585 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Started (Lifecycle Event)
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.587 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.592 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.596 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance spawned successfully.
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.597 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.643 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.651 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.652 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.653 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.654 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.654 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.655 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.661 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.710 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.711 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936934.5847793, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.711 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Paused (Lifecycle Event)
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.739 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.742 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936934.591753, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.742 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Resumed (Lifecycle Event)
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.766 187212 INFO nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Took 8.87 seconds to spawn the instance on the hypervisor.
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.766 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.768 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.778 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.811 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.863 187212 INFO nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Took 9.42 seconds to build instance.
Dec 05 12:15:34 compute-0 nova_compute[187208]: 2025-12-05 12:15:34.884 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.097 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk 1073741824" returned: 0 in 0.640s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.098 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.099 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.161 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.163 187212 DEBUG nova.virt.disk.api [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Checking if we can resize image /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.163 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.236 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.238 187212 DEBUG nova.virt.disk.api [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Cannot resize image /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.239 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.240 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Ensure instance console log exists: /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.242 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.243 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.243 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.246 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.253 187212 WARNING nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.262 187212 DEBUG nova.virt.libvirt.host [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.264 187212 DEBUG nova.virt.libvirt.host [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.267 187212 DEBUG nova.virt.libvirt.host [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.268 187212 DEBUG nova.virt.libvirt.host [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.269 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.269 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.270 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.270 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.271 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.271 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.272 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.272 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.272 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.274 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.275 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.275 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.276 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.313 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <uuid>f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</uuid>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <name>instance-00000062</name>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerShowV257Test-server-228959241</nova:name>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:15:35</nova:creationTime>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:15:35 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:15:35 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:15:35 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:15:35 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:15:35 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:15:35 compute-0 nova_compute[187208]:         <nova:user uuid="3733ad965c154ae490947ad2a50e221d">tempest-ServerShowV257Test-1797821111-project-member</nova:user>
Dec 05 12:15:35 compute-0 nova_compute[187208]:         <nova:project uuid="239937ac98c24d5198788674713b75a1">tempest-ServerShowV257Test-1797821111</nova:project>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <system>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <entry name="serial">f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</entry>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <entry name="uuid">f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</entry>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     </system>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <os>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   </os>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <features>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   </features>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/console.log" append="off"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <video>
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     </video>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:15:35 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:15:35 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:15:35 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:15:35 compute-0 nova_compute[187208]: </domain>
Dec 05 12:15:35 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.382 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.383 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.384 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Using config drive
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.406 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.502 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'keypairs' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.977 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating config drive at /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config
Dec 05 12:15:35 compute-0 nova_compute[187208]: 2025-12-05 12:15:35.984 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvxmdz1zq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.038 187212 DEBUG nova.network.neutron [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Updated VIF entry in instance network info cache for port b5a9a5df-a95c-46bb-b043-0ff6ae79599e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.039 187212 DEBUG nova.network.neutron [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Updating instance_info_cache with network_info: [{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.115 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvxmdz1zq" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.167 187212 DEBUG oslo_concurrency.lockutils [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:15:36 compute-0 systemd-machined[153543]: New machine qemu-121-instance-00000062.
Dec 05 12:15:36 compute-0 systemd[1]: Started Virtual Machine qemu-121-instance-00000062.
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.793 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.793 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936936.7925968, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.794 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Resumed (Lifecycle Event)
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.796 187212 DEBUG nova.compute.manager [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.796 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.800 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance spawned successfully.
Dec 05 12:15:36 compute-0 nova_compute[187208]: 2025-12-05 12:15:36.800 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.032 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.037 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.038 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.039 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.039 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.040 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.040 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.045 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.087 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.088 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936936.793265, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.088 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Started (Lifecycle Event)
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.111 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.114 187212 DEBUG nova.compute.manager [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.117 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.149 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.180 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.181 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.181 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.309 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.373 187212 DEBUG nova.compute.manager [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.373 187212 DEBUG oslo_concurrency.lockutils [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.374 187212 DEBUG oslo_concurrency.lockutils [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.374 187212 DEBUG oslo_concurrency.lockutils [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.374 187212 DEBUG nova.compute.manager [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:15:37 compute-0 nova_compute[187208]: 2025-12-05 12:15:37.375 187212 WARNING nova.compute.manager [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state active and task_state None.
Dec 05 12:15:38 compute-0 nova_compute[187208]: 2025-12-05 12:15:38.007 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:38 compute-0 podman[239726]: 2025-12-05 12:15:38.235143793 +0000 UTC m=+0.073252879 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc.)
Dec 05 12:15:38 compute-0 podman[239727]: 2025-12-05 12:15:38.255075439 +0000 UTC m=+0.093730091 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 05 12:15:38 compute-0 nova_compute[187208]: 2025-12-05 12:15:38.776 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.478 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.479 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.479 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.480 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.480 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.484 187212 INFO nova.compute.manager [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Terminating instance
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.485 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "refresh_cache-f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.485 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquired lock "refresh_cache-f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.486 187212 DEBUG nova.network.neutron [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:15:39 compute-0 nova_compute[187208]: 2025-12-05 12:15:39.975 187212 DEBUG nova.network.neutron [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:15:40 compute-0 nova_compute[187208]: 2025-12-05 12:15:40.556 187212 INFO nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Rebuilding instance
Dec 05 12:15:40 compute-0 nova_compute[187208]: 2025-12-05 12:15:40.890 187212 DEBUG nova.network.neutron [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:15:40 compute-0 nova_compute[187208]: 2025-12-05 12:15:40.918 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Releasing lock "refresh_cache-f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:15:40 compute-0 nova_compute[187208]: 2025-12-05 12:15:40.918 187212 DEBUG nova.compute.manager [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:15:40 compute-0 systemd[1]: machine-qemu\x2d121\x2dinstance\x2d00000062.scope: Deactivated successfully.
Dec 05 12:15:40 compute-0 systemd[1]: machine-qemu\x2d121\x2dinstance\x2d00000062.scope: Consumed 4.728s CPU time.
Dec 05 12:15:40 compute-0 systemd-machined[153543]: Machine qemu-121-instance-00000062 terminated.
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.191 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance destroyed successfully.
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.192 187212 DEBUG nova.objects.instance [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'resources' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.207 187212 INFO nova.virt.libvirt.driver [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deleting instance files /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5_del
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.208 187212 INFO nova.virt.libvirt.driver [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deletion of /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5_del complete
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.260 187212 INFO nova.compute.manager [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.261 187212 DEBUG oslo.service.loopingcall [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.261 187212 DEBUG nova.compute.manager [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.261 187212 DEBUG nova.network.neutron [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.538 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.557 187212 DEBUG nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.610 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_requests' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.630 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.645 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.659 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.666 187212 DEBUG nova.network.neutron [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.675 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.677 187212 DEBUG nova.network.neutron [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.680 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.689 187212 INFO nova.compute.manager [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Took 0.43 seconds to deallocate network for instance.
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.734 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.734 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.848 187212 DEBUG nova.compute.provider_tree [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.873 187212 DEBUG nova.scheduler.client.report [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.910 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:41 compute-0 nova_compute[187208]: 2025-12-05 12:15:41.940 187212 INFO nova.scheduler.client.report [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Deleted allocations for instance f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5
Dec 05 12:15:42 compute-0 nova_compute[187208]: 2025-12-05 12:15:42.023 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:43 compute-0 nova_compute[187208]: 2025-12-05 12:15:43.010 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:43 compute-0 nova_compute[187208]: 2025-12-05 12:15:43.778 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:46 compute-0 nova_compute[187208]: 2025-12-05 12:15:46.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:46 compute-0 nova_compute[187208]: 2025-12-05 12:15:46.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:46 compute-0 nova_compute[187208]: 2025-12-05 12:15:46.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:15:47 compute-0 ovn_controller[95610]: 2025-12-05T12:15:47Z|00121|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6b:1e:ff 10.100.0.8
Dec 05 12:15:47 compute-0 ovn_controller[95610]: 2025-12-05T12:15:47Z|00122|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6b:1e:ff 10.100.0.8
Dec 05 12:15:47 compute-0 nova_compute[187208]: 2025-12-05 12:15:47.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:47 compute-0 nova_compute[187208]: 2025-12-05 12:15:47.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:15:47 compute-0 nova_compute[187208]: 2025-12-05 12:15:47.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:15:47 compute-0 nova_compute[187208]: 2025-12-05 12:15:47.779 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:15:47 compute-0 nova_compute[187208]: 2025-12-05 12:15:47.780 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:15:47 compute-0 nova_compute[187208]: 2025-12-05 12:15:47.780 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:15:47 compute-0 nova_compute[187208]: 2025-12-05 12:15:47.781 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:48 compute-0 nova_compute[187208]: 2025-12-05 12:15:48.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:48 compute-0 podman[239792]: 2025-12-05 12:15:48.21856241 +0000 UTC m=+0.058195944 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:15:48 compute-0 podman[239791]: 2025-12-05 12:15:48.226097718 +0000 UTC m=+0.070583572 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Dec 05 12:15:48 compute-0 podman[239793]: 2025-12-05 12:15:48.261936384 +0000 UTC m=+0.091493796 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:15:48 compute-0 nova_compute[187208]: 2025-12-05 12:15:48.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:51 compute-0 nova_compute[187208]: 2025-12-05 12:15:51.731 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.017 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.672 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.695 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.696 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.696 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.696 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.697 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.697 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.820 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:53 compute-0 kernel: tapb5a9a5df-a9 (unregistering): left promiscuous mode
Dec 05 12:15:53 compute-0 NetworkManager[55691]: <info>  [1764936953.9528] device (tapb5a9a5df-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:15:53 compute-0 ovn_controller[95610]: 2025-12-05T12:15:53Z|01054|binding|INFO|Releasing lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e from this chassis (sb_readonly=0)
Dec 05 12:15:53 compute-0 ovn_controller[95610]: 2025-12-05T12:15:53Z|01055|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e down in Southbound
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:53 compute-0 ovn_controller[95610]: 2025-12-05T12:15:53Z|01056|binding|INFO|Removing iface tapb5a9a5df-a9 ovn-installed in OVS
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.964 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:53 compute-0 nova_compute[187208]: 2025-12-05 12:15:53.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:53.979 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:1e:ff 10.100.0.8'], port_security=['fa:16:3e:6b:1e:ff 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '10cc7c6d-475c-43b4-8ab3-df3294aff9b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5a9a5df-a95c-46bb-b043-0ff6ae79599e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:15:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:53.980 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5a9a5df-a95c-46bb-b043-0ff6ae79599e in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:15:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:53.983 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.001 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2b8b34-9839-4b13-8a05-f325ca71bfcb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:54 compute-0 systemd[1]: machine-qemu\x2d120\x2dinstance\x2d00000063.scope: Deactivated successfully.
Dec 05 12:15:54 compute-0 systemd[1]: machine-qemu\x2d120\x2dinstance\x2d00000063.scope: Consumed 12.567s CPU time.
Dec 05 12:15:54 compute-0 systemd-machined[153543]: Machine qemu-120-instance-00000063 terminated.
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.035 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c67f7e42-6acd-4dbc-bce1-0e90a524de48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.039 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7fcb0a-7ad6-4b5c-a359-01dbf848ff3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6d24b2a3-7940-42a7-b65a-2d28e9d91350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.087 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[925cb1c3-1299-4f12-887e-a778566ba68c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239882, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.090 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.102 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f3a2feea-1198-49a8-95b5-f4511eead67b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426028, 'tstamp': 426028}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239883, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426031, 'tstamp': 426031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239883, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.104 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.110 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.110 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.111 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.111 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.111 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.126 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.126 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.127 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.127 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.208 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.317 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.378 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.379 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.447 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.453 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.482 187212 DEBUG nova.compute.manager [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.482 187212 DEBUG oslo_concurrency.lockutils [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.482 187212 DEBUG oslo_concurrency.lockutils [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.483 187212 DEBUG oslo_concurrency.lockutils [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.483 187212 DEBUG nova.compute.manager [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.483 187212 WARNING nova.compute.manager [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state active and task_state rebuilding.
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.524 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.525 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.590 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.745 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance shutdown successfully after 13 seconds.
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.752 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance destroyed successfully.
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.761 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance destroyed successfully.
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.762 187212 DEBUG nova.virt.libvirt.vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-ServerActionsTestJSON-server-975018653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:15:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project
-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:39Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.763 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.764 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.765 187212 DEBUG os_vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.770 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5a9a5df-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.775 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.780 187212 INFO os_vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9')
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.780 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deleting instance files /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1_del
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.781 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deletion of /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1_del complete
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.807 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.808 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5412MB free_disk=72.98209381103516GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.808 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.809 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.985 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.985 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b9ba9fad-eaef-4c3b-9793-23053fe1ace1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.986 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:15:54 compute-0 nova_compute[187208]: 2025-12-05 12:15:54.986 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.058 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.059 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating image(s)
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.060 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.060 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.060 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.080 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.111 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.146 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.151 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.152 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.152 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.164 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.184 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.185 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.220 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.220 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.302 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk 1073741824" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.304 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.305 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:55.353 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.353 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:55.354 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.376 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.377 187212 DEBUG nova.virt.disk.api [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.378 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.447 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.449 187212 DEBUG nova.virt.disk.api [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.449 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.449 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Ensure instance console log exists: /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.450 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.450 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.451 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.453 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start _get_guest_xml network_info=[{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.457 187212 WARNING nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.462 187212 DEBUG nova.virt.libvirt.host [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.463 187212 DEBUG nova.virt.libvirt.host [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.466 187212 DEBUG nova.virt.libvirt.host [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.467 187212 DEBUG nova.virt.libvirt.host [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.468 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.468 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.468 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.468 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.470 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.470 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.470 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.470 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.499 187212 DEBUG nova.virt.libvirt.vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-ServerActionsTestJSON-server-975018653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:15:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='te
mpest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:54Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.500 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.501 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.503 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <uuid>b9ba9fad-eaef-4c3b-9793-23053fe1ace1</uuid>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <name>instance-00000063</name>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestJSON-server-975018653</nova:name>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:15:55</nova:creationTime>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:15:55 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:15:55 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:15:55 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:15:55 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:15:55 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:15:55 compute-0 nova_compute[187208]:         <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec 05 12:15:55 compute-0 nova_compute[187208]:         <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:15:55 compute-0 nova_compute[187208]:         <nova:port uuid="b5a9a5df-a95c-46bb-b043-0ff6ae79599e">
Dec 05 12:15:55 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <system>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <entry name="serial">b9ba9fad-eaef-4c3b-9793-23053fe1ace1</entry>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <entry name="uuid">b9ba9fad-eaef-4c3b-9793-23053fe1ace1</entry>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     </system>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <os>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   </os>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <features>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   </features>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:6b:1e:ff"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <target dev="tapb5a9a5df-a9"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/console.log" append="off"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <video>
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     </video>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:15:55 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:15:55 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:15:55 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:15:55 compute-0 nova_compute[187208]: </domain>
Dec 05 12:15:55 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.503 187212 DEBUG nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Preparing to wait for external event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.503 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.504 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.504 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.504 187212 DEBUG nova.virt.libvirt.vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-ServerActionsTestJSON-server-975018653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:15:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:54Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.505 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.505 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.505 187212 DEBUG os_vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.506 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.506 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.507 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.510 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.510 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5a9a5df-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.511 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5a9a5df-a9, col_values=(('external_ids', {'iface-id': 'b5a9a5df-a95c-46bb-b043-0ff6ae79599e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:1e:ff', 'vm-uuid': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:55 compute-0 NetworkManager[55691]: <info>  [1764936955.5136] manager: (tapb5a9a5df-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/403)
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.515 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.521 187212 INFO os_vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9')
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.566 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.566 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.566 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No VIF found with MAC fa:16:3e:6b:1e:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.567 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Using config drive
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.582 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'ec2_ids' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:55 compute-0 nova_compute[187208]: 2025-12-05 12:15:55.628 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'keypairs' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.188 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936941.1868029, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.189 187212 INFO nova.compute.manager [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Stopped (Lifecycle Event)
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.211 187212 DEBUG nova.compute.manager [None req-9cab6225-addf-4bf1-aab8-63a0924bc801 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.595 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating config drive at /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.600 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3p2upt_q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.646 187212 DEBUG nova.compute.manager [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.646 187212 DEBUG oslo_concurrency.lockutils [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.647 187212 DEBUG oslo_concurrency.lockutils [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.647 187212 DEBUG oslo_concurrency.lockutils [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.647 187212 DEBUG nova.compute.manager [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Processing event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.724 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3p2upt_q" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:15:56 compute-0 kernel: tapb5a9a5df-a9: entered promiscuous mode
Dec 05 12:15:56 compute-0 systemd-udevd[239874]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:15:56 compute-0 NetworkManager[55691]: <info>  [1764936956.7831] manager: (tapb5a9a5df-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/404)
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.783 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:56 compute-0 ovn_controller[95610]: 2025-12-05T12:15:56Z|01057|binding|INFO|Claiming lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e for this chassis.
Dec 05 12:15:56 compute-0 ovn_controller[95610]: 2025-12-05T12:15:56Z|01058|binding|INFO|b5a9a5df-a95c-46bb-b043-0ff6ae79599e: Claiming fa:16:3e:6b:1e:ff 10.100.0.8
Dec 05 12:15:56 compute-0 NetworkManager[55691]: <info>  [1764936956.7946] device (tapb5a9a5df-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:15:56 compute-0 NetworkManager[55691]: <info>  [1764936956.7969] device (tapb5a9a5df-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:15:56 compute-0 ovn_controller[95610]: 2025-12-05T12:15:56Z|01059|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e ovn-installed in OVS
Dec 05 12:15:56 compute-0 ovn_controller[95610]: 2025-12-05T12:15:56Z|01060|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e up in Southbound
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.797 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:1e:ff 10.100.0.8'], port_security=['fa:16:3e:6b:1e:ff 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '10cc7c6d-475c-43b4-8ab3-df3294aff9b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5a9a5df-a95c-46bb-b043-0ff6ae79599e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.798 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5a9a5df-a95c-46bb-b043-0ff6ae79599e in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.800 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.832 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.835 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.839 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[64e3ec6b-8062-43bf-a3ba-3eab46166dda]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.869 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d95f8e90-f9d4-4055-9087-c41393b788f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.873 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[dea285dd-bd9a-4a7d-8a2e-7d4300c562b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:56 compute-0 systemd-machined[153543]: New machine qemu-122-instance-00000063.
Dec 05 12:15:56 compute-0 systemd[1]: Started Virtual Machine qemu-122-instance-00000063.
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.904 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[48ef2128-8382-44d0-b85d-5c94da684778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.923 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe6a555-d681-4af4-bda8-13b643060c68]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239953, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.942 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dcd47ae2-b1c5-46bc-8114-aae74769b441]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426028, 'tstamp': 426028}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239956, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426031, 'tstamp': 426031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239956, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.944 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.968 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:56 compute-0 nova_compute[187208]: 2025-12-05 12:15:56.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.973 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.974 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.974 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.974 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.138 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for b9ba9fad-eaef-4c3b-9793-23053fe1ace1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.139 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936957.1384003, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.139 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Started (Lifecycle Event)
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.142 187212 DEBUG nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.145 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.149 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance spawned successfully.
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.149 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.162 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.167 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.172 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.173 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.173 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.174 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.175 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.175 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.181 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.202 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.203 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936957.1385927, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.203 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Paused (Lifecycle Event)
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.239 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.242 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936957.1446872, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.242 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Resumed (Lifecycle Event)
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.249 187212 DEBUG nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.276 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.280 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.324 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.324 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.325 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:15:57 compute-0 nova_compute[187208]: 2025-12-05 12:15:57.419 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:15:58.356 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.975 187212 DEBUG nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 DEBUG nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 WARNING nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state active and task_state None.
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.977 187212 DEBUG nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.977 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.977 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.977 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.978 187212 DEBUG nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:15:58 compute-0 nova_compute[187208]: 2025-12-05 12:15:58.978 187212 WARNING nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state active and task_state None.
Dec 05 12:15:59 compute-0 podman[239969]: 2025-12-05 12:15:59.196837771 +0000 UTC m=+0.052575621 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:16:00 compute-0 nova_compute[187208]: 2025-12-05 12:16:00.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:03.020 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:03.021 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:03 compute-0 nova_compute[187208]: 2025-12-05 12:16:03.824 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:04 compute-0 podman[239993]: 2025-12-05 12:16:04.223321666 +0000 UTC m=+0.068921094 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.121 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.123 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.123 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.123 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.124 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.125 187212 INFO nova.compute.manager [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Terminating instance
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.126 187212 DEBUG nova.compute.manager [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:16:05 compute-0 kernel: tapb5a9a5df-a9 (unregistering): left promiscuous mode
Dec 05 12:16:05 compute-0 NetworkManager[55691]: <info>  [1764936965.1481] device (tapb5a9a5df-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.158 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:05 compute-0 ovn_controller[95610]: 2025-12-05T12:16:05Z|01061|binding|INFO|Releasing lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e from this chassis (sb_readonly=0)
Dec 05 12:16:05 compute-0 ovn_controller[95610]: 2025-12-05T12:16:05Z|01062|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e down in Southbound
Dec 05 12:16:05 compute-0 ovn_controller[95610]: 2025-12-05T12:16:05Z|01063|binding|INFO|Removing iface tapb5a9a5df-a9 ovn-installed in OVS
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.169 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:1e:ff 10.100.0.8'], port_security=['fa:16:3e:6b:1e:ff 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '10cc7c6d-475c-43b4-8ab3-df3294aff9b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5a9a5df-a95c-46bb-b043-0ff6ae79599e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.170 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5a9a5df-a95c-46bb-b043-0ff6ae79599e in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.171 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.175 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.190 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35ae5ce2-f618-4729-a1e1-912de0dc1dfa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:05 compute-0 systemd[1]: machine-qemu\x2d122\x2dinstance\x2d00000063.scope: Deactivated successfully.
Dec 05 12:16:05 compute-0 systemd[1]: machine-qemu\x2d122\x2dinstance\x2d00000063.scope: Consumed 8.309s CPU time.
Dec 05 12:16:05 compute-0 systemd-machined[153543]: Machine qemu-122-instance-00000063 terminated.
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.230 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[81581326-0e4c-4fd4-9ee1-14cda6845459]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.235 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9f247d49-73b0-4b0c-8a54-6fa0b5edc585]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.270 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a620b149-5570-4dec-9b30-442b56205d00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[52efdc18-9ab2-4a57-b6b0-e184b9bcea2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240025, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.309 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2dcd973f-d69d-4845-81ff-344def117080]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426028, 'tstamp': 426028}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240026, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426031, 'tstamp': 426031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240026, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.311 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.318 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.319 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.319 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.320 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.405 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance destroyed successfully.
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.406 187212 DEBUG nova.objects.instance [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.421 187212 DEBUG nova.virt.libvirt.vif [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-ServerActionsTestJSON-server-975018653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:15:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',im
age_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:15:57Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.421 187212 DEBUG nova.network.os_vif_util [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.422 187212 DEBUG nova.network.os_vif_util [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.422 187212 DEBUG os_vif [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.426 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.427 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5a9a5df-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.428 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.431 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.434 187212 INFO os_vif [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9')
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.434 187212 INFO nova.virt.libvirt.driver [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deleting instance files /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1_del
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.435 187212 INFO nova.virt.libvirt.driver [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deletion of /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1_del complete
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.484 187212 INFO nova.compute.manager [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.485 187212 DEBUG oslo.service.loopingcall [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.485 187212 DEBUG nova.compute.manager [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:16:05 compute-0 nova_compute[187208]: 2025-12-05 12:16:05.485 187212 DEBUG nova.network.neutron [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:16:06 compute-0 nova_compute[187208]: 2025-12-05 12:16:06.383 187212 DEBUG nova.compute.manager [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:06 compute-0 nova_compute[187208]: 2025-12-05 12:16:06.383 187212 DEBUG oslo_concurrency.lockutils [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:06 compute-0 nova_compute[187208]: 2025-12-05 12:16:06.383 187212 DEBUG oslo_concurrency.lockutils [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:06 compute-0 nova_compute[187208]: 2025-12-05 12:16:06.384 187212 DEBUG oslo_concurrency.lockutils [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:06 compute-0 nova_compute[187208]: 2025-12-05 12:16:06.384 187212 DEBUG nova.compute.manager [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:06 compute-0 nova_compute[187208]: 2025-12-05 12:16:06.384 187212 DEBUG nova.compute.manager [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:16:07 compute-0 nova_compute[187208]: 2025-12-05 12:16:07.794 187212 DEBUG nova.network.neutron [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:16:07 compute-0 nova_compute[187208]: 2025-12-05 12:16:07.938 187212 INFO nova.compute.manager [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Took 2.45 seconds to deallocate network for instance.
Dec 05 12:16:07 compute-0 nova_compute[187208]: 2025-12-05 12:16:07.990 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:07 compute-0 nova_compute[187208]: 2025-12-05 12:16:07.991 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.093 187212 DEBUG nova.compute.provider_tree [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.127 187212 DEBUG nova.scheduler.client.report [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.157 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.187 187212 INFO nova.scheduler.client.report [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Deleted allocations for instance b9ba9fad-eaef-4c3b-9793-23053fe1ace1
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.281 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.301 187212 DEBUG nova.compute.manager [req-7be324cb-872b-4237-9d15-923d0a0bd071 req-1d4a94ab-1f03-4afd-8bf4-5a075add4988 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-deleted-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.767 187212 DEBUG nova.compute.manager [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.768 187212 DEBUG oslo_concurrency.lockutils [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.768 187212 DEBUG oslo_concurrency.lockutils [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.768 187212 DEBUG oslo_concurrency.lockutils [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.768 187212 DEBUG nova.compute.manager [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.769 187212 WARNING nova.compute.manager [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state deleted and task_state None.
Dec 05 12:16:08 compute-0 nova_compute[187208]: 2025-12-05 12:16:08.825 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:09 compute-0 podman[240046]: 2025-12-05 12:16:09.24631107 +0000 UTC m=+0.083465395 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 12:16:09 compute-0 podman[240045]: 2025-12-05 12:16:09.258395489 +0000 UTC m=+0.101864056 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 12:16:10 compute-0 nova_compute[187208]: 2025-12-05 12:16:10.458 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:13 compute-0 nova_compute[187208]: 2025-12-05 12:16:13.827 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:14 compute-0 nova_compute[187208]: 2025-12-05 12:16:14.981 187212 DEBUG oslo_concurrency.lockutils [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:14 compute-0 nova_compute[187208]: 2025-12-05 12:16:14.982 187212 DEBUG oslo_concurrency.lockutils [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:14 compute-0 nova_compute[187208]: 2025-12-05 12:16:14.983 187212 DEBUG nova.compute.manager [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:14 compute-0 nova_compute[187208]: 2025-12-05 12:16:14.986 187212 DEBUG nova.compute.manager [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Dec 05 12:16:14 compute-0 nova_compute[187208]: 2025-12-05 12:16:14.987 187212 DEBUG nova.objects.instance [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:15 compute-0 nova_compute[187208]: 2025-12-05 12:16:15.010 187212 DEBUG nova.virt.libvirt.driver [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:16:15 compute-0 nova_compute[187208]: 2025-12-05 12:16:15.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:17 compute-0 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec 05 12:16:17 compute-0 NetworkManager[55691]: <info>  [1764936977.1558] device (tap5316adeb-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.162 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:17 compute-0 ovn_controller[95610]: 2025-12-05T12:16:17Z|01064|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec 05 12:16:17 compute-0 ovn_controller[95610]: 2025-12-05T12:16:17Z|01065|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec 05 12:16:17 compute-0 ovn_controller[95610]: 2025-12-05T12:16:17Z|01066|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.171 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.173 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.174 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.175 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ca16284a-a7e1-4abf-95de-de4c12876b25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.176 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace which is not needed anymore
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:17 compute-0 systemd[1]: machine-qemu\x2d117\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Dec 05 12:16:17 compute-0 systemd[1]: machine-qemu\x2d117\x2dinstance\x2d0000005c.scope: Consumed 17.600s CPU time.
Dec 05 12:16:17 compute-0 systemd-machined[153543]: Machine qemu-117-instance-0000005c terminated.
Dec 05 12:16:17 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [NOTICE]   (239103) : haproxy version is 2.8.14-c23fe91
Dec 05 12:16:17 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [NOTICE]   (239103) : path to executable is /usr/sbin/haproxy
Dec 05 12:16:17 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [WARNING]  (239103) : Exiting Master process...
Dec 05 12:16:17 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [ALERT]    (239103) : Current worker (239105) exited with code 143 (Terminated)
Dec 05 12:16:17 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [WARNING]  (239103) : All workers exited. Exiting... (0)
Dec 05 12:16:17 compute-0 systemd[1]: libpod-f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14.scope: Deactivated successfully.
Dec 05 12:16:17 compute-0 podman[240108]: 2025-12-05 12:16:17.314491264 +0000 UTC m=+0.048163174 container died f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:16:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-2881d06722a68d1b04f02f9f59234c376caaf5f0e8e911e5a9b452b4d2ddaaf9-merged.mount: Deactivated successfully.
Dec 05 12:16:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14-userdata-shm.mount: Deactivated successfully.
Dec 05 12:16:17 compute-0 podman[240108]: 2025-12-05 12:16:17.361677448 +0000 UTC m=+0.095349368 container cleanup f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 05 12:16:17 compute-0 systemd[1]: libpod-conmon-f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14.scope: Deactivated successfully.
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.388 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.393 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:17 compute-0 podman[240139]: 2025-12-05 12:16:17.441469695 +0000 UTC m=+0.056126474 container remove f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.448 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f2bdc7f-2d25-45f4-a0ba-da2b93424e2e]: (4, ('Fri Dec  5 12:16:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14)\nf36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14\nFri Dec  5 12:16:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14)\nf36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.451 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a8984c32-b7d6-46ef-bb3d-2a9abfdef3ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.452 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.454 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:17 compute-0 kernel: tapf9ed41c2-b0: left promiscuous mode
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.469 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.472 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb775db-bfcf-47c4-ba8d-045566accd02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.492 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ecd603-b4da-46b5-8f49-cff577a4902d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.493 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[533a20e5-5281-4523-b481-30f832faf42a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.509 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d5fc76-1341-4c23-95d3-eab39c39bb14]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426006, 'reachable_time': 16236, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240174, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.513 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:16:17 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.513 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[35de48f5-3ce2-4168-afb9-79ebf813f235]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:17 compute-0 systemd[1]: run-netns-ovnmeta\x2df9ed41c2\x2db085\x2d41ff\x2dac71\x2d6256a4e30e85.mount: Deactivated successfully.
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.637 187212 DEBUG nova.compute.manager [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.638 187212 DEBUG oslo_concurrency.lockutils [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.638 187212 DEBUG oslo_concurrency.lockutils [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.639 187212 DEBUG oslo_concurrency.lockutils [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.639 187212 DEBUG nova.compute.manager [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:17 compute-0 nova_compute[187208]: 2025-12-05 12:16:17.639 187212 WARNING nova.compute.manager [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state powering-off.
Dec 05 12:16:18 compute-0 nova_compute[187208]: 2025-12-05 12:16:18.028 187212 INFO nova.virt.libvirt.driver [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance shutdown successfully after 3 seconds.
Dec 05 12:16:18 compute-0 nova_compute[187208]: 2025-12-05 12:16:18.035 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.
Dec 05 12:16:18 compute-0 nova_compute[187208]: 2025-12-05 12:16:18.036 187212 DEBUG nova.objects.instance [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:18 compute-0 nova_compute[187208]: 2025-12-05 12:16:18.051 187212 DEBUG nova.compute.manager [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:18 compute-0 nova_compute[187208]: 2025-12-05 12:16:18.100 187212 DEBUG oslo_concurrency.lockutils [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:18 compute-0 nova_compute[187208]: 2025-12-05 12:16:18.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:19 compute-0 podman[240176]: 2025-12-05 12:16:19.21150662 +0000 UTC m=+0.059322856 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:16:19 compute-0 podman[240175]: 2025-12-05 12:16:19.217425531 +0000 UTC m=+0.065563977 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 12:16:19 compute-0 podman[240177]: 2025-12-05 12:16:19.236849043 +0000 UTC m=+0.081902909 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.137 187212 DEBUG nova.compute.manager [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.138 187212 DEBUG oslo_concurrency.lockutils [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.138 187212 DEBUG oslo_concurrency.lockutils [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.139 187212 DEBUG oslo_concurrency.lockutils [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.139 187212 DEBUG nova.compute.manager [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.139 187212 WARNING nova.compute.manager [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state stopped and task_state None.
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.263 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.288 187212 DEBUG oslo_concurrency.lockutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.288 187212 DEBUG oslo_concurrency.lockutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.289 187212 DEBUG nova.network.neutron [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.289 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'info_cache' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.403 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936965.402231, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.404 187212 INFO nova.compute.manager [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Stopped (Lifecycle Event)
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.435 187212 DEBUG nova.compute.manager [None req-aff28276-aa34-4b41-bc55-5e9304435a67 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:20 compute-0 nova_compute[187208]: 2025-12-05 12:16:20.462 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:23 compute-0 nova_compute[187208]: 2025-12-05 12:16:23.831 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.188 187212 DEBUG nova.network.neutron [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.218 187212 DEBUG oslo_concurrency.lockutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.249 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.250 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.265 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.279 187212 DEBUG nova.virt.libvirt.vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:16:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.279 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.280 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.281 187212 DEBUG os_vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.282 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.282 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5316adeb-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.312 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.317 187212 INFO os_vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.324 187212 DEBUG nova.virt.libvirt.driver [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start _get_guest_xml network_info=[{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.327 187212 WARNING nova.virt.libvirt.driver [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.333 187212 DEBUG nova.virt.libvirt.host [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.334 187212 DEBUG nova.virt.libvirt.host [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.339 187212 DEBUG nova.virt.libvirt.host [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.340 187212 DEBUG nova.virt.libvirt.host [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.340 187212 DEBUG nova.virt.libvirt.driver [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.340 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.341 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.341 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.341 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.342 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.342 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.342 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.342 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.343 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.343 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.343 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.344 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.363 187212 DEBUG nova.virt.libvirt.vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:16:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.364 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.364 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.366 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.385 187212 DEBUG nova.virt.libvirt.driver [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <uuid>2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</uuid>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <name>instance-0000005c</name>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerActionsTestJSON-server-954339420</nova:name>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:16:25</nova:creationTime>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:16:25 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:16:25 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:16:25 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:16:25 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:16:25 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:16:25 compute-0 nova_compute[187208]:         <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec 05 12:16:25 compute-0 nova_compute[187208]:         <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:16:25 compute-0 nova_compute[187208]:         <nova:port uuid="5316adeb-5a49-4a58-b997-f132a083ff13">
Dec 05 12:16:25 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <system>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <entry name="serial">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <entry name="uuid">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     </system>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <os>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   </os>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <features>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   </features>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:9a:d0:34"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <target dev="tap5316adeb-5a"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/console.log" append="off"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <video>
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     </video>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <input type="keyboard" bus="usb"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:16:25 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:16:25 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:16:25 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:16:25 compute-0 nova_compute[187208]: </domain>
Dec 05 12:16:25 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.387 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.472 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.473 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.529 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.531 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.546 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.603 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.604 187212 DEBUG nova.virt.disk.api [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.605 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.664 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.665 187212 DEBUG nova.virt.disk.api [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.665 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.681 187212 DEBUG nova.virt.libvirt.vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:16:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.681 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.682 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.683 187212 DEBUG os_vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.683 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.684 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.684 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.687 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5316adeb-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.687 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5316adeb-5a, col_values=(('external_ids', {'iface-id': '5316adeb-5a49-4a58-b997-f132a083ff13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:d0:34', 'vm-uuid': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.689 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 NetworkManager[55691]: <info>  [1764936985.6903] manager: (tap5316adeb-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/405)
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.694 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.695 187212 INFO os_vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:16:25 compute-0 kernel: tap5316adeb-5a: entered promiscuous mode
Dec 05 12:16:25 compute-0 NetworkManager[55691]: <info>  [1764936985.7822] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/406)
Dec 05 12:16:25 compute-0 ovn_controller[95610]: 2025-12-05T12:16:25Z|01067|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 ovn_controller[95610]: 2025-12-05T12:16:25Z|01068|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.793 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '11', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.794 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.795 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:16:25 compute-0 ovn_controller[95610]: 2025-12-05T12:16:25Z|01069|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec 05 12:16:25 compute-0 ovn_controller[95610]: 2025-12-05T12:16:25Z|01070|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.803 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 nova_compute[187208]: 2025-12-05 12:16:25.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.809 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3c46efc6-f054-4623-9eb2-390b7baebe07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.810 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9ed41c2-b1 in ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:16:25 compute-0 systemd-udevd[240270]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.813 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9ed41c2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.813 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8a810ec2-1539-4ae1-bc12-42f3a9982761]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.814 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f05f3e-b245-4494-beae-901e51f0036d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 NetworkManager[55691]: <info>  [1764936985.8254] device (tap5316adeb-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.825 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[938b0bc1-bc0b-4aad-aefb-cb1611916b5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 NetworkManager[55691]: <info>  [1764936985.8263] device (tap5316adeb-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:16:25 compute-0 systemd-machined[153543]: New machine qemu-123-instance-0000005c.
Dec 05 12:16:25 compute-0 systemd[1]: Started Virtual Machine qemu-123-instance-0000005c.
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.841 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[10cc03c4-35b7-4ddf-a24b-d2be6b27ca31]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.869 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0f408ead-59e0-46a6-b93a-4601a0f3c570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 systemd-udevd[240275]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:16:25 compute-0 NetworkManager[55691]: <info>  [1764936985.8758] manager: (tapf9ed41c2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/407)
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.875 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eec35729-328c-4a27-88c4-1063e002ed99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.904 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3c581962-c232-4517-8ecc-f5dd67058804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.907 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e7048f33-78c0-4d14-90a1-5d0450a7d106]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 NetworkManager[55691]: <info>  [1764936985.9330] device (tapf9ed41c2-b0): carrier: link connected
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.937 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8669df66-c7a7-4fb7-b57b-b2f2f6370645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.958 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7f2f2f69-b634-4669-bd39-bb2ef0b09c7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 294], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436649, 'reachable_time': 33956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240304, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.973 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[641a07ae-4161-4222-9901-86f3c24b4f00]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:9111'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436649, 'tstamp': 436649}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240305, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.990 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af98b762-31bc-4197-9c11-ee8f6bfc439d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 294], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436649, 'reachable_time': 33956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240306, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.023 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e1321a0-4978-416c-a11b-4e725fbab641]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.082 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af89a8e6-c2fb-461e-a3fe-33508738eaac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.084 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.084 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.084 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:26 compute-0 NetworkManager[55691]: <info>  [1764936986.0871] manager: (tapf9ed41c2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/408)
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:26 compute-0 kernel: tapf9ed41c2-b0: entered promiscuous mode
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.090 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:26 compute-0 ovn_controller[95610]: 2025-12-05T12:16:26Z|01071|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.106 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0c174a89-49dc-4963-a383-aca469768b5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.108 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:16:26 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.109 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'env', 'PROCESS_TAG=haproxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9ed41c2-b085-41ff-ac71-6256a4e30e85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.189 187212 DEBUG nova.compute.manager [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.192 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.192 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936986.189252, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.193 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Resumed (Lifecycle Event)
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.197 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance rebooted successfully.
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.198 187212 DEBUG nova.compute.manager [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.211 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.214 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.234 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (powering-on). Skip.
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.235 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936986.1904628, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.235 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Started (Lifecycle Event)
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.265 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.268 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.425 187212 DEBUG nova.compute.manager [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.426 187212 DEBUG oslo_concurrency.lockutils [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.426 187212 DEBUG oslo_concurrency.lockutils [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.426 187212 DEBUG oslo_concurrency.lockutils [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.427 187212 DEBUG nova.compute.manager [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:26 compute-0 nova_compute[187208]: 2025-12-05 12:16:26.427 187212 WARNING nova.compute.manager [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:16:26 compute-0 podman[240342]: 2025-12-05 12:16:26.540585056 +0000 UTC m=+0.050826001 container create b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 12:16:26 compute-0 systemd[1]: Started libpod-conmon-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c.scope.
Dec 05 12:16:26 compute-0 podman[240342]: 2025-12-05 12:16:26.513552994 +0000 UTC m=+0.023793969 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:16:26 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:16:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c79a167d190ccd5eec04f331a8b94c8e257ebf6ac48b138dc6af8474b8be0235/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:16:26 compute-0 podman[240342]: 2025-12-05 12:16:26.623969807 +0000 UTC m=+0.134210752 container init b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:16:26 compute-0 podman[240342]: 2025-12-05 12:16:26.628544919 +0000 UTC m=+0.138785864 container start b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 12:16:26 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [NOTICE]   (240361) : New worker (240363) forked
Dec 05 12:16:26 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [NOTICE]   (240361) : Loading success.
Dec 05 12:16:28 compute-0 nova_compute[187208]: 2025-12-05 12:16:28.737 187212 DEBUG nova.compute.manager [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:28 compute-0 nova_compute[187208]: 2025-12-05 12:16:28.738 187212 DEBUG oslo_concurrency.lockutils [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:28 compute-0 nova_compute[187208]: 2025-12-05 12:16:28.739 187212 DEBUG oslo_concurrency.lockutils [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:28 compute-0 nova_compute[187208]: 2025-12-05 12:16:28.739 187212 DEBUG oslo_concurrency.lockutils [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:28 compute-0 nova_compute[187208]: 2025-12-05 12:16:28.739 187212 DEBUG nova.compute.manager [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:28 compute-0 nova_compute[187208]: 2025-12-05 12:16:28.740 187212 WARNING nova.compute.manager [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.
Dec 05 12:16:28 compute-0 nova_compute[187208]: 2025-12-05 12:16:28.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:30 compute-0 podman[240373]: 2025-12-05 12:16:30.226213164 +0000 UTC m=+0.082611650 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:16:30 compute-0 nova_compute[187208]: 2025-12-05 12:16:30.689 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:33 compute-0 nova_compute[187208]: 2025-12-05 12:16:33.835 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:34 compute-0 nova_compute[187208]: 2025-12-05 12:16:34.406 187212 DEBUG nova.objects.instance [None req-559428be-cb06-41d2-8164-1f861638a55e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:34 compute-0 nova_compute[187208]: 2025-12-05 12:16:34.437 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936994.4376903, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:16:34 compute-0 nova_compute[187208]: 2025-12-05 12:16:34.438 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Paused (Lifecycle Event)
Dec 05 12:16:34 compute-0 nova_compute[187208]: 2025-12-05 12:16:34.460 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:34 compute-0 nova_compute[187208]: 2025-12-05 12:16:34.464 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:16:34 compute-0 nova_compute[187208]: 2025-12-05 12:16:34.485 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (suspending). Skip.
Dec 05 12:16:35 compute-0 podman[240401]: 2025-12-05 12:16:35.197832091 +0000 UTC m=+0.056222096 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:16:35 compute-0 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec 05 12:16:35 compute-0 NetworkManager[55691]: <info>  [1764936995.2431] device (tap5316adeb-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:16:35 compute-0 ovn_controller[95610]: 2025-12-05T12:16:35Z|01072|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec 05 12:16:35 compute-0 ovn_controller[95610]: 2025-12-05T12:16:35Z|01073|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec 05 12:16:35 compute-0 nova_compute[187208]: 2025-12-05 12:16:35.250 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:35 compute-0 ovn_controller[95610]: 2025-12-05T12:16:35Z|01074|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec 05 12:16:35 compute-0 nova_compute[187208]: 2025-12-05 12:16:35.252 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:35 compute-0 nova_compute[187208]: 2025-12-05 12:16:35.265 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:35 compute-0 systemd[1]: machine-qemu\x2d123\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Dec 05 12:16:35 compute-0 systemd[1]: machine-qemu\x2d123\x2dinstance\x2d0000005c.scope: Consumed 9.100s CPU time.
Dec 05 12:16:35 compute-0 systemd-machined[153543]: Machine qemu-123-instance-0000005c terminated.
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.420 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '12', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.422 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.424 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.425 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eee0ce79-6154-4337-8215-35737826d9f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.426 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace which is not needed anymore
Dec 05 12:16:35 compute-0 nova_compute[187208]: 2025-12-05 12:16:35.449 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:35 compute-0 nova_compute[187208]: 2025-12-05 12:16:35.455 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.493 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'name': 'tempest-ServerActionsTestJSON-server-954339420', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '75752a4cc8f7487e8dc4440201f894c8', 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'hostId': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 12:16:35 compute-0 nova_compute[187208]: 2025-12-05 12:16:35.494 187212 DEBUG nova.compute.manager [None req-559428be-cb06-41d2-8164-1f861638a55e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.496 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.496 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.497 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.497 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.498 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.498 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.498 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.499 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.499 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.501 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.501 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.502 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.502 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.503 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.503 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.503 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.504 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.504 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.505 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.505 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.506 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.506 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.507 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.507 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.507 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.508 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.508 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.509 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.509 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.510 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.510 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.511 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.511 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.511 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.512 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.512 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 12:16:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.513 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 12:16:35 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [NOTICE]   (240361) : haproxy version is 2.8.14-c23fe91
Dec 05 12:16:35 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [NOTICE]   (240361) : path to executable is /usr/sbin/haproxy
Dec 05 12:16:35 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [WARNING]  (240361) : Exiting Master process...
Dec 05 12:16:35 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [ALERT]    (240361) : Current worker (240363) exited with code 143 (Terminated)
Dec 05 12:16:35 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [WARNING]  (240361) : All workers exited. Exiting... (0)
Dec 05 12:16:35 compute-0 systemd[1]: libpod-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c.scope: Deactivated successfully.
Dec 05 12:16:35 compute-0 conmon[240357]: conmon b81853b290644b432f54 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c.scope/container/memory.events
Dec 05 12:16:35 compute-0 podman[240463]: 2025-12-05 12:16:35.577700734 +0000 UTC m=+0.049222024 container died b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 05 12:16:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c-userdata-shm.mount: Deactivated successfully.
Dec 05 12:16:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-c79a167d190ccd5eec04f331a8b94c8e257ebf6ac48b138dc6af8474b8be0235-merged.mount: Deactivated successfully.
Dec 05 12:16:35 compute-0 podman[240463]: 2025-12-05 12:16:35.61457051 +0000 UTC m=+0.086091810 container cleanup b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:16:35 compute-0 systemd[1]: libpod-conmon-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c.scope: Deactivated successfully.
Dec 05 12:16:35 compute-0 podman[240494]: 2025-12-05 12:16:35.68165763 +0000 UTC m=+0.046723812 container remove b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.686 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[918aa1b4-9e33-4820-9396-9b7b11b400cd]: (4, ('Fri Dec  5 12:16:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c)\nb81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c\nFri Dec  5 12:16:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c)\nb81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.689 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3448f8ec-1d9c-49ff-b8d6-189bc08b5b22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.690 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:35 compute-0 nova_compute[187208]: 2025-12-05 12:16:35.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:35 compute-0 kernel: tapf9ed41c2-b0: left promiscuous mode
Dec 05 12:16:35 compute-0 nova_compute[187208]: 2025-12-05 12:16:35.707 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.710 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea667f3-a1a8-49ed-b48d-e12510ac696c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.735 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2d1a9d-6416-4ea7-856d-40305d4bce57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.736 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c03387f-6b43-4c26-8181-18102a6ae2e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.751 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[59cc5c0a-6f69-495b-aec3-8daac0df9529]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436642, 'reachable_time': 44076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240510, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:35 compute-0 systemd[1]: run-netns-ovnmeta\x2df9ed41c2\x2db085\x2d41ff\x2dac71\x2d6256a4e30e85.mount: Deactivated successfully.
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.756 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:16:35 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.757 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[de5deb9e-063b-4820-bebf-83ec00ee0a81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:36 compute-0 nova_compute[187208]: 2025-12-05 12:16:36.210 187212 DEBUG nova.compute.manager [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:36 compute-0 nova_compute[187208]: 2025-12-05 12:16:36.210 187212 DEBUG oslo_concurrency.lockutils [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:36 compute-0 nova_compute[187208]: 2025-12-05 12:16:36.211 187212 DEBUG oslo_concurrency.lockutils [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:36 compute-0 nova_compute[187208]: 2025-12-05 12:16:36.211 187212 DEBUG oslo_concurrency.lockutils [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:36 compute-0 nova_compute[187208]: 2025-12-05 12:16:36.211 187212 DEBUG nova.compute.manager [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:36 compute-0 nova_compute[187208]: 2025-12-05 12:16:36.212 187212 WARNING nova.compute.manager [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state suspended and task_state None.
Dec 05 12:16:37 compute-0 nova_compute[187208]: 2025-12-05 12:16:37.752 187212 INFO nova.compute.manager [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Resuming
Dec 05 12:16:37 compute-0 nova_compute[187208]: 2025-12-05 12:16:37.753 187212 DEBUG nova.objects.instance [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:37 compute-0 nova_compute[187208]: 2025-12-05 12:16:37.795 187212 DEBUG oslo_concurrency.lockutils [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:16:37 compute-0 nova_compute[187208]: 2025-12-05 12:16:37.795 187212 DEBUG oslo_concurrency.lockutils [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:16:37 compute-0 nova_compute[187208]: 2025-12-05 12:16:37.796 187212 DEBUG nova.network.neutron [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:16:38 compute-0 nova_compute[187208]: 2025-12-05 12:16:38.383 187212 DEBUG nova.compute.manager [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:38 compute-0 nova_compute[187208]: 2025-12-05 12:16:38.383 187212 DEBUG oslo_concurrency.lockutils [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:38 compute-0 nova_compute[187208]: 2025-12-05 12:16:38.384 187212 DEBUG oslo_concurrency.lockutils [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:38 compute-0 nova_compute[187208]: 2025-12-05 12:16:38.384 187212 DEBUG oslo_concurrency.lockutils [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:38 compute-0 nova_compute[187208]: 2025-12-05 12:16:38.384 187212 DEBUG nova.compute.manager [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:38 compute-0 nova_compute[187208]: 2025-12-05 12:16:38.384 187212 WARNING nova.compute.manager [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state suspended and task_state resuming.
Dec 05 12:16:38 compute-0 nova_compute[187208]: 2025-12-05 12:16:38.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:40 compute-0 podman[240512]: 2025-12-05 12:16:40.199937082 +0000 UTC m=+0.052607352 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Dec 05 12:16:40 compute-0 podman[240511]: 2025-12-05 12:16:40.21163203 +0000 UTC m=+0.062125887 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, architecture=x86_64, name=ubi9-minimal, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 12:16:40 compute-0 nova_compute[187208]: 2025-12-05 12:16:40.693 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.593 187212 DEBUG nova.network.neutron [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.609 187212 DEBUG oslo_concurrency.lockutils [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.615 187212 DEBUG nova.virt.libvirt.vif [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:16:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.615 187212 DEBUG nova.network.os_vif_util [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.616 187212 DEBUG nova.network.os_vif_util [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.616 187212 DEBUG os_vif [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.617 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.617 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.617 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.620 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5316adeb-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.621 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5316adeb-5a, col_values=(('external_ids', {'iface-id': '5316adeb-5a49-4a58-b997-f132a083ff13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:d0:34', 'vm-uuid': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.621 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.621 187212 INFO os_vif [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.644 187212 DEBUG nova.objects.instance [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:41 compute-0 kernel: tap5316adeb-5a: entered promiscuous mode
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.733 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:41 compute-0 ovn_controller[95610]: 2025-12-05T12:16:41Z|01075|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec 05 12:16:41 compute-0 NetworkManager[55691]: <info>  [1764937001.7346] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/409)
Dec 05 12:16:41 compute-0 ovn_controller[95610]: 2025-12-05T12:16:41Z|01076|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.748 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.748 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:41 compute-0 ovn_controller[95610]: 2025-12-05T12:16:41Z|01077|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec 05 12:16:41 compute-0 ovn_controller[95610]: 2025-12-05T12:16:41Z|01078|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.749 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.750 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis
Dec 05 12:16:41 compute-0 nova_compute[187208]: 2025-12-05 12:16:41.751 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.752 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:16:41 compute-0 systemd-udevd[240565]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.763 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a0af6a18-6ee4-473b-a073-bc864ab3afda]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.764 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9ed41c2-b1 in ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.766 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9ed41c2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.766 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54aa0737-f29d-4a5e-b405-7e6f26849f9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.767 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[925c261f-ed48-4559-a2a4-a1b0d250e209]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 NetworkManager[55691]: <info>  [1764937001.7757] device (tap5316adeb-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:16:41 compute-0 NetworkManager[55691]: <info>  [1764937001.7766] device (tap5316adeb-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.777 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[55f25acd-96b4-406d-9b17-374a46b1d4a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 systemd-machined[153543]: New machine qemu-124-instance-0000005c.
Dec 05 12:16:41 compute-0 systemd[1]: Started Virtual Machine qemu-124-instance-0000005c.
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.802 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d502a02b-83a8-4d0c-90d4-041537baf74c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.832 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7185cd-6437-42b5-9dc3-94ba8e92512c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.838 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78563484-eb77-4e6a-a5b1-c25ff9aa33e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 NetworkManager[55691]: <info>  [1764937001.8392] manager: (tapf9ed41c2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/410)
Dec 05 12:16:41 compute-0 systemd-udevd[240571]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.872 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[56cef919-5842-413b-a820-d8699b7d6c86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.876 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[48daf419-5be1-4a9d-a9ff-d9ce44c6e35d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 NetworkManager[55691]: <info>  [1764937001.8987] device (tapf9ed41c2-b0): carrier: link connected
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.904 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0c63d93f-3170-4fec-9e1f-2ee90a187a66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.921 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[00fae274-cefd-4409-bf14-0cd9e09b2eff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 297], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438246, 'reachable_time': 32086, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240600, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.937 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c21b901b-fc66-45d9-bba9-865f18c1b3fe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:9111'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438246, 'tstamp': 438246}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240601, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.952 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[890f599b-9a32-40fa-9a95-42ebfbbf30f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 297], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438246, 'reachable_time': 32086, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240602, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.984 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfab87f-55da-473d-b414-2fc3bfbf078c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.046 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[23991809-4173-46ca-acaf-4577f2b4e316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.048 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.048 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.048 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:42 compute-0 kernel: tapf9ed41c2-b0: entered promiscuous mode
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.050 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:42 compute-0 NetworkManager[55691]: <info>  [1764937002.0515] manager: (tapf9ed41c2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/411)
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.053 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.054 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:42 compute-0 ovn_controller[95610]: 2025-12-05T12:16:42Z|01079|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.055 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.056 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.056 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef4af92-579b-4858-a604-7959e7bb02de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.057 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:16:42 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.058 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'env', 'PROCESS_TAG=haproxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9ed41c2-b085-41ff-ac71-6256a4e30e85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.066 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.267 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.268 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937002.266829, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.268 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Started (Lifecycle Event)
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.285 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.295 187212 DEBUG nova.compute.manager [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.295 187212 DEBUG nova.objects.instance [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.299 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.316 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance running successfully.
Dec 05 12:16:42 compute-0 virtqemud[186841]: argument unsupported: QEMU guest agent is not configured
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.320 187212 DEBUG nova.virt.libvirt.guest [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.320 187212 DEBUG nova.compute.manager [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.327 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (resuming). Skip.
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.328 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937002.2768931, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.328 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Resumed (Lifecycle Event)
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.352 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.357 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:16:42 compute-0 nova_compute[187208]: 2025-12-05 12:16:42.380 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (resuming). Skip.
Dec 05 12:16:42 compute-0 podman[240641]: 2025-12-05 12:16:42.429312046 +0000 UTC m=+0.051891491 container create 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:16:42 compute-0 systemd[1]: Started libpod-conmon-0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14.scope.
Dec 05 12:16:42 compute-0 podman[240641]: 2025-12-05 12:16:42.401544654 +0000 UTC m=+0.024124139 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:16:42 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:16:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e761d76e7d192eec1d533f773fa97374b2a9afe07eddac794d23014a3065c410/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:16:42 compute-0 podman[240641]: 2025-12-05 12:16:42.547947847 +0000 UTC m=+0.170527322 container init 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 12:16:42 compute-0 podman[240641]: 2025-12-05 12:16:42.554139806 +0000 UTC m=+0.176719251 container start 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:16:42 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [NOTICE]   (240661) : New worker (240663) forked
Dec 05 12:16:42 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [NOTICE]   (240661) : Loading success.
Dec 05 12:16:43 compute-0 nova_compute[187208]: 2025-12-05 12:16:43.838 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.377 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.377 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.377 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.378 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.378 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.379 187212 INFO nova.compute.manager [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Terminating instance
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.380 187212 DEBUG nova.compute.manager [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:16:44 compute-0 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec 05 12:16:44 compute-0 NetworkManager[55691]: <info>  [1764937004.4010] device (tap5316adeb-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01080|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01081|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.409 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01082|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.421 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.422 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.424 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.424 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.426 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c28e901e-b859-4d86-a3d5-02cd795b1555]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.426 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace which is not needed anymore
Dec 05 12:16:44 compute-0 systemd[1]: machine-qemu\x2d124\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Dec 05 12:16:44 compute-0 systemd[1]: machine-qemu\x2d124\x2dinstance\x2d0000005c.scope: Consumed 2.573s CPU time.
Dec 05 12:16:44 compute-0 systemd-machined[153543]: Machine qemu-124-instance-0000005c terminated.
Dec 05 12:16:44 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [NOTICE]   (240661) : haproxy version is 2.8.14-c23fe91
Dec 05 12:16:44 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [NOTICE]   (240661) : path to executable is /usr/sbin/haproxy
Dec 05 12:16:44 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [WARNING]  (240661) : Exiting Master process...
Dec 05 12:16:44 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [ALERT]    (240661) : Current worker (240663) exited with code 143 (Terminated)
Dec 05 12:16:44 compute-0 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [WARNING]  (240661) : All workers exited. Exiting... (0)
Dec 05 12:16:44 compute-0 systemd[1]: libpod-0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14.scope: Deactivated successfully.
Dec 05 12:16:44 compute-0 NetworkManager[55691]: <info>  [1764937004.5987] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/412)
Dec 05 12:16:44 compute-0 kernel: tap5316adeb-5a: entered promiscuous mode
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01083|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01084|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.600 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec 05 12:16:44 compute-0 podman[240694]: 2025-12-05 12:16:44.605448033 +0000 UTC m=+0.082387003 container died 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.607 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01085|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01086|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01087|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=1)
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01088|if_status|INFO|Dropped 2 log messages in last 292 seconds (most recently, 292 seconds ago) due to excessive rate
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01089|if_status|INFO|Not setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down as sb is readonly
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.622 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01090|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01091|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec 05 12:16:44 compute-0 ovn_controller[95610]: 2025-12-05T12:16:44Z|01092|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.637 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.658 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.658 187212 DEBUG nova.objects.instance [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.674 187212 DEBUG nova.virt.libvirt.vif [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:16:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.674 187212 DEBUG nova.network.os_vif_util [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.675 187212 DEBUG nova.network.os_vif_util [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.675 187212 DEBUG os_vif [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.677 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.677 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5316adeb-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.679 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.684 187212 INFO os_vif [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.685 187212 INFO nova.virt.libvirt.driver [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Deleting instance files /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c_del
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.686 187212 INFO nova.virt.libvirt.driver [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Deletion of /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c_del complete
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.739 187212 INFO nova.compute.manager [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.739 187212 DEBUG oslo.service.loopingcall [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.740 187212 DEBUG nova.compute.manager [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.740 187212 DEBUG nova.network.neutron [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:16:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14-userdata-shm.mount: Deactivated successfully.
Dec 05 12:16:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-e761d76e7d192eec1d533f773fa97374b2a9afe07eddac794d23014a3065c410-merged.mount: Deactivated successfully.
Dec 05 12:16:44 compute-0 podman[240694]: 2025-12-05 12:16:44.847352757 +0000 UTC m=+0.324291717 container cleanup 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:16:44 compute-0 systemd[1]: libpod-conmon-0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14.scope: Deactivated successfully.
Dec 05 12:16:44 compute-0 podman[240735]: 2025-12-05 12:16:44.9605547 +0000 UTC m=+0.093523165 container remove 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.965 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7882fa2d-9b49-40e1-b4ac-09fdc604ceae]: (4, ('Fri Dec  5 12:16:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14)\n0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14\nFri Dec  5 12:16:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14)\n0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.967 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec64088-b19a-4f44-b4ec-b4241b629e98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.968 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 kernel: tapf9ed41c2-b0: left promiscuous mode
Dec 05 12:16:44 compute-0 nova_compute[187208]: 2025-12-05 12:16:44.984 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.987 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[987f076c-5bd1-454a-b806-339f43aab4a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.003 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d4caf7e3-dee0-4c99-87f0-8b74c1cb8a89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.005 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3561387f-7c50-48d7-8d56-9f69b393288d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.027 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9dbaa5a0-5e75-461b-b213-7f144554b378]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438239, 'reachable_time': 15412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240748, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.030 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.030 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a5633b-c884-44c2-98e7-96e3d699b034]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.030 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:16:45 compute-0 systemd[1]: run-netns-ovnmeta\x2df9ed41c2\x2db085\x2d41ff\x2dac71\x2d6256a4e30e85.mount: Deactivated successfully.
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.031 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.032 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b9cc72-535d-4bd1-96fd-fe7e68ee84e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.033 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.034 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:16:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.035 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[001e5d16-86cf-49a6-9189-0131c4483b8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:16:46 compute-0 nova_compute[187208]: 2025-12-05 12:16:46.110 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:46 compute-0 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG nova.compute.manager [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:46 compute-0 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG oslo_concurrency.lockutils [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:46 compute-0 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG oslo_concurrency.lockutils [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:46 compute-0 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG oslo_concurrency.lockutils [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:46 compute-0 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG nova.compute.manager [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:46 compute-0 nova_compute[187208]: 2025-12-05 12:16:46.264 187212 WARNING nova.compute.manager [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state deleting.
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.270 187212 DEBUG nova.network.neutron [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.303 187212 INFO nova.compute.manager [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Took 3.56 seconds to deallocate network for instance.
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.345 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.345 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.376 187212 DEBUG nova.compute.manager [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.377 187212 DEBUG oslo_concurrency.lockutils [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.377 187212 DEBUG oslo_concurrency.lockutils [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.377 187212 DEBUG oslo_concurrency.lockutils [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.378 187212 DEBUG nova.compute.manager [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.378 187212 WARNING nova.compute.manager [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state deleted and task_state None.
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.378 187212 DEBUG nova.compute.manager [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-deleted-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.407 187212 DEBUG nova.compute.provider_tree [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.422 187212 DEBUG nova.scheduler.client.report [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.444 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.469 187212 INFO nova.scheduler.client.report [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Deleted allocations for instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.618 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:48 compute-0 nova_compute[187208]: 2025-12-05 12:16:48.841 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:49 compute-0 nova_compute[187208]: 2025-12-05 12:16:49.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:16:49 compute-0 nova_compute[187208]: 2025-12-05 12:16:49.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:16:49 compute-0 nova_compute[187208]: 2025-12-05 12:16:49.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:16:49 compute-0 nova_compute[187208]: 2025-12-05 12:16:49.680 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:50 compute-0 podman[240755]: 2025-12-05 12:16:50.213205623 +0000 UTC m=+0.058020119 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:16:50 compute-0 podman[240754]: 2025-12-05 12:16:50.217476756 +0000 UTC m=+0.060628764 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:16:50 compute-0 podman[240756]: 2025-12-05 12:16:50.261350855 +0000 UTC m=+0.098477298 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 12:16:50 compute-0 nova_compute[187208]: 2025-12-05 12:16:50.635 187212 DEBUG nova.compute.manager [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:16:50 compute-0 nova_compute[187208]: 2025-12-05 12:16:50.636 187212 DEBUG oslo_concurrency.lockutils [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:50 compute-0 nova_compute[187208]: 2025-12-05 12:16:50.636 187212 DEBUG oslo_concurrency.lockutils [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:50 compute-0 nova_compute[187208]: 2025-12-05 12:16:50.636 187212 DEBUG oslo_concurrency.lockutils [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:50 compute-0 nova_compute[187208]: 2025-12-05 12:16:50.637 187212 DEBUG nova.compute.manager [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:16:50 compute-0 nova_compute[187208]: 2025-12-05 12:16:50.637 187212 WARNING nova.compute.manager [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state deleted and task_state None.
Dec 05 12:16:51 compute-0 nova_compute[187208]: 2025-12-05 12:16:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:16:51 compute-0 nova_compute[187208]: 2025-12-05 12:16:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:16:52 compute-0 nova_compute[187208]: 2025-12-05 12:16:52.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:16:53 compute-0 nova_compute[187208]: 2025-12-05 12:16:53.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:16:53 compute-0 nova_compute[187208]: 2025-12-05 12:16:53.843 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:54 compute-0 nova_compute[187208]: 2025-12-05 12:16:54.274 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:54 compute-0 nova_compute[187208]: 2025-12-05 12:16:54.682 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:55.472 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:16:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:16:55.473 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:16:55 compute-0 nova_compute[187208]: 2025-12-05 12:16:55.473 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:55 compute-0 nova_compute[187208]: 2025-12-05 12:16:55.718 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:55 compute-0 nova_compute[187208]: 2025-12-05 12:16:55.841 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.084 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.084 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.085 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.085 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.264 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.266 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5501MB free_disk=73.04064559936523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.267 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.267 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.343 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.343 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.378 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.396 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.422 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:16:56 compute-0 nova_compute[187208]: 2025-12-05 12:16:56.422 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:16:58 compute-0 nova_compute[187208]: 2025-12-05 12:16:58.845 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:16:59 compute-0 nova_compute[187208]: 2025-12-05 12:16:59.658 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937004.656347, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:16:59 compute-0 nova_compute[187208]: 2025-12-05 12:16:59.658 187212 INFO nova.compute.manager [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Stopped (Lifecycle Event)
Dec 05 12:16:59 compute-0 nova_compute[187208]: 2025-12-05 12:16:59.676 187212 DEBUG nova.compute.manager [None req-1c89c431-4828-46c7-b5ea-1a1384ec34e9 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:16:59 compute-0 nova_compute[187208]: 2025-12-05 12:16:59.684 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:01 compute-0 podman[240823]: 2025-12-05 12:17:01.197477618 +0000 UTC m=+0.055606639 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:17:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:03.021 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:03 compute-0 nova_compute[187208]: 2025-12-05 12:17:03.847 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:04.475 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:17:04 compute-0 nova_compute[187208]: 2025-12-05 12:17:04.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:06 compute-0 podman[240847]: 2025-12-05 12:17:06.203157771 +0000 UTC m=+0.060023757 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:17:08 compute-0 nova_compute[187208]: 2025-12-05 12:17:08.849 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:09 compute-0 nova_compute[187208]: 2025-12-05 12:17:09.688 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:11 compute-0 podman[240868]: 2025-12-05 12:17:11.200245065 +0000 UTC m=+0.058722309 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm)
Dec 05 12:17:11 compute-0 podman[240869]: 2025-12-05 12:17:11.217948657 +0000 UTC m=+0.067642507 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 12:17:13 compute-0 nova_compute[187208]: 2025-12-05 12:17:13.852 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:14 compute-0 nova_compute[187208]: 2025-12-05 12:17:14.690 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:18 compute-0 nova_compute[187208]: 2025-12-05 12:17:18.854 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:19 compute-0 nova_compute[187208]: 2025-12-05 12:17:19.693 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:21 compute-0 podman[240909]: 2025-12-05 12:17:21.20091074 +0000 UTC m=+0.052679758 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd)
Dec 05 12:17:21 compute-0 podman[240910]: 2025-12-05 12:17:21.209008882 +0000 UTC m=+0.055563251 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:17:21 compute-0 podman[240911]: 2025-12-05 12:17:21.242782558 +0000 UTC m=+0.086186586 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:17:23 compute-0 nova_compute[187208]: 2025-12-05 12:17:23.858 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:24 compute-0 nova_compute[187208]: 2025-12-05 12:17:24.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:28 compute-0 nova_compute[187208]: 2025-12-05 12:17:28.859 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:29 compute-0 nova_compute[187208]: 2025-12-05 12:17:29.732 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:32 compute-0 podman[240978]: 2025-12-05 12:17:32.201060147 +0000 UTC m=+0.053645026 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:17:33 compute-0 nova_compute[187208]: 2025-12-05 12:17:33.861 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.356 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.356 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.377 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.453 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.454 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.462 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.463 187212 INFO nova.compute.claims [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.599 187212 DEBUG nova.compute.provider_tree [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.616 187212 DEBUG nova.scheduler.client.report [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.642 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.643 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.734 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.913 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.914 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.939 187212 INFO nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:17:34 compute-0 nova_compute[187208]: 2025-12-05 12:17:34.964 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.107 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.109 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.110 187212 INFO nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Creating image(s)
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.112 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.113 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.114 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.129 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.195 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.196 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.197 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.213 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.275 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.276 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.312 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.313 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.313 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.370 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.372 187212 DEBUG nova.virt.disk.api [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Checking if we can resize image /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.372 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.447 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.449 187212 DEBUG nova.virt.disk.api [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Cannot resize image /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.449 187212 DEBUG nova.objects.instance [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lazy-loading 'migration_context' on Instance uuid 01eab75c-0be7-4ae5-8946-99edd40a7231 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.464 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.465 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Ensure instance console log exists: /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.465 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.466 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:35 compute-0 nova_compute[187208]: 2025-12-05 12:17:35.466 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:36 compute-0 nova_compute[187208]: 2025-12-05 12:17:36.024 187212 DEBUG nova.policy [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9db950f394294957891a245f192c5404', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7d9f0915cbe24ecfae713c84ca158d2c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:17:37 compute-0 podman[241018]: 2025-12-05 12:17:37.219567659 +0000 UTC m=+0.067613236 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 12:17:38 compute-0 nova_compute[187208]: 2025-12-05 12:17:38.117 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Successfully created port: c3bc0e34-ce29-4ea4-b0cb-f46472e25593 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:17:38 compute-0 nova_compute[187208]: 2025-12-05 12:17:38.896 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:39 compute-0 sshd-session[241017]: Connection reset by authenticating user root 91.202.233.33 port 39368 [preauth]
Dec 05 12:17:39 compute-0 nova_compute[187208]: 2025-12-05 12:17:39.737 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:40 compute-0 nova_compute[187208]: 2025-12-05 12:17:40.876 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Successfully updated port: c3bc0e34-ce29-4ea4-b0cb-f46472e25593 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:17:40 compute-0 nova_compute[187208]: 2025-12-05 12:17:40.893 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:17:40 compute-0 nova_compute[187208]: 2025-12-05 12:17:40.893 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquired lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:17:40 compute-0 nova_compute[187208]: 2025-12-05 12:17:40.894 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:17:41 compute-0 nova_compute[187208]: 2025-12-05 12:17:41.060 187212 DEBUG nova.compute.manager [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-changed-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:17:41 compute-0 nova_compute[187208]: 2025-12-05 12:17:41.061 187212 DEBUG nova.compute.manager [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Refreshing instance network info cache due to event network-changed-c3bc0e34-ce29-4ea4-b0cb-f46472e25593. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:17:41 compute-0 nova_compute[187208]: 2025-12-05 12:17:41.061 187212 DEBUG oslo_concurrency.lockutils [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:17:41 compute-0 nova_compute[187208]: 2025-12-05 12:17:41.203 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:17:41 compute-0 sshd-session[241039]: Connection reset by authenticating user root 91.202.233.33 port 39378 [preauth]
Dec 05 12:17:42 compute-0 ovn_controller[95610]: 2025-12-05T12:17:42Z|01093|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 05 12:17:42 compute-0 podman[241044]: 2025-12-05 12:17:42.195322156 +0000 UTC m=+0.048706914 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 12:17:42 compute-0 podman[241043]: 2025-12-05 12:17:42.206006302 +0000 UTC m=+0.064480306 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350)
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.378 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updating instance_info_cache with network_info: [{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.398 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Releasing lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.399 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance network_info: |[{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.399 187212 DEBUG oslo_concurrency.lockutils [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.399 187212 DEBUG nova.network.neutron [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Refreshing network info cache for port c3bc0e34-ce29-4ea4-b0cb-f46472e25593 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.402 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start _get_guest_xml network_info=[{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.408 187212 WARNING nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.413 187212 DEBUG nova.virt.libvirt.host [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.413 187212 DEBUG nova.virt.libvirt.host [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.416 187212 DEBUG nova.virt.libvirt.host [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.417 187212 DEBUG nova.virt.libvirt.host [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.417 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.417 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.418 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.418 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.418 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.418 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.419 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.419 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.419 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.419 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.420 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.420 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.424 187212 DEBUG nova.virt.libvirt.vif [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:17:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1871572941',display_name='tempest-VolumesActionsTest-instance-1871572941',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-1871572941',id=100,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d9f0915cbe24ecfae713c84ca158d2c',ramdisk_id='',reservation_id='r-wqaeiphu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1144255362',owner_user_name='tempest-VolumesActionsTest-11442553
62-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:17:35Z,user_data=None,user_id='9db950f394294957891a245f192c5404',uuid=01eab75c-0be7-4ae5-8946-99edd40a7231,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.424 187212 DEBUG nova.network.os_vif_util [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converting VIF {"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.425 187212 DEBUG nova.network.os_vif_util [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.425 187212 DEBUG nova.objects.instance [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lazy-loading 'pci_devices' on Instance uuid 01eab75c-0be7-4ae5-8946-99edd40a7231 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.441 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <uuid>01eab75c-0be7-4ae5-8946-99edd40a7231</uuid>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <name>instance-00000064</name>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <nova:name>tempest-VolumesActionsTest-instance-1871572941</nova:name>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:17:43</nova:creationTime>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:17:43 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:17:43 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:17:43 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:17:43 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:17:43 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:17:43 compute-0 nova_compute[187208]:         <nova:user uuid="9db950f394294957891a245f192c5404">tempest-VolumesActionsTest-1144255362-project-member</nova:user>
Dec 05 12:17:43 compute-0 nova_compute[187208]:         <nova:project uuid="7d9f0915cbe24ecfae713c84ca158d2c">tempest-VolumesActionsTest-1144255362</nova:project>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:17:43 compute-0 nova_compute[187208]:         <nova:port uuid="c3bc0e34-ce29-4ea4-b0cb-f46472e25593">
Dec 05 12:17:43 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <system>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <entry name="serial">01eab75c-0be7-4ae5-8946-99edd40a7231</entry>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <entry name="uuid">01eab75c-0be7-4ae5-8946-99edd40a7231</entry>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     </system>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <os>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   </os>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <features>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   </features>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.config"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:7d:f3:92"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <target dev="tapc3bc0e34-ce"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/console.log" append="off"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <video>
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     </video>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:17:43 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:17:43 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:17:43 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:17:43 compute-0 nova_compute[187208]: </domain>
Dec 05 12:17:43 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.443 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Preparing to wait for external event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.443 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.443 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.443 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.444 187212 DEBUG nova.virt.libvirt.vif [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:17:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1871572941',display_name='tempest-VolumesActionsTest-instance-1871572941',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-1871572941',id=100,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d9f0915cbe24ecfae713c84ca158d2c',ramdisk_id='',reservation_id='r-wqaeiphu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1144255362',owner_user_name='tempest-VolumesActionsTes
t-1144255362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:17:35Z,user_data=None,user_id='9db950f394294957891a245f192c5404',uuid=01eab75c-0be7-4ae5-8946-99edd40a7231,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.444 187212 DEBUG nova.network.os_vif_util [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converting VIF {"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.445 187212 DEBUG nova.network.os_vif_util [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.445 187212 DEBUG os_vif [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.446 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.447 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.450 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.450 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3bc0e34-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.451 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc3bc0e34-ce, col_values=(('external_ids', {'iface-id': 'c3bc0e34-ce29-4ea4-b0cb-f46472e25593', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:f3:92', 'vm-uuid': '01eab75c-0be7-4ae5-8946-99edd40a7231'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.452 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:43 compute-0 NetworkManager[55691]: <info>  [1764937063.4539] manager: (tapc3bc0e34-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/413)
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.455 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.458 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.459 187212 INFO os_vif [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce')
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.522 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.522 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.522 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] No VIF found with MAC fa:16:3e:7d:f3:92, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.523 187212 INFO nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Using config drive
Dec 05 12:17:43 compute-0 sshd-session[241041]: Connection reset by authenticating user root 91.202.233.33 port 34834 [preauth]
Dec 05 12:17:43 compute-0 nova_compute[187208]: 2025-12-05 12:17:43.897 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.403 187212 INFO nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Creating config drive at /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.config
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.408 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptf1cj36w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.537 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptf1cj36w" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:44 compute-0 kernel: tapc3bc0e34-ce: entered promiscuous mode
Dec 05 12:17:44 compute-0 NetworkManager[55691]: <info>  [1764937064.6135] manager: (tapc3bc0e34-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/414)
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:44 compute-0 ovn_controller[95610]: 2025-12-05T12:17:44Z|01094|binding|INFO|Claiming lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 for this chassis.
Dec 05 12:17:44 compute-0 ovn_controller[95610]: 2025-12-05T12:17:44Z|01095|binding|INFO|c3bc0e34-ce29-4ea4-b0cb-f46472e25593: Claiming fa:16:3e:7d:f3:92 10.100.0.6
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.617 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.633 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:f3:92 10.100.0.6'], port_security=['fa:16:3e:7d:f3:92 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '01eab75c-0be7-4ae5-8946-99edd40a7231', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d9f0915cbe24ecfae713c84ca158d2c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2eb3347a-3ab9-4f98-aee7-0fd84b6b9272', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c88129a4-38c8-4910-a8a3-2a71bf90c0f0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c3bc0e34-ce29-4ea4-b0cb-f46472e25593) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.634 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c3bc0e34-ce29-4ea4-b0cb-f46472e25593 in datapath d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 bound to our chassis
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.636 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2
Dec 05 12:17:44 compute-0 systemd-udevd[241103]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.649 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8e230793-be19-4988-988d-80f6ce06f28c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.650 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd34ebc66-b1 in ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:17:44 compute-0 NetworkManager[55691]: <info>  [1764937064.6523] device (tapc3bc0e34-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:17:44 compute-0 NetworkManager[55691]: <info>  [1764937064.6529] device (tapc3bc0e34-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.652 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd34ebc66-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.652 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e98cda9c-b968-4d66-adb5-c95bd458305a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.656 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[69d8231a-433d-4074-93da-75aec0a3d044]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 systemd-machined[153543]: New machine qemu-125-instance-00000064.
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.668 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[994b177c-c35b-4c0a-8780-79cdb2c0baf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:44 compute-0 ovn_controller[95610]: 2025-12-05T12:17:44Z|01096|binding|INFO|Setting lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 ovn-installed in OVS
Dec 05 12:17:44 compute-0 ovn_controller[95610]: 2025-12-05T12:17:44Z|01097|binding|INFO|Setting lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 up in Southbound
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:44 compute-0 systemd[1]: Started Virtual Machine qemu-125-instance-00000064.
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.691 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b7669bcf-3d10-4da4-be42-c5e3401b5c00]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.720 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[30e82bc6-d472-4b65-b155-361b33d90f1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 NetworkManager[55691]: <info>  [1764937064.7268] manager: (tapd34ebc66-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/415)
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.725 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb73988-df9e-404f-bc3a-4789bde9c300]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 systemd-udevd[241108]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.755 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9a1238e3-550d-4b6c-9e1b-dab5776708f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.758 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[60c76786-7534-4f4e-b908-b2a0443f2e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 NetworkManager[55691]: <info>  [1764937064.7793] device (tapd34ebc66-b0): carrier: link connected
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.785 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f153683d-9afd-4b2e-99a2-e9e610b19334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.801 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bba6bb69-d8de-4e0d-b70b-5573897eb0c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd34ebc66-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:33:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 300], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444534, 'reachable_time': 29614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241139, 'error': None, 'target': 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.824 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[663eb7f0-1993-4af7-9150-964564df9b91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe32:330f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444534, 'tstamp': 444534}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241144, 'error': None, 'target': 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.843 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[051d1bc6-a247-45fa-8508-87b40321badd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd34ebc66-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:33:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 300], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444534, 'reachable_time': 29614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241147, 'error': None, 'target': 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.888 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1870e749-0827-4e55-b8db-63c1b891a41b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.898 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937064.8977919, 01eab75c-0be7-4ae5-8946-99edd40a7231 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.899 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] VM Started (Lifecycle Event)
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.924 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.929 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937064.8980498, 01eab75c-0be7-4ae5-8946-99edd40a7231 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.929 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] VM Paused (Lifecycle Event)
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.947 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.952 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.969 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.969 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[58be1696-5cfe-45af-be38-560a9989a32e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.971 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd34ebc66-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.971 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.972 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd34ebc66-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.973 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:44 compute-0 NetworkManager[55691]: <info>  [1764937064.9741] manager: (tapd34ebc66-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Dec 05 12:17:44 compute-0 kernel: tapd34ebc66-b0: entered promiscuous mode
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.978 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd34ebc66-b0, col_values=(('external_ids', {'iface-id': '05d66759-c75e-464f-b667-95d69ddda49c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.979 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:44 compute-0 ovn_controller[95610]: 2025-12-05T12:17:44Z|01098|binding|INFO|Releasing lport 05d66759-c75e-464f-b667-95d69ddda49c from this chassis (sb_readonly=0)
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.982 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.983 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[acc57a29-9093-47ef-abaf-4a18ea6c17b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.983 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2.pid.haproxy
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:17:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.984 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'env', 'PROCESS_TAG=haproxy-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:17:44 compute-0 nova_compute[187208]: 2025-12-05 12:17:44.992 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:45 compute-0 podman[241180]: 2025-12-05 12:17:45.365233504 +0000 UTC m=+0.057248379 container create ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 05 12:17:45 compute-0 systemd[1]: Started libpod-conmon-ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344.scope.
Dec 05 12:17:45 compute-0 podman[241180]: 2025-12-05 12:17:45.33119584 +0000 UTC m=+0.023210725 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:17:45 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:17:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2037bdf28dc96dbeeb4800ea4a3a93355da1c29dfccc04e7f21f81071e995ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:17:45 compute-0 podman[241180]: 2025-12-05 12:17:45.460482849 +0000 UTC m=+0.152497744 container init ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 05 12:17:45 compute-0 podman[241180]: 2025-12-05 12:17:45.466734298 +0000 UTC m=+0.158749163 container start ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 12:17:45 compute-0 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [NOTICE]   (241199) : New worker (241201) forked
Dec 05 12:17:45 compute-0 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [NOTICE]   (241199) : Loading success.
Dec 05 12:17:45 compute-0 sshd-session[241086]: Invalid user admin from 91.202.233.33 port 34854
Dec 05 12:17:45 compute-0 sshd-session[241086]: Connection reset by invalid user admin 91.202.233.33 port 34854 [preauth]
Dec 05 12:17:47 compute-0 nova_compute[187208]: 2025-12-05 12:17:47.166 187212 DEBUG nova.network.neutron [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updated VIF entry in instance network info cache for port c3bc0e34-ce29-4ea4-b0cb-f46472e25593. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:17:47 compute-0 nova_compute[187208]: 2025-12-05 12:17:47.167 187212 DEBUG nova.network.neutron [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updating instance_info_cache with network_info: [{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:17:47 compute-0 nova_compute[187208]: 2025-12-05 12:17:47.202 187212 DEBUG oslo_concurrency.lockutils [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.270 187212 DEBUG nova.compute.manager [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.271 187212 DEBUG oslo_concurrency.lockutils [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.274 187212 DEBUG oslo_concurrency.lockutils [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.274 187212 DEBUG oslo_concurrency.lockutils [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.274 187212 DEBUG nova.compute.manager [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Processing event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.275 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.279 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937068.2794023, 01eab75c-0be7-4ae5-8946-99edd40a7231 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.280 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] VM Resumed (Lifecycle Event)
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.282 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.286 187212 INFO nova.virt.libvirt.driver [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance spawned successfully.
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.286 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.326 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.332 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.332 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.333 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.333 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.333 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.334 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.338 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.381 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.422 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.423 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:17:48 compute-0 sshd-session[241210]: Connection reset by authenticating user root 91.202.233.33 port 34886 [preauth]
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.443 187212 INFO nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Took 13.33 seconds to spawn the instance on the hypervisor.
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.444 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.568 187212 INFO nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Took 14.15 seconds to build instance.
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.580 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.581 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.686 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.693 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.789 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.790 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.816 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.817 187212 INFO nova.compute.claims [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:17:48 compute-0 nova_compute[187208]: 2025-12-05 12:17:48.900 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.038 187212 DEBUG nova.compute.provider_tree [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.056 187212 DEBUG nova.scheduler.client.report [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.086 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.090 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.114 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "35c29d67-cc5d-4530-ab3c-1a002a162fdb" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.115 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "35c29d67-cc5d-4530-ab3c-1a002a162fdb" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.125 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "35c29d67-cc5d-4530-ab3c-1a002a162fdb" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.126 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.178 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.179 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.205 187212 INFO nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.229 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.364 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.366 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.366 187212 INFO nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Creating image(s)
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.367 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.368 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.369 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.381 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.381 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.382 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.382 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 01eab75c-0be7-4ae5-8946-99edd40a7231 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.384 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.447 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.448 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.449 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.461 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.519 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.520 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.894 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk 1073741824" returned: 0 in 0.374s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.894 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.446s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.895 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.937 187212 DEBUG nova.policy [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '698ee3761ad948dca92f44ac1749fd10', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58210cf112da477fa142779ffcbe2b11', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.965 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.968 187212 DEBUG nova.virt.disk.api [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Checking if we can resize image /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:17:49 compute-0 nova_compute[187208]: 2025-12-05 12:17:49.970 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:50 compute-0 nova_compute[187208]: 2025-12-05 12:17:50.045 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:50 compute-0 nova_compute[187208]: 2025-12-05 12:17:50.047 187212 DEBUG nova.virt.disk.api [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Cannot resize image /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:17:50 compute-0 nova_compute[187208]: 2025-12-05 12:17:50.049 187212 DEBUG nova.objects.instance [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lazy-loading 'migration_context' on Instance uuid c8b0c32f-8175-42fc-834d-a65de5b28996 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:17:50 compute-0 nova_compute[187208]: 2025-12-05 12:17:50.706 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:17:50 compute-0 nova_compute[187208]: 2025-12-05 12:17:50.706 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Ensure instance console log exists: /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:17:50 compute-0 nova_compute[187208]: 2025-12-05 12:17:50.707 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:50 compute-0 nova_compute[187208]: 2025-12-05 12:17:50.707 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:50 compute-0 nova_compute[187208]: 2025-12-05 12:17:50.707 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:52 compute-0 podman[241229]: 2025-12-05 12:17:52.212865514 +0000 UTC m=+0.061083539 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:17:52 compute-0 podman[241228]: 2025-12-05 12:17:52.238783315 +0000 UTC m=+0.088056200 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 12:17:52 compute-0 podman[241230]: 2025-12-05 12:17:52.298505954 +0000 UTC m=+0.110790030 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Dec 05 12:17:53 compute-0 nova_compute[187208]: 2025-12-05 12:17:53.336 187212 DEBUG nova.compute.manager [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:17:53 compute-0 nova_compute[187208]: 2025-12-05 12:17:53.337 187212 DEBUG oslo_concurrency.lockutils [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:53 compute-0 nova_compute[187208]: 2025-12-05 12:17:53.337 187212 DEBUG oslo_concurrency.lockutils [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:53 compute-0 nova_compute[187208]: 2025-12-05 12:17:53.338 187212 DEBUG oslo_concurrency.lockutils [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:53 compute-0 nova_compute[187208]: 2025-12-05 12:17:53.338 187212 DEBUG nova.compute.manager [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] No waiting events found dispatching network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:17:53 compute-0 nova_compute[187208]: 2025-12-05 12:17:53.338 187212 WARNING nova.compute.manager [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received unexpected event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 for instance with vm_state active and task_state None.
Dec 05 12:17:53 compute-0 nova_compute[187208]: 2025-12-05 12:17:53.497 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:53 compute-0 nova_compute[187208]: 2025-12-05 12:17:53.902 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.054 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updating instance_info_cache with network_info: [{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.067 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Successfully created port: e59d2789-96ad-4740-8d45-d90c6b6f60ca _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.090 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.091 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.091 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.092 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.092 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.092 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:17:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:55.550 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:17:55 compute-0 nova_compute[187208]: 2025-12-05 12:17:55.551 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:55.552 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.124 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.125 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.125 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.125 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.212 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.236 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Successfully updated port: e59d2789-96ad-4740-8d45-d90c6b6f60ca _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.254 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.254 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquired lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.254 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.275 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.275 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.341 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.473 187212 DEBUG nova.compute.manager [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-changed-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.474 187212 DEBUG nova.compute.manager [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Refreshing instance network info cache due to event network-changed-e59d2789-96ad-4740-8d45-d90c6b6f60ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.474 187212 DEBUG oslo_concurrency.lockutils [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.490 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.491 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5471MB free_disk=73.03964614868164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.492 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:17:56 compute-0 nova_compute[187208]: 2025-12-05 12:17:56.492 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.501 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:17:58.553 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.731 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 01eab75c-0be7-4ae5-8946-99edd40a7231 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.732 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance c8b0c32f-8175-42fc-834d-a65de5b28996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.732 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.732 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.753 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.868 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.895 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.947 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:17:58 compute-0 nova_compute[187208]: 2025-12-05 12:17:58.947 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:17:59 compute-0 nova_compute[187208]: 2025-12-05 12:17:59.942 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:17:59 compute-0 nova_compute[187208]: 2025-12-05 12:17:59.943 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.736 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Updating instance_info_cache with network_info: [{"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.907 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Releasing lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.907 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance network_info: |[{"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.908 187212 DEBUG oslo_concurrency.lockutils [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.908 187212 DEBUG nova.network.neutron [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Refreshing network info cache for port e59d2789-96ad-4740-8d45-d90c6b6f60ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.911 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start _get_guest_xml network_info=[{"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.915 187212 WARNING nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.919 187212 DEBUG nova.virt.libvirt.host [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.919 187212 DEBUG nova.virt.libvirt.host [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.924 187212 DEBUG nova.virt.libvirt.host [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.925 187212 DEBUG nova.virt.libvirt.host [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.926 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.926 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.926 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.927 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.927 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.927 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.928 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.928 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.928 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.929 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.929 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.929 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.934 187212 DEBUG nova.virt.libvirt.vif [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-113501820',display_name='tempest-ServerGroupTestJSON-server-113501820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-113501820',id=101,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58210cf112da477fa142779ffcbe2b11',ramdisk_id='',reservation_id='r-xwaofgm0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-305842052',owner_user_name='tempest-ServerGroupTestJSON-305842052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:17:49Z,user_data=None,user_id='698ee3761ad948dca92f44ac1749fd10',uuid=c8b0c32f-8175-42fc-834d-a65de5b28996,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.934 187212 DEBUG nova.network.os_vif_util [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converting VIF {"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.935 187212 DEBUG nova.network.os_vif_util [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.936 187212 DEBUG nova.objects.instance [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lazy-loading 'pci_devices' on Instance uuid c8b0c32f-8175-42fc-834d-a65de5b28996 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.952 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <uuid>c8b0c32f-8175-42fc-834d-a65de5b28996</uuid>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <name>instance-00000065</name>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerGroupTestJSON-server-113501820</nova:name>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:18:00</nova:creationTime>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:18:00 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:18:00 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:18:00 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:18:00 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:18:00 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:18:00 compute-0 nova_compute[187208]:         <nova:user uuid="698ee3761ad948dca92f44ac1749fd10">tempest-ServerGroupTestJSON-305842052-project-member</nova:user>
Dec 05 12:18:00 compute-0 nova_compute[187208]:         <nova:project uuid="58210cf112da477fa142779ffcbe2b11">tempest-ServerGroupTestJSON-305842052</nova:project>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:18:00 compute-0 nova_compute[187208]:         <nova:port uuid="e59d2789-96ad-4740-8d45-d90c6b6f60ca">
Dec 05 12:18:00 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <system>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <entry name="serial">c8b0c32f-8175-42fc-834d-a65de5b28996</entry>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <entry name="uuid">c8b0c32f-8175-42fc-834d-a65de5b28996</entry>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     </system>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <os>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   </os>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <features>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   </features>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.config"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:ab:ce:a4"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <target dev="tape59d2789-96"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/console.log" append="off"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <video>
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     </video>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:18:00 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:18:00 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:18:00 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:18:00 compute-0 nova_compute[187208]: </domain>
Dec 05 12:18:00 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.953 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Preparing to wait for external event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.953 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.954 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.954 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.955 187212 DEBUG nova.virt.libvirt.vif [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-113501820',display_name='tempest-ServerGroupTestJSON-server-113501820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-113501820',id=101,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58210cf112da477fa142779ffcbe2b11',ramdisk_id='',reservation_id='r-xwaofgm0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-305842052',owner_user_name='tempest-ServerGroupTestJSON-305
842052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:17:49Z,user_data=None,user_id='698ee3761ad948dca92f44ac1749fd10',uuid=c8b0c32f-8175-42fc-834d-a65de5b28996,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.955 187212 DEBUG nova.network.os_vif_util [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converting VIF {"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.956 187212 DEBUG nova.network.os_vif_util [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.956 187212 DEBUG os_vif [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.956 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.957 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.957 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.960 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.961 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape59d2789-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.961 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape59d2789-96, col_values=(('external_ids', {'iface-id': 'e59d2789-96ad-4740-8d45-d90c6b6f60ca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:ce:a4', 'vm-uuid': 'c8b0c32f-8175-42fc-834d-a65de5b28996'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:00 compute-0 NetworkManager[55691]: <info>  [1764937080.9635] manager: (tape59d2789-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/417)
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.965 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.970 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:00 compute-0 nova_compute[187208]: 2025-12-05 12:18:00.971 187212 INFO os_vif [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96')
Dec 05 12:18:01 compute-0 nova_compute[187208]: 2025-12-05 12:18:01.039 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:18:01 compute-0 nova_compute[187208]: 2025-12-05 12:18:01.040 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:18:01 compute-0 nova_compute[187208]: 2025-12-05 12:18:01.041 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] No VIF found with MAC fa:16:3e:ab:ce:a4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:18:01 compute-0 nova_compute[187208]: 2025-12-05 12:18:01.041 187212 INFO nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Using config drive
Dec 05 12:18:01 compute-0 ovn_controller[95610]: 2025-12-05T12:18:01Z|00123|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:f3:92 10.100.0.6
Dec 05 12:18:01 compute-0 ovn_controller[95610]: 2025-12-05T12:18:01Z|00124|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:f3:92 10.100.0.6
Dec 05 12:18:02 compute-0 nova_compute[187208]: 2025-12-05 12:18:02.107 187212 INFO nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Creating config drive at /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.config
Dec 05 12:18:02 compute-0 nova_compute[187208]: 2025-12-05 12:18:02.112 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptgdzbqqj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:02 compute-0 nova_compute[187208]: 2025-12-05 12:18:02.247 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptgdzbqqj" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:02 compute-0 kernel: tape59d2789-96: entered promiscuous mode
Dec 05 12:18:02 compute-0 NetworkManager[55691]: <info>  [1764937082.3238] manager: (tape59d2789-96): new Tun device (/org/freedesktop/NetworkManager/Devices/418)
Dec 05 12:18:02 compute-0 ovn_controller[95610]: 2025-12-05T12:18:02Z|01099|binding|INFO|Claiming lport e59d2789-96ad-4740-8d45-d90c6b6f60ca for this chassis.
Dec 05 12:18:02 compute-0 ovn_controller[95610]: 2025-12-05T12:18:02Z|01100|binding|INFO|e59d2789-96ad-4740-8d45-d90c6b6f60ca: Claiming fa:16:3e:ab:ce:a4 10.100.0.9
Dec 05 12:18:02 compute-0 nova_compute[187208]: 2025-12-05 12:18:02.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.358 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:ce:a4 10.100.0.9'], port_security=['fa:16:3e:ab:ce:a4 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c8b0c32f-8175-42fc-834d-a65de5b28996', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58210cf112da477fa142779ffcbe2b11', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a0c658e5-9568-4f6f-9218-9e1f4aa6f42f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51692341-ed5e-46b2-ae59-906d4f1865f2, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e59d2789-96ad-4740-8d45-d90c6b6f60ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.359 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e59d2789-96ad-4740-8d45-d90c6b6f60ca in datapath cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 bound to our chassis
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.376 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.390 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c73a7a-bd24-40ab-80c7-bfe3ad5bf51d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.391 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcf3ac8ba-01 in ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.394 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcf3ac8ba-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.394 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4a033502-84da-4a2f-a3d0-1cc962b1ff93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.395 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[520e1e23-07ac-4757-9881-283602144d0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 systemd-udevd[241349]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:18:02 compute-0 systemd-machined[153543]: New machine qemu-126-instance-00000065.
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.409 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[792ebb2d-c552-4f7e-835b-2915fefc25c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 NetworkManager[55691]: <info>  [1764937082.4164] device (tape59d2789-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:18:02 compute-0 nova_compute[187208]: 2025-12-05 12:18:02.415 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:02 compute-0 NetworkManager[55691]: <info>  [1764937082.4176] device (tape59d2789-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:18:02 compute-0 ovn_controller[95610]: 2025-12-05T12:18:02Z|01101|binding|INFO|Setting lport e59d2789-96ad-4740-8d45-d90c6b6f60ca ovn-installed in OVS
Dec 05 12:18:02 compute-0 ovn_controller[95610]: 2025-12-05T12:18:02Z|01102|binding|INFO|Setting lport e59d2789-96ad-4740-8d45-d90c6b6f60ca up in Southbound
Dec 05 12:18:02 compute-0 nova_compute[187208]: 2025-12-05 12:18:02.421 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:02 compute-0 systemd[1]: Started Virtual Machine qemu-126-instance-00000065.
Dec 05 12:18:02 compute-0 podman[241330]: 2025-12-05 12:18:02.437282238 +0000 UTC m=+0.116073012 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.441 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[98142ce8-2da3-42cd-9de2-cdf608e9d863]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.474 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[abe76a51-cd7c-43db-a65a-f69e9df7ed97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.480 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[709e8302-2561-47a5-b9bc-90eb6b40ab4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 NetworkManager[55691]: <info>  [1764937082.4827] manager: (tapcf3ac8ba-00): new Veth device (/org/freedesktop/NetworkManager/Devices/419)
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.513 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2975f109-ab01-4dff-9fba-21526504ee8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.516 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cee9a66e-081a-4ef2-82ab-4a44750c4d52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 NetworkManager[55691]: <info>  [1764937082.5392] device (tapcf3ac8ba-00): carrier: link connected
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.544 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cb928103-19c8-49e4-aaef-2fa81e4c69bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.561 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[da8d8282-5cda-48b8-a027-20dbaa5bcb70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcf3ac8ba-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:2b:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 302], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446310, 'reachable_time': 28332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241393, 'error': None, 'target': 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.579 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2818cb8-4833-4919-869d-ccbd0a81e18f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe64:2b73'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446310, 'tstamp': 446310}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241394, 'error': None, 'target': 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.598 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[61090776-580f-4ac1-917d-6626f739b115]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcf3ac8ba-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:2b:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 302], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446310, 'reachable_time': 28332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241395, 'error': None, 'target': 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.631 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8944fd0f-00db-467b-81c5-e1b0116b12b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.682 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e49b7e0d-a721-4164-9f2b-25240100ff58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.683 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf3ac8ba-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.684 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.684 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcf3ac8ba-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:02 compute-0 NetworkManager[55691]: <info>  [1764937082.6873] manager: (tapcf3ac8ba-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/420)
Dec 05 12:18:02 compute-0 nova_compute[187208]: 2025-12-05 12:18:02.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:02 compute-0 kernel: tapcf3ac8ba-00: entered promiscuous mode
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.690 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcf3ac8ba-00, col_values=(('external_ids', {'iface-id': '23b92e92-fbf6-41c6-bdbf-c1326bdf4966'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:02 compute-0 nova_compute[187208]: 2025-12-05 12:18:02.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:02 compute-0 ovn_controller[95610]: 2025-12-05T12:18:02Z|01103|binding|INFO|Releasing lport 23b92e92-fbf6-41c6-bdbf-c1326bdf4966 from this chassis (sb_readonly=0)
Dec 05 12:18:02 compute-0 nova_compute[187208]: 2025-12-05 12:18:02.715 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.716 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.717 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[317b6a02-7863-467f-a2a9-064f5287305b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.718 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713.pid.haproxy
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:18:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.718 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'env', 'PROCESS_TAG=haproxy-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:18:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.023 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.023 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:03 compute-0 podman[241427]: 2025-12-05 12:18:03.115495071 +0000 UTC m=+0.108899637 container create 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:18:03 compute-0 podman[241427]: 2025-12-05 12:18:03.031492888 +0000 UTC m=+0.024897474 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:18:03 compute-0 systemd[1]: Started libpod-conmon-53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee.scope.
Dec 05 12:18:03 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:18:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ffdba3fb53dbf39c5a535cbfc3cbd3cb85b10a5a7c9e9d67175042707b1883c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.201 187212 DEBUG nova.compute.manager [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.202 187212 DEBUG oslo_concurrency.lockutils [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.203 187212 DEBUG oslo_concurrency.lockutils [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.203 187212 DEBUG oslo_concurrency.lockutils [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.203 187212 DEBUG nova.compute.manager [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Processing event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:18:03 compute-0 podman[241427]: 2025-12-05 12:18:03.315294427 +0000 UTC m=+0.308699033 container init 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:18:03 compute-0 podman[241427]: 2025-12-05 12:18:03.327584308 +0000 UTC m=+0.320988894 container start 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.328 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937083.3279169, c8b0c32f-8175-42fc-834d-a65de5b28996 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.329 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] VM Started (Lifecycle Event)
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.333 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.338 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.344 187212 INFO nova.virt.libvirt.driver [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance spawned successfully.
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.345 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.354 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.358 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:18:03 compute-0 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [NOTICE]   (241454) : New worker (241456) forked
Dec 05 12:18:03 compute-0 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [NOTICE]   (241454) : Loading success.
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.372 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.373 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.374 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.374 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.375 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.375 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.385 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.386 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937083.3282623, c8b0c32f-8175-42fc-834d-a65de5b28996 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.386 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] VM Paused (Lifecycle Event)
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.424 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.428 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937083.3368895, c8b0c32f-8175-42fc-834d-a65de5b28996 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.429 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] VM Resumed (Lifecycle Event)
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.459 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.463 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.469 187212 INFO nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Took 14.10 seconds to spawn the instance on the hypervisor.
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.469 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.499 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.550 187212 INFO nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Took 14.79 seconds to build instance.
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.568 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.987s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.674 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.676 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.676 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.676 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.677 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.678 187212 INFO nova.compute.manager [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Terminating instance
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.679 187212 DEBUG nova.compute.manager [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:18:03 compute-0 kernel: tapc3bc0e34-ce (unregistering): left promiscuous mode
Dec 05 12:18:03 compute-0 NetworkManager[55691]: <info>  [1764937083.6991] device (tapc3bc0e34-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:18:03 compute-0 ovn_controller[95610]: 2025-12-05T12:18:03Z|01104|binding|INFO|Releasing lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 from this chassis (sb_readonly=0)
Dec 05 12:18:03 compute-0 ovn_controller[95610]: 2025-12-05T12:18:03Z|01105|binding|INFO|Setting lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 down in Southbound
Dec 05 12:18:03 compute-0 ovn_controller[95610]: 2025-12-05T12:18:03Z|01106|binding|INFO|Removing iface tapc3bc0e34-ce ovn-installed in OVS
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.705 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.709 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.713 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:f3:92 10.100.0.6'], port_security=['fa:16:3e:7d:f3:92 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '01eab75c-0be7-4ae5-8946-99edd40a7231', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d9f0915cbe24ecfae713c84ca158d2c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2eb3347a-3ab9-4f98-aee7-0fd84b6b9272', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c88129a4-38c8-4910-a8a3-2a71bf90c0f0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c3bc0e34-ce29-4ea4-b0cb-f46472e25593) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:18:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.714 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c3bc0e34-ce29-4ea4-b0cb-f46472e25593 in datapath d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 unbound from our chassis
Dec 05 12:18:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.716 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:18:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.718 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[de8f16d3-b36c-4de3-908d-234000a2ed6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.719 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 namespace which is not needed anymore
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.724 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:03 compute-0 systemd[1]: machine-qemu\x2d125\x2dinstance\x2d00000064.scope: Deactivated successfully.
Dec 05 12:18:03 compute-0 systemd[1]: machine-qemu\x2d125\x2dinstance\x2d00000064.scope: Consumed 12.163s CPU time.
Dec 05 12:18:03 compute-0 systemd-machined[153543]: Machine qemu-125-instance-00000064 terminated.
Dec 05 12:18:03 compute-0 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [NOTICE]   (241199) : haproxy version is 2.8.14-c23fe91
Dec 05 12:18:03 compute-0 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [NOTICE]   (241199) : path to executable is /usr/sbin/haproxy
Dec 05 12:18:03 compute-0 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [WARNING]  (241199) : Exiting Master process...
Dec 05 12:18:03 compute-0 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [ALERT]    (241199) : Current worker (241201) exited with code 143 (Terminated)
Dec 05 12:18:03 compute-0 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [WARNING]  (241199) : All workers exited. Exiting... (0)
Dec 05 12:18:03 compute-0 systemd[1]: libpod-ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344.scope: Deactivated successfully.
Dec 05 12:18:03 compute-0 podman[241487]: 2025-12-05 12:18:03.878239452 +0000 UTC m=+0.059598216 container died ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:18:03 compute-0 NetworkManager[55691]: <info>  [1764937083.9033] manager: (tapc3bc0e34-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/421)
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.911 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.938 187212 INFO nova.virt.libvirt.driver [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance destroyed successfully.
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.939 187212 DEBUG nova.objects.instance [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lazy-loading 'resources' on Instance uuid 01eab75c-0be7-4ae5-8946-99edd40a7231 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.952 187212 DEBUG nova.virt.libvirt.vif [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:17:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1871572941',display_name='tempest-VolumesActionsTest-instance-1871572941',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-1871572941',id=100,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:17:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7d9f0915cbe24ecfae713c84ca158d2c',ramdisk_id='',reservation_id='r-wqaeiphu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-VolumesActionsTest-1144255362',owner_user_name='tempest-VolumesActionsTest-1144255362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:17:48Z,user_data=None,user_id='9db950f394294957891a245f192c5404',uuid=01eab75c-0be7-4ae5-8946-99edd40a7231,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.952 187212 DEBUG nova.network.os_vif_util [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converting VIF {"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.953 187212 DEBUG nova.network.os_vif_util [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.954 187212 DEBUG os_vif [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.958 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3bc0e34-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.959 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.961 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.963 187212 INFO os_vif [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce')
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.964 187212 INFO nova.virt.libvirt.driver [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Deleting instance files /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231_del
Dec 05 12:18:03 compute-0 nova_compute[187208]: 2025-12-05 12:18:03.965 187212 INFO nova.virt.libvirt.driver [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Deletion of /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231_del complete
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.029 187212 INFO nova.compute.manager [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.030 187212 DEBUG oslo.service.loopingcall [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.031 187212 DEBUG nova.compute.manager [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.031 187212 DEBUG nova.network.neutron [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:18:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344-userdata-shm.mount: Deactivated successfully.
Dec 05 12:18:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-e2037bdf28dc96dbeeb4800ea4a3a93355da1c29dfccc04e7f21f81071e995ca-merged.mount: Deactivated successfully.
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.215 187212 DEBUG nova.compute.manager [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-unplugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.216 187212 DEBUG oslo_concurrency.lockutils [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.217 187212 DEBUG oslo_concurrency.lockutils [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.217 187212 DEBUG oslo_concurrency.lockutils [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.218 187212 DEBUG nova.compute.manager [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] No waiting events found dispatching network-vif-unplugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.218 187212 DEBUG nova.compute.manager [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-unplugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:18:04 compute-0 podman[241487]: 2025-12-05 12:18:04.263000728 +0000 UTC m=+0.444359492 container cleanup ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 12:18:04 compute-0 systemd[1]: libpod-conmon-ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344.scope: Deactivated successfully.
Dec 05 12:18:04 compute-0 podman[241532]: 2025-12-05 12:18:04.333509496 +0000 UTC m=+0.045015879 container remove ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 12:18:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.339 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d1793643-7ed6-45d0-8351-7a6186603f21]: (4, ('Fri Dec  5 12:18:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 (ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344)\nca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344\nFri Dec  5 12:18:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 (ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344)\nca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.340 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4d595cde-a9cb-4652-9e8a-896a7bdc68e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.341 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd34ebc66-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.343 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:04 compute-0 kernel: tapd34ebc66-b0: left promiscuous mode
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.348 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7ecef12c-36c7-4903-8321-046f0c5182b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.359 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.372 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af550bba-89ab-46f4-92d6-75e7a2d5ab47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.375 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d45799b2-7a70-4e6c-8da2-d5b5dc856993]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.392 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35cc81f2-4ab3-457d-bb49-791de23154ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444527, 'reachable_time': 29683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241548, 'error': None, 'target': 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.395 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:18:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.395 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b4411405-b1a5-4ff2-92fb-71793e9e44cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:04 compute-0 systemd[1]: run-netns-ovnmeta\x2dd34ebc66\x2db7e3\x2d4d6f\x2db6cd\x2db40947e8fed2.mount: Deactivated successfully.
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.638 187212 DEBUG nova.network.neutron [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Updated VIF entry in instance network info cache for port e59d2789-96ad-4740-8d45-d90c6b6f60ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.638 187212 DEBUG nova.network.neutron [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Updating instance_info_cache with network_info: [{"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:04 compute-0 nova_compute[187208]: 2025-12-05 12:18:04.655 187212 DEBUG oslo_concurrency.lockutils [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.327 187212 DEBUG nova.compute.manager [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.328 187212 DEBUG oslo_concurrency.lockutils [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.329 187212 DEBUG oslo_concurrency.lockutils [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.330 187212 DEBUG oslo_concurrency.lockutils [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.331 187212 DEBUG nova.compute.manager [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] No waiting events found dispatching network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.331 187212 WARNING nova.compute.manager [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received unexpected event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca for instance with vm_state active and task_state None.
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.363 187212 DEBUG nova.network.neutron [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.396 187212 INFO nova.compute.manager [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Took 1.37 seconds to deallocate network for instance.
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.459 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.460 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.551 187212 DEBUG nova.compute.provider_tree [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.575 187212 DEBUG nova.scheduler.client.report [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.607 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.636 187212 INFO nova.scheduler.client.report [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Deleted allocations for instance 01eab75c-0be7-4ae5-8946-99edd40a7231
Dec 05 12:18:05 compute-0 nova_compute[187208]: 2025-12-05 12:18:05.753 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.674 187212 DEBUG nova.compute.manager [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.675 187212 DEBUG oslo_concurrency.lockutils [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.676 187212 DEBUG oslo_concurrency.lockutils [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.676 187212 DEBUG oslo_concurrency.lockutils [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.677 187212 DEBUG nova.compute.manager [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] No waiting events found dispatching network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.677 187212 WARNING nova.compute.manager [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received unexpected event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 for instance with vm_state deleted and task_state None.
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.678 187212 DEBUG nova.compute.manager [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-deleted-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.714 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.714 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.715 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.715 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.716 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.717 187212 INFO nova.compute.manager [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Terminating instance
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.718 187212 DEBUG nova.compute.manager [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:18:06 compute-0 kernel: tape59d2789-96 (unregistering): left promiscuous mode
Dec 05 12:18:06 compute-0 NetworkManager[55691]: <info>  [1764937086.7358] device (tape59d2789-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.741 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:06 compute-0 ovn_controller[95610]: 2025-12-05T12:18:06Z|01107|binding|INFO|Releasing lport e59d2789-96ad-4740-8d45-d90c6b6f60ca from this chassis (sb_readonly=0)
Dec 05 12:18:06 compute-0 ovn_controller[95610]: 2025-12-05T12:18:06Z|01108|binding|INFO|Setting lport e59d2789-96ad-4740-8d45-d90c6b6f60ca down in Southbound
Dec 05 12:18:06 compute-0 ovn_controller[95610]: 2025-12-05T12:18:06Z|01109|binding|INFO|Removing iface tape59d2789-96 ovn-installed in OVS
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.743 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:06 compute-0 nova_compute[187208]: 2025-12-05 12:18:06.755 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.752 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:ce:a4 10.100.0.9'], port_security=['fa:16:3e:ab:ce:a4 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c8b0c32f-8175-42fc-834d-a65de5b28996', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58210cf112da477fa142779ffcbe2b11', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a0c658e5-9568-4f6f-9218-9e1f4aa6f42f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51692341-ed5e-46b2-ae59-906d4f1865f2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e59d2789-96ad-4740-8d45-d90c6b6f60ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:18:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.754 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e59d2789-96ad-4740-8d45-d90c6b6f60ca in datapath cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 unbound from our chassis
Dec 05 12:18:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.756 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:18:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.757 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[94c510c5-31c9-4718-81e9-bb34471523ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.758 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 namespace which is not needed anymore
Dec 05 12:18:06 compute-0 systemd[1]: machine-qemu\x2d126\x2dinstance\x2d00000065.scope: Deactivated successfully.
Dec 05 12:18:06 compute-0 systemd[1]: machine-qemu\x2d126\x2dinstance\x2d00000065.scope: Consumed 4.272s CPU time.
Dec 05 12:18:06 compute-0 systemd-machined[153543]: Machine qemu-126-instance-00000065 terminated.
Dec 05 12:18:06 compute-0 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [NOTICE]   (241454) : haproxy version is 2.8.14-c23fe91
Dec 05 12:18:06 compute-0 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [NOTICE]   (241454) : path to executable is /usr/sbin/haproxy
Dec 05 12:18:06 compute-0 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [WARNING]  (241454) : Exiting Master process...
Dec 05 12:18:06 compute-0 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [ALERT]    (241454) : Current worker (241456) exited with code 143 (Terminated)
Dec 05 12:18:06 compute-0 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [WARNING]  (241454) : All workers exited. Exiting... (0)
Dec 05 12:18:06 compute-0 systemd[1]: libpod-53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee.scope: Deactivated successfully.
Dec 05 12:18:06 compute-0 podman[241574]: 2025-12-05 12:18:06.884550287 +0000 UTC m=+0.043644309 container died 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:18:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee-userdata-shm.mount: Deactivated successfully.
Dec 05 12:18:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-9ffdba3fb53dbf39c5a535cbfc3cbd3cb85b10a5a7c9e9d67175042707b1883c-merged.mount: Deactivated successfully.
Dec 05 12:18:06 compute-0 podman[241574]: 2025-12-05 12:18:06.92834066 +0000 UTC m=+0.087434702 container cleanup 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:18:06 compute-0 systemd[1]: libpod-conmon-53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee.scope: Deactivated successfully.
Dec 05 12:18:06 compute-0 podman[241608]: 2025-12-05 12:18:06.994173874 +0000 UTC m=+0.042833837 container remove 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:18:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.998 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d903fd4d-f9fc-48c2-b3e4-109cee1f5b3a]: (4, ('Fri Dec  5 12:18:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 (53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee)\n53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee\nFri Dec  5 12:18:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 (53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee)\n53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.001 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b2216e9d-0ef8-4140-91cd-2fe9a7c5039b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.002 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf3ac8ba-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:07 compute-0 kernel: tapcf3ac8ba-00: left promiscuous mode
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.009 187212 INFO nova.virt.libvirt.driver [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance destroyed successfully.
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.009 187212 DEBUG nova.objects.instance [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lazy-loading 'resources' on Instance uuid c8b0c32f-8175-42fc-834d-a65de5b28996 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.022 187212 DEBUG nova.virt.libvirt.vif [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-113501820',display_name='tempest-ServerGroupTestJSON-server-113501820',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-113501820',id=101,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:18:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58210cf112da477fa142779ffcbe2b11',ramdisk_id='',reservation_id='r-xwaofgm0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-305842052',owner_user_name='tempest-ServerGroupTestJSON-305842052-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:18:03Z,user_data=None,user_id='698ee3761ad948dca92f44ac1749fd10',uuid=c8b0c32f-8175-42fc-834d-a65de5b28996,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.023 187212 DEBUG nova.network.os_vif_util [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converting VIF {"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.023 187212 DEBUG nova.network.os_vif_util [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.024 187212 DEBUG os_vif [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:18:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.024 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d4bfdc58-c502-43d3-9407-1dc4d92ab399]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.025 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape59d2789-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.028 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.034 187212 INFO os_vif [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96')
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.034 187212 INFO nova.virt.libvirt.driver [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Deleting instance files /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996_del
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.035 187212 INFO nova.virt.libvirt.driver [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Deletion of /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996_del complete
Dec 05 12:18:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.044 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1eeb59a5-79b7-41e2-a191-f69aad1d1aa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.045 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5aeb10-6d03-4124-b705-543b63630992]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.060 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e6607f9a-588e-4444-b272-532f14b64b60]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446303, 'reachable_time': 34447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241640, 'error': None, 'target': 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:07 compute-0 systemd[1]: run-netns-ovnmeta\x2dcf3ac8ba\x2d0b95\x2d4da0\x2d8a42\x2da4e8cffa2713.mount: Deactivated successfully.
Dec 05 12:18:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.063 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:18:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.063 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[86d8e60f-3477-41b4-80fa-6e596930d964]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.092 187212 INFO nova.compute.manager [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Took 0.37 seconds to destroy the instance on the hypervisor.
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.093 187212 DEBUG oslo.service.loopingcall [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.093 187212 DEBUG nova.compute.manager [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:18:07 compute-0 nova_compute[187208]: 2025-12-05 12:18:07.093 187212 DEBUG nova.network.neutron [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:18:08 compute-0 podman[241641]: 2025-12-05 12:18:08.206359741 +0000 UTC m=+0.055597781 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.450 187212 DEBUG nova.compute.manager [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-unplugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.451 187212 DEBUG oslo_concurrency.lockutils [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.451 187212 DEBUG oslo_concurrency.lockutils [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.452 187212 DEBUG oslo_concurrency.lockutils [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.452 187212 DEBUG nova.compute.manager [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] No waiting events found dispatching network-vif-unplugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.452 187212 DEBUG nova.compute.manager [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-unplugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.799 187212 DEBUG nova.network.neutron [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.821 187212 INFO nova.compute.manager [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Took 1.73 seconds to deallocate network for instance.
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.873 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.874 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.935 187212 DEBUG nova.compute.provider_tree [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.941 187212 DEBUG nova.compute.manager [req-e36d7745-5665-42da-bcc6-3e3ad9a5db06 req-84ff2602-f808-4994-b63c-db5662fcbd69 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-deleted-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.950 187212 DEBUG nova.scheduler.client.report [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:18:08 compute-0 nova_compute[187208]: 2025-12-05 12:18:08.977 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:09 compute-0 nova_compute[187208]: 2025-12-05 12:18:09.004 187212 INFO nova.scheduler.client.report [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Deleted allocations for instance c8b0c32f-8175-42fc-834d-a65de5b28996
Dec 05 12:18:09 compute-0 nova_compute[187208]: 2025-12-05 12:18:09.082 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:10 compute-0 nova_compute[187208]: 2025-12-05 12:18:10.834 187212 DEBUG nova.compute.manager [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:10 compute-0 nova_compute[187208]: 2025-12-05 12:18:10.835 187212 DEBUG oslo_concurrency.lockutils [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:10 compute-0 nova_compute[187208]: 2025-12-05 12:18:10.835 187212 DEBUG oslo_concurrency.lockutils [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:10 compute-0 nova_compute[187208]: 2025-12-05 12:18:10.836 187212 DEBUG oslo_concurrency.lockutils [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:10 compute-0 nova_compute[187208]: 2025-12-05 12:18:10.836 187212 DEBUG nova.compute.manager [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] No waiting events found dispatching network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:18:10 compute-0 nova_compute[187208]: 2025-12-05 12:18:10.836 187212 WARNING nova.compute.manager [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received unexpected event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca for instance with vm_state deleted and task_state None.
Dec 05 12:18:12 compute-0 nova_compute[187208]: 2025-12-05 12:18:12.027 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:13 compute-0 podman[241662]: 2025-12-05 12:18:13.209099052 +0000 UTC m=+0.054168921 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 05 12:18:13 compute-0 podman[241661]: 2025-12-05 12:18:13.209150973 +0000 UTC m=+0.057337471 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 12:18:13 compute-0 nova_compute[187208]: 2025-12-05 12:18:13.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.272 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.495 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "ef9ada46-64bf-4990-954a-cc70a354b443" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.496 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.511 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.596 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.596 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.604 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.605 187212 INFO nova.compute.claims [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.758 187212 DEBUG nova.compute.provider_tree [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.779 187212 DEBUG nova.scheduler.client.report [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.798 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.799 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.839 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.839 187212 DEBUG nova.network.neutron [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.861 187212 INFO nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.882 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.978 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.979 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.980 187212 INFO nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Creating image(s)
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.980 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.981 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.982 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:16 compute-0 nova_compute[187208]: 2025-12-05 12:18:16.998 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.029 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.063 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.064 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.065 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.081 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.144 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.145 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.196 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.197 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.198 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.254 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.255 187212 DEBUG nova.virt.disk.api [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Checking if we can resize image /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.256 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.319 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.321 187212 DEBUG nova.virt.disk.api [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Cannot resize image /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.321 187212 DEBUG nova.objects.instance [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lazy-loading 'migration_context' on Instance uuid ef9ada46-64bf-4990-954a-cc70a354b443 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.336 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.337 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Ensure instance console log exists: /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.337 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.337 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:17 compute-0 nova_compute[187208]: 2025-12-05 12:18:17.338 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.184 187212 DEBUG nova.network.neutron [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.185 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.187 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.193 187212 WARNING nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.198 187212 DEBUG nova.virt.libvirt.host [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.199 187212 DEBUG nova.virt.libvirt.host [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.203 187212 DEBUG nova.virt.libvirt.host [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.203 187212 DEBUG nova.virt.libvirt.host [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.204 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.204 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.205 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.205 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.205 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.205 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.206 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.206 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.206 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.206 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.207 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.207 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.211 187212 DEBUG nova.objects.instance [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lazy-loading 'pci_devices' on Instance uuid ef9ada46-64bf-4990-954a-cc70a354b443 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.224 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <uuid>ef9ada46-64bf-4990-954a-cc70a354b443</uuid>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <name>instance-00000066</name>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <nova:name>tempest-VolumesNegativeTest-instance-577523945</nova:name>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:18:18</nova:creationTime>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:18:18 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:18:18 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:18:18 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:18:18 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:18:18 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:18:18 compute-0 nova_compute[187208]:         <nova:user uuid="50394847033f4123a02f592a98d13f9e">tempest-VolumesNegativeTest-893074152-project-member</nova:user>
Dec 05 12:18:18 compute-0 nova_compute[187208]:         <nova:project uuid="23d25e1d365b4bca9d6a6e954185bd66">tempest-VolumesNegativeTest-893074152</nova:project>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <system>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <entry name="serial">ef9ada46-64bf-4990-954a-cc70a354b443</entry>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <entry name="uuid">ef9ada46-64bf-4990-954a-cc70a354b443</entry>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     </system>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <os>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   </os>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <features>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   </features>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.config"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/console.log" append="off"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <video>
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     </video>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:18:18 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:18:18 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:18:18 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:18:18 compute-0 nova_compute[187208]: </domain>
Dec 05 12:18:18 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.320 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.322 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.323 187212 INFO nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Using config drive
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.607 187212 INFO nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Creating config drive at /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.config
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.611 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoatpvx3o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.742 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoatpvx3o" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:18 compute-0 systemd-machined[153543]: New machine qemu-127-instance-00000066.
Dec 05 12:18:18 compute-0 systemd[1]: Started Virtual Machine qemu-127-instance-00000066.
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.937 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937083.9363487, 01eab75c-0be7-4ae5-8946-99edd40a7231 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.938 187212 INFO nova.compute.manager [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] VM Stopped (Lifecycle Event)
Dec 05 12:18:18 compute-0 nova_compute[187208]: 2025-12-05 12:18:18.973 187212 DEBUG nova.compute.manager [None req-3486363c-0e07-4f65-9c04-2dab29836616 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.300 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937099.299712, ef9ada46-64bf-4990-954a-cc70a354b443 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.300 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] VM Resumed (Lifecycle Event)
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.302 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.302 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.308 187212 INFO nova.virt.libvirt.driver [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance spawned successfully.
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.308 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.332 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.338 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.342 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.342 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.343 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.344 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.344 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.345 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.378 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.378 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937099.3005576, ef9ada46-64bf-4990-954a-cc70a354b443 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.379 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] VM Started (Lifecycle Event)
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.410 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.413 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.424 187212 INFO nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Took 2.45 seconds to spawn the instance on the hypervisor.
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.425 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.437 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.489 187212 INFO nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Took 2.93 seconds to build instance.
Dec 05 12:18:19 compute-0 nova_compute[187208]: 2025-12-05 12:18:19.508 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.142 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "ef9ada46-64bf-4990-954a-cc70a354b443" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.143 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.143 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "ef9ada46-64bf-4990-954a-cc70a354b443-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.144 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.144 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.145 187212 INFO nova.compute.manager [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Terminating instance
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.146 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "refresh_cache-ef9ada46-64bf-4990-954a-cc70a354b443" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.146 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquired lock "refresh_cache-ef9ada46-64bf-4990-954a-cc70a354b443" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.146 187212 DEBUG nova.network.neutron [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:18:21 compute-0 nova_compute[187208]: 2025-12-05 12:18:21.433 187212 DEBUG nova.network.neutron [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.004 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937087.0036385, c8b0c32f-8175-42fc-834d-a65de5b28996 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.005 187212 INFO nova.compute.manager [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] VM Stopped (Lifecycle Event)
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.031 187212 DEBUG nova.compute.manager [None req-852a9fe4-7c91-4a2b-bb15-6351b8d2d880 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.109 187212 DEBUG nova.network.neutron [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.133 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Releasing lock "refresh_cache-ef9ada46-64bf-4990-954a-cc70a354b443" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.134 187212 DEBUG nova.compute.manager [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:18:22 compute-0 systemd[1]: machine-qemu\x2d127\x2dinstance\x2d00000066.scope: Deactivated successfully.
Dec 05 12:18:22 compute-0 systemd[1]: machine-qemu\x2d127\x2dinstance\x2d00000066.scope: Consumed 3.301s CPU time.
Dec 05 12:18:22 compute-0 systemd-machined[153543]: Machine qemu-127-instance-00000066 terminated.
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.380 187212 INFO nova.virt.libvirt.driver [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance destroyed successfully.
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.381 187212 DEBUG nova.objects.instance [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lazy-loading 'resources' on Instance uuid ef9ada46-64bf-4990-954a-cc70a354b443 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.396 187212 INFO nova.virt.libvirt.driver [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Deleting instance files /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443_del
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.397 187212 INFO nova.virt.libvirt.driver [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Deletion of /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443_del complete
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.459 187212 INFO nova.compute.manager [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Took 0.32 seconds to destroy the instance on the hypervisor.
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.460 187212 DEBUG oslo.service.loopingcall [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.461 187212 DEBUG nova.compute.manager [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.461 187212 DEBUG nova.network.neutron [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.789 187212 DEBUG nova.network.neutron [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.803 187212 DEBUG nova.network.neutron [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.830 187212 INFO nova.compute.manager [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Took 0.37 seconds to deallocate network for instance.
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.876 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.877 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.936 187212 DEBUG nova.compute.provider_tree [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.956 187212 DEBUG nova.scheduler.client.report [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:18:22 compute-0 nova_compute[187208]: 2025-12-05 12:18:22.999 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:23 compute-0 nova_compute[187208]: 2025-12-05 12:18:23.030 187212 INFO nova.scheduler.client.report [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Deleted allocations for instance ef9ada46-64bf-4990-954a-cc70a354b443
Dec 05 12:18:23 compute-0 nova_compute[187208]: 2025-12-05 12:18:23.122 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:23 compute-0 podman[241754]: 2025-12-05 12:18:23.217063534 +0000 UTC m=+0.059762151 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:18:23 compute-0 podman[241753]: 2025-12-05 12:18:23.2249657 +0000 UTC m=+0.069486709 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:18:23 compute-0 podman[241755]: 2025-12-05 12:18:23.266933211 +0000 UTC m=+0.102977207 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:18:23 compute-0 nova_compute[187208]: 2025-12-05 12:18:23.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:27 compute-0 nova_compute[187208]: 2025-12-05 12:18:27.152 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:28 compute-0 nova_compute[187208]: 2025-12-05 12:18:28.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:32 compute-0 nova_compute[187208]: 2025-12-05 12:18:32.153 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:33 compute-0 podman[241823]: 2025-12-05 12:18:33.203673754 +0000 UTC m=+0.052236725 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:18:33 compute-0 nova_compute[187208]: 2025-12-05 12:18:33.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:18:37 compute-0 nova_compute[187208]: 2025-12-05 12:18:37.155 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:37 compute-0 nova_compute[187208]: 2025-12-05 12:18:37.379 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937102.3787365, ef9ada46-64bf-4990-954a-cc70a354b443 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:37 compute-0 nova_compute[187208]: 2025-12-05 12:18:37.380 187212 INFO nova.compute.manager [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] VM Stopped (Lifecycle Event)
Dec 05 12:18:37 compute-0 nova_compute[187208]: 2025-12-05 12:18:37.400 187212 DEBUG nova.compute.manager [None req-3709be06-a359-4428-b633-cfb2b733b2c8 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:37 compute-0 nova_compute[187208]: 2025-12-05 12:18:37.995 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:37 compute-0 nova_compute[187208]: 2025-12-05 12:18:37.996 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.026 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.121 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.121 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.128 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.129 187212 INFO nova.compute.claims [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.235 187212 DEBUG nova.compute.provider_tree [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.257 187212 DEBUG nova.scheduler.client.report [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.301 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.302 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.362 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.363 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.384 187212 INFO nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.401 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.535 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.536 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.536 187212 INFO nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Creating image(s)
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.537 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.537 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.538 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.552 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.613 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.614 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.615 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.626 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.691 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.693 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.737 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.738 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.738 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.799 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.800 187212 DEBUG nova.virt.disk.api [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Checking if we can resize image /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.800 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.856 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.857 187212 DEBUG nova.virt.disk.api [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Cannot resize image /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.857 187212 DEBUG nova.objects.instance [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lazy-loading 'migration_context' on Instance uuid b95b2427-7c9a-4d8d-bcfd-645393721cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.872 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.872 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Ensure instance console log exists: /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.873 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.873 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.874 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:38 compute-0 nova_compute[187208]: 2025-12-05 12:18:38.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:39 compute-0 podman[241862]: 2025-12-05 12:18:39.221933138 +0000 UTC m=+0.065293179 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 05 12:18:39 compute-0 nova_compute[187208]: 2025-12-05 12:18:39.564 187212 DEBUG nova.policy [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7907c905947f4a2290c1cb23fc23e453', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '526f66a0e3ca44b097d8ce7f4a763497', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:18:40 compute-0 nova_compute[187208]: 2025-12-05 12:18:40.736 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Successfully created port: a63fc129-9d70-44d9-a73a-cfa00b3264aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:18:41 compute-0 nova_compute[187208]: 2025-12-05 12:18:41.985 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Successfully updated port: a63fc129-9d70-44d9-a73a-cfa00b3264aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:18:42 compute-0 nova_compute[187208]: 2025-12-05 12:18:42.018 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:18:42 compute-0 nova_compute[187208]: 2025-12-05 12:18:42.018 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquired lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:18:42 compute-0 nova_compute[187208]: 2025-12-05 12:18:42.019 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:18:42 compute-0 nova_compute[187208]: 2025-12-05 12:18:42.156 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:42 compute-0 nova_compute[187208]: 2025-12-05 12:18:42.269 187212 DEBUG nova.compute.manager [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received event network-changed-a63fc129-9d70-44d9-a73a-cfa00b3264aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:42 compute-0 nova_compute[187208]: 2025-12-05 12:18:42.270 187212 DEBUG nova.compute.manager [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Refreshing instance network info cache due to event network-changed-a63fc129-9d70-44d9-a73a-cfa00b3264aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:18:42 compute-0 nova_compute[187208]: 2025-12-05 12:18:42.270 187212 DEBUG oslo_concurrency.lockutils [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:18:42 compute-0 nova_compute[187208]: 2025-12-05 12:18:42.408 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:18:43 compute-0 nova_compute[187208]: 2025-12-05 12:18:43.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:44 compute-0 podman[241883]: 2025-12-05 12:18:44.003277024 +0000 UTC m=+0.052640807 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 05 12:18:44 compute-0 podman[241882]: 2025-12-05 12:18:44.003283344 +0000 UTC m=+0.058394671 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.026 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updating instance_info_cache with network_info: [{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.047 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Releasing lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.048 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance network_info: |[{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.048 187212 DEBUG oslo_concurrency.lockutils [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.048 187212 DEBUG nova.network.neutron [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Refreshing network info cache for port a63fc129-9d70-44d9-a73a-cfa00b3264aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.051 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start _get_guest_xml network_info=[{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.056 187212 WARNING nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.060 187212 DEBUG nova.virt.libvirt.host [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.060 187212 DEBUG nova.virt.libvirt.host [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.063 187212 DEBUG nova.virt.libvirt.host [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.063 187212 DEBUG nova.virt.libvirt.host [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.064 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.064 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.066 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.066 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.066 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.066 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.070 187212 DEBUG nova.virt.libvirt.vif [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:18:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-257891300',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-257891300',id=103,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='526f66a0e3ca44b097d8ce7f4a763497',ramdisk_id='',reservation_id='r-glcipxwz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-210151535',owner_user_name='tempest-ServerTagsTestJSON-210151535-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:18:38Z,user_data=None,user_id='7907c905947f4a2290c1cb23fc23e453',uuid=b95b2427-7c9a-4d8d-bcfd-645393721cb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.070 187212 DEBUG nova.network.os_vif_util [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converting VIF {"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.071 187212 DEBUG nova.network.os_vif_util [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.072 187212 DEBUG nova.objects.instance [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lazy-loading 'pci_devices' on Instance uuid b95b2427-7c9a-4d8d-bcfd-645393721cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.086 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <uuid>b95b2427-7c9a-4d8d-bcfd-645393721cb5</uuid>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <name>instance-00000067</name>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerTagsTestJSON-server-257891300</nova:name>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:18:44</nova:creationTime>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:18:44 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:18:44 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:18:44 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:18:44 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:18:44 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:18:44 compute-0 nova_compute[187208]:         <nova:user uuid="7907c905947f4a2290c1cb23fc23e453">tempest-ServerTagsTestJSON-210151535-project-member</nova:user>
Dec 05 12:18:44 compute-0 nova_compute[187208]:         <nova:project uuid="526f66a0e3ca44b097d8ce7f4a763497">tempest-ServerTagsTestJSON-210151535</nova:project>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:18:44 compute-0 nova_compute[187208]:         <nova:port uuid="a63fc129-9d70-44d9-a73a-cfa00b3264aa">
Dec 05 12:18:44 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <system>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <entry name="serial">b95b2427-7c9a-4d8d-bcfd-645393721cb5</entry>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <entry name="uuid">b95b2427-7c9a-4d8d-bcfd-645393721cb5</entry>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     </system>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <os>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   </os>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <features>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   </features>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.config"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:b1:b9:03"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <target dev="tapa63fc129-9d"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/console.log" append="off"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <video>
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     </video>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:18:44 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:18:44 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:18:44 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:18:44 compute-0 nova_compute[187208]: </domain>
Dec 05 12:18:44 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.087 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Preparing to wait for external event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.087 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.088 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.088 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.088 187212 DEBUG nova.virt.libvirt.vif [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:18:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-257891300',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-257891300',id=103,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='526f66a0e3ca44b097d8ce7f4a763497',ramdisk_id='',reservation_id='r-glcipxwz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-210151535',owner_user_name='tempest-ServerTagsTestJSON-210151535-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:18:38Z,user_data=None,user_id='7907c905947f4a2290c1cb23fc23e453',uuid=b95b2427-7c9a-4d8d-bcfd-645393721cb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.089 187212 DEBUG nova.network.os_vif_util [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converting VIF {"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.089 187212 DEBUG nova.network.os_vif_util [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.090 187212 DEBUG os_vif [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.090 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.090 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.091 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.093 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.093 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa63fc129-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.094 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa63fc129-9d, col_values=(('external_ids', {'iface-id': 'a63fc129-9d70-44d9-a73a-cfa00b3264aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:b9:03', 'vm-uuid': 'b95b2427-7c9a-4d8d-bcfd-645393721cb5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.095 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:44 compute-0 NetworkManager[55691]: <info>  [1764937124.0963] manager: (tapa63fc129-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/422)
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.097 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.106 187212 INFO os_vif [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d')
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.163 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.164 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.164 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] No VIF found with MAC fa:16:3e:b1:b9:03, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.164 187212 INFO nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Using config drive
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.753 187212 INFO nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Creating config drive at /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.config
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.758 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_j2j4bh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.884 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_j2j4bh" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:18:44 compute-0 kernel: tapa63fc129-9d: entered promiscuous mode
Dec 05 12:18:44 compute-0 ovn_controller[95610]: 2025-12-05T12:18:44Z|01110|binding|INFO|Claiming lport a63fc129-9d70-44d9-a73a-cfa00b3264aa for this chassis.
Dec 05 12:18:44 compute-0 ovn_controller[95610]: 2025-12-05T12:18:44Z|01111|binding|INFO|a63fc129-9d70-44d9-a73a-cfa00b3264aa: Claiming fa:16:3e:b1:b9:03 10.100.0.5
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:44 compute-0 NetworkManager[55691]: <info>  [1764937124.9728] manager: (tapa63fc129-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/423)
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.974 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:44 compute-0 nova_compute[187208]: 2025-12-05 12:18:44.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:44.989 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:b9:03 10.100.0.5'], port_security=['fa:16:3e:b1:b9:03 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b95b2427-7c9a-4d8d-bcfd-645393721cb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-331f2e14-0579-41c0-b551-4fc605c604b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '526f66a0e3ca44b097d8ce7f4a763497', 'neutron:revision_number': '2', 'neutron:security_group_ids': '597979d2-7558-443c-8a4c-18c518fb0d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8c94da1-7797-46e4-848a-0d3e3ffcc75d, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a63fc129-9d70-44d9-a73a-cfa00b3264aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:18:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:44.990 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a63fc129-9d70-44d9-a73a-cfa00b3264aa in datapath 331f2e14-0579-41c0-b551-4fc605c604b5 bound to our chassis
Dec 05 12:18:44 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:44.992 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 331f2e14-0579-41c0-b551-4fc605c604b5
Dec 05 12:18:45 compute-0 systemd-udevd[241942]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.004 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab0e64c-fe91-4978-90c7-c0fbd508f825]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.005 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap331f2e14-01 in ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.007 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap331f2e14-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.007 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a8cc0bac-4e78-41b7-863f-6f9bbc0fa82d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.008 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4d3afe-4669-44fe-8bdf-e963735c5512]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 NetworkManager[55691]: <info>  [1764937125.0150] device (tapa63fc129-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:18:45 compute-0 NetworkManager[55691]: <info>  [1764937125.0157] device (tapa63fc129-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.021 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[02324057-ec33-4cd2-8554-e874ac64d0f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 systemd-machined[153543]: New machine qemu-128-instance-00000067.
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.035 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:45 compute-0 ovn_controller[95610]: 2025-12-05T12:18:45Z|01112|binding|INFO|Setting lport a63fc129-9d70-44d9-a73a-cfa00b3264aa ovn-installed in OVS
Dec 05 12:18:45 compute-0 ovn_controller[95610]: 2025-12-05T12:18:45Z|01113|binding|INFO|Setting lport a63fc129-9d70-44d9-a73a-cfa00b3264aa up in Southbound
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.036 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fac599a7-b428-4183-9873-fe87d47c8c0c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.039 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:45 compute-0 systemd[1]: Started Virtual Machine qemu-128-instance-00000067.
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b798a01c-0850-4aa2-85b7-903cf3a35c8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 systemd-udevd[241946]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:18:45 compute-0 NetworkManager[55691]: <info>  [1764937125.0740] manager: (tap331f2e14-00): new Veth device (/org/freedesktop/NetworkManager/Devices/424)
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.073 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ff1c301a-ede3-4385-82c4-55df9e0dddb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.105 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b98f9753-54ca-49d0-967c-d08779673172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.109 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0b29f9-da8e-4531-a1ed-c1642a311eac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 NetworkManager[55691]: <info>  [1764937125.1382] device (tap331f2e14-00): carrier: link connected
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.145 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a91cf2-db59-4237-998f-1393c9519edc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.164 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ab39db21-834e-4b3a-ac10-c6fbd52d841b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap331f2e14-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:62:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 306], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450570, 'reachable_time': 19663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241975, 'error': None, 'target': 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.184 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[972c96bb-3090-4ef0-a313-431cf156a045]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:624a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450570, 'tstamp': 450570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241981, 'error': None, 'target': 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.205 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5aad5bb7-2b84-4cf0-ba5d-157415ca4850]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap331f2e14-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:62:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 306], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450570, 'reachable_time': 19663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241983, 'error': None, 'target': 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.240 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec44292-dfa0-45a2-a7cc-5f92fd37ac77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.257 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937125.2569408, b95b2427-7c9a-4d8d-bcfd-645393721cb5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.258 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] VM Started (Lifecycle Event)
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.279 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.283 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937125.2570643, b95b2427-7c9a-4d8d-bcfd-645393721cb5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.283 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] VM Paused (Lifecycle Event)
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.300 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b166de56-450c-41b2-b64b-2cc52c384a31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.301 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap331f2e14-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.302 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.302 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap331f2e14-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:45 compute-0 kernel: tap331f2e14-00: entered promiscuous mode
Dec 05 12:18:45 compute-0 NetworkManager[55691]: <info>  [1764937125.3049] manager: (tap331f2e14-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/425)
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.306 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.306 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap331f2e14-00, col_values=(('external_ids', {'iface-id': 'e7daf13b-4028-44c8-88f6-67c36a959e89'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.307 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:45 compute-0 ovn_controller[95610]: 2025-12-05T12:18:45Z|01114|binding|INFO|Releasing lport e7daf13b-4028-44c8-88f6-67c36a959e89 from this chassis (sb_readonly=0)
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.309 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.309 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/331f2e14-0579-41c0-b551-4fc605c604b5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/331f2e14-0579-41c0-b551-4fc605c604b5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.310 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[99827112-0c68-4105-9849-7f2b5978ee32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.311 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-331f2e14-0579-41c0-b551-4fc605c604b5
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/331f2e14-0579-41c0-b551-4fc605c604b5.pid.haproxy
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 331f2e14-0579-41c0-b551-4fc605c604b5
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.312 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:18:45 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.312 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'env', 'PROCESS_TAG=haproxy-331f2e14-0579-41c0-b551-4fc605c604b5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/331f2e14-0579-41c0-b551-4fc605c604b5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.336 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:18:45 compute-0 podman[242016]: 2025-12-05 12:18:45.699566443 +0000 UTC m=+0.054885531 container create 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 05 12:18:45 compute-0 systemd[1]: Started libpod-conmon-194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5.scope.
Dec 05 12:18:45 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:18:45 compute-0 podman[242016]: 2025-12-05 12:18:45.671120689 +0000 UTC m=+0.026439807 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:18:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78c6e2eb3a9f74cb7fddf3d8d585a32734ee1d05235538c68add3f2e68661886/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.826 187212 DEBUG nova.network.neutron [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updated VIF entry in instance network info cache for port a63fc129-9d70-44d9-a73a-cfa00b3264aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.827 187212 DEBUG nova.network.neutron [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updating instance_info_cache with network_info: [{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:45 compute-0 podman[242016]: 2025-12-05 12:18:45.844399867 +0000 UTC m=+0.199718985 container init 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:18:45 compute-0 nova_compute[187208]: 2025-12-05 12:18:45.844 187212 DEBUG oslo_concurrency.lockutils [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:18:45 compute-0 podman[242016]: 2025-12-05 12:18:45.849827732 +0000 UTC m=+0.205146820 container start 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:18:45 compute-0 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [NOTICE]   (242035) : New worker (242037) forked
Dec 05 12:18:45 compute-0 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [NOTICE]   (242035) : Loading success.
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.987 187212 DEBUG nova.compute.manager [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.987 187212 DEBUG oslo_concurrency.lockutils [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.988 187212 DEBUG oslo_concurrency.lockutils [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.988 187212 DEBUG oslo_concurrency.lockutils [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.988 187212 DEBUG nova.compute.manager [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Processing event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.989 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.992 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937127.9922357, b95b2427-7c9a-4d8d-bcfd-645393721cb5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.992 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] VM Resumed (Lifecycle Event)
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.994 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.996 187212 INFO nova.virt.libvirt.driver [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance spawned successfully.
Dec 05 12:18:47 compute-0 nova_compute[187208]: 2025-12-05 12:18:47.996 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.017 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.022 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.023 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.023 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.024 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.024 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.025 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.030 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.069 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.094 187212 INFO nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Took 9.56 seconds to spawn the instance on the hypervisor.
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.094 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.161 187212 INFO nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Took 10.07 seconds to build instance.
Dec 05 12:18:48 compute-0 nova_compute[187208]: 2025-12-05 12:18:48.176 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:49 compute-0 nova_compute[187208]: 2025-12-05 12:18:49.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:49 compute-0 nova_compute[187208]: 2025-12-05 12:18:49.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:18:49 compute-0 nova_compute[187208]: 2025-12-05 12:18:49.493 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.080 187212 DEBUG nova.compute.manager [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.080 187212 DEBUG oslo_concurrency.lockutils [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.080 187212 DEBUG oslo_concurrency.lockutils [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.080 187212 DEBUG oslo_concurrency.lockutils [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.081 187212 DEBUG nova.compute.manager [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] No waiting events found dispatching network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.081 187212 WARNING nova.compute.manager [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received unexpected event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa for instance with vm_state active and task_state None.
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.611 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.612 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.612 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 12:18:50 compute-0 nova_compute[187208]: 2025-12-05 12:18:50.612 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b95b2427-7c9a-4d8d-bcfd-645393721cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.844 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.845 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.846 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.846 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.846 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.847 187212 INFO nova.compute.manager [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Terminating instance
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.848 187212 DEBUG nova.compute.manager [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:18:52 compute-0 kernel: tapa63fc129-9d (unregistering): left promiscuous mode
Dec 05 12:18:52 compute-0 NetworkManager[55691]: <info>  [1764937132.8883] device (tapa63fc129-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:18:52 compute-0 ovn_controller[95610]: 2025-12-05T12:18:52Z|01115|binding|INFO|Releasing lport a63fc129-9d70-44d9-a73a-cfa00b3264aa from this chassis (sb_readonly=0)
Dec 05 12:18:52 compute-0 ovn_controller[95610]: 2025-12-05T12:18:52Z|01116|binding|INFO|Setting lport a63fc129-9d70-44d9-a73a-cfa00b3264aa down in Southbound
Dec 05 12:18:52 compute-0 ovn_controller[95610]: 2025-12-05T12:18:52Z|01117|binding|INFO|Removing iface tapa63fc129-9d ovn-installed in OVS
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.897 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.901 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.910 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:b9:03 10.100.0.5'], port_security=['fa:16:3e:b1:b9:03 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b95b2427-7c9a-4d8d-bcfd-645393721cb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-331f2e14-0579-41c0-b551-4fc605c604b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '526f66a0e3ca44b097d8ce7f4a763497', 'neutron:revision_number': '4', 'neutron:security_group_ids': '597979d2-7558-443c-8a4c-18c518fb0d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8c94da1-7797-46e4-848a-0d3e3ffcc75d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a63fc129-9d70-44d9-a73a-cfa00b3264aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:18:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.911 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a63fc129-9d70-44d9-a73a-cfa00b3264aa in datapath 331f2e14-0579-41c0-b551-4fc605c604b5 unbound from our chassis
Dec 05 12:18:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.912 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 331f2e14-0579-41c0-b551-4fc605c604b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:18:52 compute-0 nova_compute[187208]: 2025-12-05 12:18:52.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.914 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[859331b6-08c9-4317-8b65-e8ba02c2a8d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:52 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.915 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 namespace which is not needed anymore
Dec 05 12:18:52 compute-0 systemd[1]: machine-qemu\x2d128\x2dinstance\x2d00000067.scope: Deactivated successfully.
Dec 05 12:18:52 compute-0 systemd[1]: machine-qemu\x2d128\x2dinstance\x2d00000067.scope: Consumed 4.043s CPU time.
Dec 05 12:18:52 compute-0 systemd-machined[153543]: Machine qemu-128-instance-00000067 terminated.
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.137 187212 INFO nova.virt.libvirt.driver [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance destroyed successfully.
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.138 187212 DEBUG nova.objects.instance [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lazy-loading 'resources' on Instance uuid b95b2427-7c9a-4d8d-bcfd-645393721cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.154 187212 DEBUG nova.virt.libvirt.vif [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:18:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-257891300',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-257891300',id=103,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:18:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='526f66a0e3ca44b097d8ce7f4a763497',ramdisk_id='',reservation_id='r-glcipxwz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerTagsTestJSON-210151535',owner_user_name='tempest-ServerTagsTestJSON-210151535-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:18:48Z,user_data=None,user_id='7907c905947f4a2290c1cb23fc23e453',uuid=b95b2427-7c9a-4d8d-bcfd-645393721cb5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.154 187212 DEBUG nova.network.os_vif_util [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converting VIF {"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.155 187212 DEBUG nova.network.os_vif_util [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.155 187212 DEBUG os_vif [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.157 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.157 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa63fc129-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.163 187212 INFO os_vif [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d')
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.164 187212 INFO nova.virt.libvirt.driver [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Deleting instance files /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5_del
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.165 187212 INFO nova.virt.libvirt.driver [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Deletion of /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5_del complete
Dec 05 12:18:53 compute-0 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [NOTICE]   (242035) : haproxy version is 2.8.14-c23fe91
Dec 05 12:18:53 compute-0 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [NOTICE]   (242035) : path to executable is /usr/sbin/haproxy
Dec 05 12:18:53 compute-0 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [WARNING]  (242035) : Exiting Master process...
Dec 05 12:18:53 compute-0 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [WARNING]  (242035) : Exiting Master process...
Dec 05 12:18:53 compute-0 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [ALERT]    (242035) : Current worker (242037) exited with code 143 (Terminated)
Dec 05 12:18:53 compute-0 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [WARNING]  (242035) : All workers exited. Exiting... (0)
Dec 05 12:18:53 compute-0 systemd[1]: libpod-194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5.scope: Deactivated successfully.
Dec 05 12:18:53 compute-0 podman[242068]: 2025-12-05 12:18:53.195958833 +0000 UTC m=+0.172315751 container died 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:18:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5-userdata-shm.mount: Deactivated successfully.
Dec 05 12:18:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-78c6e2eb3a9f74cb7fddf3d8d585a32734ee1d05235538c68add3f2e68661886-merged.mount: Deactivated successfully.
Dec 05 12:18:53 compute-0 podman[242068]: 2025-12-05 12:18:53.34195999 +0000 UTC m=+0.318316898 container cleanup 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:18:53 compute-0 systemd[1]: libpod-conmon-194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5.scope: Deactivated successfully.
Dec 05 12:18:53 compute-0 podman[242114]: 2025-12-05 12:18:53.376752695 +0000 UTC m=+0.087835744 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.395 187212 INFO nova.compute.manager [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Took 0.55 seconds to destroy the instance on the hypervisor.
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.395 187212 DEBUG oslo.service.loopingcall [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.396 187212 DEBUG nova.compute.manager [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.396 187212 DEBUG nova.network.neutron [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:18:53 compute-0 podman[242149]: 2025-12-05 12:18:53.407418193 +0000 UTC m=+0.045130823 container remove 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 05 12:18:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.439 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[34dabdb1-05a4-4f5c-8efd-af88c45b5acb]: (4, ('Fri Dec  5 12:18:53 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 (194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5)\n194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5\nFri Dec  5 12:18:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 (194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5)\n194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:53 compute-0 podman[242115]: 2025-12-05 12:18:53.44053701 +0000 UTC m=+0.151166416 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:18:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.442 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[415252b6-18bd-44c4-86a4-cde2dea481a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.443 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap331f2e14-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.445 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:53 compute-0 kernel: tap331f2e14-00: left promiscuous mode
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.458 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.460 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[504d67e5-d4f0-476d-bd4c-cf5aad41ed7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.475 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b31df3b6-40d1-4524-aae1-995c17f5c6c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.477 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eecead2d-5e11-434f-8c2d-dcd3e1b90e63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:53 compute-0 podman[242116]: 2025-12-05 12:18:53.490358045 +0000 UTC m=+0.196583985 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:18:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.496 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0997a0d8-32cd-4a7c-a98c-5bfd1483bcde]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450562, 'reachable_time': 35861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242194, 'error': None, 'target': 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d331f2e14\x2d0579\x2d41c0\x2db551\x2d4fc605c604b5.mount: Deactivated successfully.
Dec 05 12:18:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.500 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:18:53 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.500 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[271b5dba-f5db-473d-b022-efa6a0bcd72e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.734 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updating instance_info_cache with network_info: [{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.776 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 12:18:53 compute-0 nova_compute[187208]: 2025-12-05 12:18:53.794 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:54 compute-0 nova_compute[187208]: 2025-12-05 12:18:54.095 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:54 compute-0 nova_compute[187208]: 2025-12-05 12:18:54.502 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:54 compute-0 nova_compute[187208]: 2025-12-05 12:18:54.601 187212 DEBUG nova.network.neutron [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:18:54 compute-0 nova_compute[187208]: 2025-12-05 12:18:54.623 187212 INFO nova.compute.manager [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Took 1.23 seconds to deallocate network for instance.
Dec 05 12:18:54 compute-0 nova_compute[187208]: 2025-12-05 12:18:54.663 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:54 compute-0 nova_compute[187208]: 2025-12-05 12:18:54.663 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:54 compute-0 nova_compute[187208]: 2025-12-05 12:18:54.830 187212 DEBUG nova.compute.manager [req-2e25c503-3b93-4b22-80ff-87e784509782 req-5de10c4e-f966-49bc-910b-1057f412a41e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received event network-vif-deleted-a63fc129-9d70-44d9-a73a-cfa00b3264aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:18:55 compute-0 nova_compute[187208]: 2025-12-05 12:18:55.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:55 compute-0 nova_compute[187208]: 2025-12-05 12:18:55.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:55 compute-0 nova_compute[187208]: 2025-12-05 12:18:55.094 187212 DEBUG nova.compute.provider_tree [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:18:55 compute-0 nova_compute[187208]: 2025-12-05 12:18:55.112 187212 DEBUG nova.scheduler.client.report [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:18:55 compute-0 nova_compute[187208]: 2025-12-05 12:18:55.143 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:55 compute-0 nova_compute[187208]: 2025-12-05 12:18:55.322 187212 INFO nova.scheduler.client.report [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Deleted allocations for instance b95b2427-7c9a-4d8d-bcfd-645393721cb5
Dec 05 12:18:55 compute-0 nova_compute[187208]: 2025-12-05 12:18:55.402 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:55.956 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:18:55 compute-0 nova_compute[187208]: 2025-12-05 12:18:55.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:55 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:18:55.958 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.086 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.088 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.263 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.264 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5587MB free_disk=73.04059600830078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.264 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.264 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.316 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.317 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.338 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.350 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.384 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:18:56 compute-0 nova_compute[187208]: 2025-12-05 12:18:56.384 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:18:58 compute-0 nova_compute[187208]: 2025-12-05 12:18:58.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:18:59 compute-0 nova_compute[187208]: 2025-12-05 12:18:59.380 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:18:59 compute-0 nova_compute[187208]: 2025-12-05 12:18:59.504 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:00 compute-0 nova_compute[187208]: 2025-12-05 12:19:00.107 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:03.023 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:03.024 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:03 compute-0 nova_compute[187208]: 2025-12-05 12:19:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:19:03 compute-0 nova_compute[187208]: 2025-12-05 12:19:03.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 12:19:03 compute-0 nova_compute[187208]: 2025-12-05 12:19:03.077 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 12:19:03 compute-0 nova_compute[187208]: 2025-12-05 12:19:03.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:04 compute-0 podman[242196]: 2025-12-05 12:19:04.219746636 +0000 UTC m=+0.057544547 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:19:04 compute-0 nova_compute[187208]: 2025-12-05 12:19:04.507 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:05 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:05.960 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.134 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937133.1328614, b95b2427-7c9a-4d8d-bcfd-645393721cb5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.135 187212 INFO nova.compute.manager [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] VM Stopped (Lifecycle Event)
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.163 187212 DEBUG nova.compute.manager [None req-b53d5033-0024-418b-86ff-6195a98baee9 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.211 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.768 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.769 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.788 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.879 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.880 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.888 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.888 187212 INFO nova.compute.claims [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.962 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.988 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 12:19:08 compute-0 nova_compute[187208]: 2025-12-05 12:19:08.988 187212 DEBUG nova.compute.provider_tree [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.006 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.034 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.075 187212 DEBUG nova.compute.provider_tree [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.090 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.116 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.116 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.168 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.186 187212 INFO nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.205 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.290 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.292 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.293 187212 INFO nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Creating image(s)
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.293 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.294 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.294 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.306 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.376 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.377 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.378 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.391 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.452 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.453 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.491 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.492 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.492 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.553 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.554 187212 DEBUG nova.virt.disk.api [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Checking if we can resize image /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.554 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.615 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.616 187212 DEBUG nova.virt.disk.api [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Cannot resize image /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.616 187212 DEBUG nova.objects.instance [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'migration_context' on Instance uuid 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.631 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.631 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Ensure instance console log exists: /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.632 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.632 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.633 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.635 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.640 187212 WARNING nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.647 187212 DEBUG nova.virt.libvirt.host [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.648 187212 DEBUG nova.virt.libvirt.host [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.653 187212 DEBUG nova.virt.libvirt.host [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.653 187212 DEBUG nova.virt.libvirt.host [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.654 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.654 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.655 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.655 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.655 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.656 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.656 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.657 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.657 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.657 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.657 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.658 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.664 187212 DEBUG nova.objects.instance [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.687 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <uuid>06e1cdc7-fc0d-4de0-baed-0876536b7ee1</uuid>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <name>instance-00000068</name>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerShowV247Test-server-1392417467</nova:name>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:19:09</nova:creationTime>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:19:09 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:19:09 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:19:09 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:19:09 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:19:09 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:19:09 compute-0 nova_compute[187208]:         <nova:user uuid="b0d9487c1e0a49ad9ca1c5ebe37d4ed3">tempest-ServerShowV247Test-1738469039-project-member</nova:user>
Dec 05 12:19:09 compute-0 nova_compute[187208]:         <nova:project uuid="25d7882911914ef5ae762cbd5dc95a3a">tempest-ServerShowV247Test-1738469039</nova:project>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <system>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <entry name="serial">06e1cdc7-fc0d-4de0-baed-0876536b7ee1</entry>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <entry name="uuid">06e1cdc7-fc0d-4de0-baed-0876536b7ee1</entry>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     </system>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <os>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   </os>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <features>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   </features>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.config"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/console.log" append="off"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <video>
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     </video>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:19:09 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:19:09 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:19:09 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:19:09 compute-0 nova_compute[187208]: </domain>
Dec 05 12:19:09 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.729 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.729 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:09 compute-0 nova_compute[187208]: 2025-12-05 12:19:09.730 187212 INFO nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Using config drive
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.193 187212 INFO nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Creating config drive at /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.config
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.198 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp8upwpe3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:10 compute-0 podman[242235]: 2025-12-05 12:19:10.20738144 +0000 UTC m=+0.058750182 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible)
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.332 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp8upwpe3" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:10 compute-0 systemd-machined[153543]: New machine qemu-129-instance-00000068.
Dec 05 12:19:10 compute-0 systemd[1]: Started Virtual Machine qemu-129-instance-00000068.
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.727 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937150.7270656, 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.727 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] VM Resumed (Lifecycle Event)
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.730 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.731 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.734 187212 INFO nova.virt.libvirt.driver [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance spawned successfully.
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.734 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.753 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.758 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.762 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.763 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.763 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.763 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.764 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.764 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.793 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.794 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937150.7280824, 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.794 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] VM Started (Lifecycle Event)
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.822 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.826 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.842 187212 INFO nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Took 1.55 seconds to spawn the instance on the hypervisor.
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.842 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.848 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.856 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.856 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.883 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.916 187212 INFO nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Took 2.07 seconds to build instance.
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.939 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.951 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.951 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.962 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:19:10 compute-0 nova_compute[187208]: 2025-12-05 12:19:10.963 187212 INFO nova.compute.claims [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.077 187212 DEBUG nova.compute.provider_tree [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.093 187212 DEBUG nova.scheduler.client.report [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.114 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.115 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.160 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.174 187212 INFO nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.196 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.272 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.273 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.274 187212 INFO nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating image(s)
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.274 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.275 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.276 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.292 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.361 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.362 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.363 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.375 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.438 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.440 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.475 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.477 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.477 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.535 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.536 187212 DEBUG nova.virt.disk.api [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Checking if we can resize image /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.536 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.594 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.595 187212 DEBUG nova.virt.disk.api [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Cannot resize image /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.596 187212 DEBUG nova.objects.instance [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'migration_context' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.610 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.610 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Ensure instance console log exists: /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.611 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.611 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.612 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.614 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.617 187212 WARNING nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.622 187212 DEBUG nova.virt.libvirt.host [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.622 187212 DEBUG nova.virt.libvirt.host [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.625 187212 DEBUG nova.virt.libvirt.host [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.626 187212 DEBUG nova.virt.libvirt.host [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.626 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.627 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.627 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.628 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.628 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.629 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.629 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.629 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.630 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.630 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.630 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.631 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.636 187212 DEBUG nova.objects.instance [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.653 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <uuid>4cdf5703-a103-4583-9e40-a33e86b5bf04</uuid>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <name>instance-00000069</name>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerShowV247Test-server-538809939</nova:name>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:19:11</nova:creationTime>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:19:11 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:19:11 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:19:11 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:19:11 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:19:11 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:19:11 compute-0 nova_compute[187208]:         <nova:user uuid="b0d9487c1e0a49ad9ca1c5ebe37d4ed3">tempest-ServerShowV247Test-1738469039-project-member</nova:user>
Dec 05 12:19:11 compute-0 nova_compute[187208]:         <nova:project uuid="25d7882911914ef5ae762cbd5dc95a3a">tempest-ServerShowV247Test-1738469039</nova:project>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <system>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <entry name="serial">4cdf5703-a103-4583-9e40-a33e86b5bf04</entry>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <entry name="uuid">4cdf5703-a103-4583-9e40-a33e86b5bf04</entry>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     </system>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <os>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   </os>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <features>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   </features>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/console.log" append="off"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <video>
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     </video>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:19:11 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:19:11 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:19:11 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:19:11 compute-0 nova_compute[187208]: </domain>
Dec 05 12:19:11 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.721 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.721 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:11 compute-0 nova_compute[187208]: 2025-12-05 12:19:11.722 187212 INFO nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Using config drive
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.444 187212 INFO nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating config drive at /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.451 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuedvgwse execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.578 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuedvgwse" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:12 compute-0 systemd-machined[153543]: New machine qemu-130-instance-00000069.
Dec 05 12:19:12 compute-0 systemd[1]: Started Virtual Machine qemu-130-instance-00000069.
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.903 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937152.9027894, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.903 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Resumed (Lifecycle Event)
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.906 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.906 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.910 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance spawned successfully.
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.911 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.929 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.934 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.941 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.942 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.943 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.943 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.944 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.944 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.979 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.979 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937152.9029043, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:12 compute-0 nova_compute[187208]: 2025-12-05 12:19:12.980 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Started (Lifecycle Event)
Dec 05 12:19:13 compute-0 nova_compute[187208]: 2025-12-05 12:19:13.009 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:13 compute-0 nova_compute[187208]: 2025-12-05 12:19:13.012 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:13 compute-0 nova_compute[187208]: 2025-12-05 12:19:13.019 187212 INFO nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Took 1.75 seconds to spawn the instance on the hypervisor.
Dec 05 12:19:13 compute-0 nova_compute[187208]: 2025-12-05 12:19:13.019 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:13 compute-0 nova_compute[187208]: 2025-12-05 12:19:13.030 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:19:13 compute-0 nova_compute[187208]: 2025-12-05 12:19:13.072 187212 INFO nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Took 2.14 seconds to build instance.
Dec 05 12:19:13 compute-0 nova_compute[187208]: 2025-12-05 12:19:13.094 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:13 compute-0 nova_compute[187208]: 2025-12-05 12:19:13.215 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:14 compute-0 podman[242326]: 2025-12-05 12:19:14.203445494 +0000 UTC m=+0.055766137 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 12:19:14 compute-0 podman[242325]: 2025-12-05 12:19:14.210099334 +0000 UTC m=+0.064151676 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm)
Dec 05 12:19:14 compute-0 nova_compute[187208]: 2025-12-05 12:19:14.510 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:15 compute-0 nova_compute[187208]: 2025-12-05 12:19:15.199 187212 INFO nova.compute.manager [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Rebuilding instance
Dec 05 12:19:15 compute-0 nova_compute[187208]: 2025-12-05 12:19:15.530 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:15 compute-0 nova_compute[187208]: 2025-12-05 12:19:15.561 187212 DEBUG nova.compute.manager [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:15 compute-0 nova_compute[187208]: 2025-12-05 12:19:15.636 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'pci_requests' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:15 compute-0 nova_compute[187208]: 2025-12-05 12:19:15.651 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:15 compute-0 nova_compute[187208]: 2025-12-05 12:19:15.666 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'resources' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:15 compute-0 nova_compute[187208]: 2025-12-05 12:19:15.680 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'migration_context' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:15 compute-0 nova_compute[187208]: 2025-12-05 12:19:15.695 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:19:15 compute-0 nova_compute[187208]: 2025-12-05 12:19:15.700 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 12:19:18 compute-0 nova_compute[187208]: 2025-12-05 12:19:18.217 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:19 compute-0 nova_compute[187208]: 2025-12-05 12:19:19.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:23 compute-0 nova_compute[187208]: 2025-12-05 12:19:23.221 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:24 compute-0 podman[242394]: 2025-12-05 12:19:24.198722072 +0000 UTC m=+0.051139054 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 12:19:24 compute-0 podman[242393]: 2025-12-05 12:19:24.201005998 +0000 UTC m=+0.055886890 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 12:19:24 compute-0 podman[242395]: 2025-12-05 12:19:24.234078184 +0000 UTC m=+0.083098448 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:19:24 compute-0 nova_compute[187208]: 2025-12-05 12:19:24.513 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:25 compute-0 sshd-session[242458]: error: kex_exchange_identification: read: Connection reset by peer
Dec 05 12:19:25 compute-0 sshd-session[242458]: Connection reset by 45.140.17.97 port 3124
Dec 05 12:19:25 compute-0 nova_compute[187208]: 2025-12-05 12:19:25.744 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 05 12:19:27 compute-0 systemd[1]: machine-qemu\x2d130\x2dinstance\x2d00000069.scope: Deactivated successfully.
Dec 05 12:19:27 compute-0 systemd[1]: machine-qemu\x2d130\x2dinstance\x2d00000069.scope: Consumed 12.021s CPU time.
Dec 05 12:19:27 compute-0 systemd-machined[153543]: Machine qemu-130-instance-00000069 terminated.
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.223 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.759 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance shutdown successfully after 13 seconds.
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.764 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance destroyed successfully.
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.769 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance destroyed successfully.
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.770 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deleting instance files /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04_del
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.770 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deletion of /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04_del complete
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.956 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.956 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating image(s)
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.957 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.957 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.958 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:28 compute-0 nova_compute[187208]: 2025-12-05 12:19:28.972 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.031 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.032 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.033 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.053 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.112 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.113 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.149 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.150 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.150 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.209 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.210 187212 DEBUG nova.virt.disk.api [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Checking if we can resize image /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.210 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.268 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.270 187212 DEBUG nova.virt.disk.api [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Cannot resize image /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.270 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.270 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Ensure instance console log exists: /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.271 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.271 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.272 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.273 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.277 187212 WARNING nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.298 187212 DEBUG nova.virt.libvirt.host [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.299 187212 DEBUG nova.virt.libvirt.host [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.304 187212 DEBUG nova.virt.libvirt.host [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.305 187212 DEBUG nova.virt.libvirt.host [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.305 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.305 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.306 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.306 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.306 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.307 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.307 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.307 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.308 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.308 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.308 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.308 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.309 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.330 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <uuid>4cdf5703-a103-4583-9e40-a33e86b5bf04</uuid>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <name>instance-00000069</name>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <nova:name>tempest-ServerShowV247Test-server-538809939</nova:name>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:19:29</nova:creationTime>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:19:29 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:19:29 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:19:29 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:19:29 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:19:29 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:19:29 compute-0 nova_compute[187208]:         <nova:user uuid="b0d9487c1e0a49ad9ca1c5ebe37d4ed3">tempest-ServerShowV247Test-1738469039-project-member</nova:user>
Dec 05 12:19:29 compute-0 nova_compute[187208]:         <nova:project uuid="25d7882911914ef5ae762cbd5dc95a3a">tempest-ServerShowV247Test-1738469039</nova:project>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <nova:ports/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <system>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <entry name="serial">4cdf5703-a103-4583-9e40-a33e86b5bf04</entry>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <entry name="uuid">4cdf5703-a103-4583-9e40-a33e86b5bf04</entry>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     </system>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <os>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   </os>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <features>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   </features>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/console.log" append="off"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <video>
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     </video>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:19:29 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:19:29 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:19:29 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:19:29 compute-0 nova_compute[187208]: </domain>
Dec 05 12:19:29 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.389 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.389 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.390 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Using config drive
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.408 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.448 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'keypairs' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.515 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.829 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating config drive at /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.834 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpley9fnfz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:29 compute-0 nova_compute[187208]: 2025-12-05 12:19:29.959 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpley9fnfz" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:30 compute-0 systemd-machined[153543]: New machine qemu-131-instance-00000069.
Dec 05 12:19:30 compute-0 systemd[1]: Started Virtual Machine qemu-131-instance-00000069.
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.444 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 4cdf5703-a103-4583-9e40-a33e86b5bf04 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.445 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937170.4435465, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.445 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Resumed (Lifecycle Event)
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.449 187212 DEBUG nova.compute.manager [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.449 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.453 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance spawned successfully.
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.454 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.479 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.486 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.491 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.491 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.492 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.492 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.493 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.494 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.519 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.520 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937170.4449296, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.520 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Started (Lifecycle Event)
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.540 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.541 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.548 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.552 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.558 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.585 187212 DEBUG nova.compute.manager [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.587 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.675 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.675 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.676 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.693 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.744 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.745 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.752 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.752 187212 INFO nova.compute.claims [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.913 187212 DEBUG nova.compute.provider_tree [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.928 187212 DEBUG nova.scheduler.client.report [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.957 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:30 compute-0 nova_compute[187208]: 2025-12-05 12:19:30.958 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.011 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.012 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.031 187212 INFO nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.049 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.157 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.158 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.159 187212 INFO nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Creating image(s)
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.161 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.161 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.162 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.175 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.245 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.246 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.246 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.260 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.317 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.318 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.390 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk 1073741824" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.392 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.393 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.461 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.462 187212 DEBUG nova.virt.disk.api [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Checking if we can resize image /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.463 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.491 187212 DEBUG nova.policy [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '223f7822261946cc9228b2207bd1096c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.527 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.528 187212 DEBUG nova.virt.disk.api [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Cannot resize image /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.529 187212 DEBUG nova.objects.instance [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'migration_context' on Instance uuid 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.543 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.544 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Ensure instance console log exists: /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.544 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.544 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:31 compute-0 nova_compute[187208]: 2025-12-05 12:19:31.545 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.109 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.110 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.110 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "4cdf5703-a103-4583-9e40-a33e86b5bf04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.110 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.111 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.112 187212 INFO nova.compute.manager [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Terminating instance
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.113 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "refresh_cache-4cdf5703-a103-4583-9e40-a33e86b5bf04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.113 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquired lock "refresh_cache-4cdf5703-a103-4583-9e40-a33e86b5bf04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.113 187212 DEBUG nova.network.neutron [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.418 187212 DEBUG nova.network.neutron [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.511 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Successfully created port: f7a08175-a5c6-45b7-b194-819c5b881995 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.741 187212 DEBUG nova.network.neutron [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.756 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Releasing lock "refresh_cache-4cdf5703-a103-4583-9e40-a33e86b5bf04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:19:33 compute-0 nova_compute[187208]: 2025-12-05 12:19:33.756 187212 DEBUG nova.compute.manager [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:19:33 compute-0 systemd[1]: machine-qemu\x2d131\x2dinstance\x2d00000069.scope: Deactivated successfully.
Dec 05 12:19:33 compute-0 systemd[1]: machine-qemu\x2d131\x2dinstance\x2d00000069.scope: Consumed 3.761s CPU time.
Dec 05 12:19:33 compute-0 systemd-machined[153543]: Machine qemu-131-instance-00000069 terminated.
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.004 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance destroyed successfully.
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.005 187212 DEBUG nova.objects.instance [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'resources' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.023 187212 INFO nova.virt.libvirt.driver [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deleting instance files /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04_del
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.023 187212 INFO nova.virt.libvirt.driver [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deletion of /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04_del complete
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.073 187212 INFO nova.compute.manager [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Took 0.32 seconds to destroy the instance on the hypervisor.
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.073 187212 DEBUG oslo.service.loopingcall [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.074 187212 DEBUG nova.compute.manager [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.074 187212 DEBUG nova.network.neutron [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.507 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Successfully updated port: f7a08175-a5c6-45b7-b194-819c5b881995 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.517 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.529 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.529 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquired lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.530 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.644 187212 DEBUG nova.network.neutron [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.662 187212 DEBUG nova.network.neutron [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.697 187212 INFO nova.compute.manager [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Took 0.62 seconds to deallocate network for instance.
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.753 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.754 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.773 187212 DEBUG nova.compute.manager [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-changed-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.773 187212 DEBUG nova.compute.manager [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Refreshing instance network info cache due to event network-changed-f7a08175-a5c6-45b7-b194-819c5b881995. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.774 187212 DEBUG oslo_concurrency.lockutils [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.845 187212 DEBUG nova.compute.provider_tree [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.859 187212 DEBUG nova.scheduler.client.report [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.880 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.916 187212 INFO nova.scheduler.client.report [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Deleted allocations for instance 4cdf5703-a103-4583-9e40-a33e86b5bf04
Dec 05 12:19:34 compute-0 nova_compute[187208]: 2025-12-05 12:19:34.993 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.883s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:35 compute-0 nova_compute[187208]: 2025-12-05 12:19:35.058 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:19:35 compute-0 podman[242534]: 2025-12-05 12:19:35.199206319 +0000 UTC m=+0.053827521 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.134 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Updating instance_info_cache with network_info: [{"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.166 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Releasing lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.166 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance network_info: |[{"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.167 187212 DEBUG oslo_concurrency.lockutils [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.167 187212 DEBUG nova.network.neutron [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Refreshing network info cache for port f7a08175-a5c6-45b7-b194-819c5b881995 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.172 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start _get_guest_xml network_info=[{"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.181 187212 WARNING nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.186 187212 DEBUG nova.virt.libvirt.host [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.187 187212 DEBUG nova.virt.libvirt.host [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.190 187212 DEBUG nova.virt.libvirt.host [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.190 187212 DEBUG nova.virt.libvirt.host [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.191 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.191 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.192 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.192 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.192 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.194 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.194 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.200 187212 DEBUG nova.virt.libvirt.vif [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:19:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-793938802',display_name='tempest-VolumesActionsTest-instance-793938802',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-793938802',id=106,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-6clntd3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:19:31Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.200 187212 DEBUG nova.network.os_vif_util [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.201 187212 DEBUG nova.network.os_vif_util [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.204 187212 DEBUG nova.objects.instance [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.220 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <uuid>3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e</uuid>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <name>instance-0000006a</name>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <nova:name>tempest-VolumesActionsTest-instance-793938802</nova:name>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:19:36</nova:creationTime>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:19:36 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:19:36 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:19:36 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:19:36 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:19:36 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:19:36 compute-0 nova_compute[187208]:         <nova:user uuid="223f7822261946cc9228b2207bd1096c">tempest-VolumesActionsTest-1057905007-project-member</nova:user>
Dec 05 12:19:36 compute-0 nova_compute[187208]:         <nova:project uuid="3463fde58c6c4bea98c82b2cb087a0dd">tempest-VolumesActionsTest-1057905007</nova:project>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:19:36 compute-0 nova_compute[187208]:         <nova:port uuid="f7a08175-a5c6-45b7-b194-819c5b881995">
Dec 05 12:19:36 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <system>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <entry name="serial">3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e</entry>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <entry name="uuid">3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e</entry>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     </system>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <os>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   </os>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <features>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   </features>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.config"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:b6:75:08"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <target dev="tapf7a08175-a5"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/console.log" append="off"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <video>
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     </video>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:19:36 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:19:36 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:19:36 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:19:36 compute-0 nova_compute[187208]: </domain>
Dec 05 12:19:36 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.221 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Preparing to wait for external event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.222 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.222 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.222 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.223 187212 DEBUG nova.virt.libvirt.vif [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:19:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-793938802',display_name='tempest-VolumesActionsTest-instance-793938802',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-793938802',id=106,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-6clntd3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:19:31Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.223 187212 DEBUG nova.network.os_vif_util [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.224 187212 DEBUG nova.network.os_vif_util [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.224 187212 DEBUG os_vif [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.225 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.225 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.226 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.230 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.231 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a08175-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.231 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7a08175-a5, col_values=(('external_ids', {'iface-id': 'f7a08175-a5c6-45b7-b194-819c5b881995', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:75:08', 'vm-uuid': '3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.405 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:36 compute-0 NetworkManager[55691]: <info>  [1764937176.4063] manager: (tapf7a08175-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/426)
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.407 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.411 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.415 187212 INFO os_vif [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5')
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.476 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.476 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.477 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No VIF found with MAC fa:16:3e:b6:75:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.477 187212 INFO nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Using config drive
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.600 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.601 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.602 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.602 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.603 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.605 187212 INFO nova.compute.manager [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Terminating instance
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.606 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "refresh_cache-06e1cdc7-fc0d-4de0-baed-0876536b7ee1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.607 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquired lock "refresh_cache-06e1cdc7-fc0d-4de0-baed-0876536b7ee1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.607 187212 DEBUG nova.network.neutron [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:19:36 compute-0 nova_compute[187208]: 2025-12-05 12:19:36.976 187212 DEBUG nova.network.neutron [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.196 187212 INFO nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Creating config drive at /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.config
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.201 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_3cqb5n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.331 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_3cqb5n" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.372 187212 DEBUG nova.network.neutron [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.387 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Releasing lock "refresh_cache-06e1cdc7-fc0d-4de0-baed-0876536b7ee1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.388 187212 DEBUG nova.compute.manager [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:19:37 compute-0 kernel: tapf7a08175-a5: entered promiscuous mode
Dec 05 12:19:37 compute-0 NetworkManager[55691]: <info>  [1764937177.4165] manager: (tapf7a08175-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/427)
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.416 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:37 compute-0 ovn_controller[95610]: 2025-12-05T12:19:37Z|01118|binding|INFO|Claiming lport f7a08175-a5c6-45b7-b194-819c5b881995 for this chassis.
Dec 05 12:19:37 compute-0 ovn_controller[95610]: 2025-12-05T12:19:37Z|01119|binding|INFO|f7a08175-a5c6-45b7-b194-819c5b881995: Claiming fa:16:3e:b6:75:08 10.100.0.11
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.458 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:75:08 10.100.0.11'], port_security=['fa:16:3e:b6:75:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52916d9d-eb76-4677-8333-d02c9507adbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '21c17f9f-419d-4f4a-937b-418d08c504db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ca61dc-1564-41a5-9908-f7ecfadeff73, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f7a08175-a5c6-45b7-b194-819c5b881995) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.459 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.460 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f7a08175-a5c6-45b7-b194-819c5b881995 in datapath 52916d9d-eb76-4677-8333-d02c9507adbc bound to our chassis
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.461 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 52916d9d-eb76-4677-8333-d02c9507adbc
Dec 05 12:19:37 compute-0 systemd-udevd[242580]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.477 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5734d1f4-5460-483d-b2ff-49f2939bc990]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.478 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap52916d9d-e1 in ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.480 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap52916d9d-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.480 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cdacad6b-90bd-4174-a26f-5574ba4ea406]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.481 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2318c9ab-5912-445a-9a51-4c5c39c4f62d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 systemd[1]: machine-qemu\x2d129\x2dinstance\x2d00000068.scope: Deactivated successfully.
Dec 05 12:19:37 compute-0 systemd[1]: machine-qemu\x2d129\x2dinstance\x2d00000068.scope: Consumed 12.818s CPU time.
Dec 05 12:19:37 compute-0 NetworkManager[55691]: <info>  [1764937177.4946] device (tapf7a08175-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.493 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b42d40c7-0752-4863-995c-88788d1bfd97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 NetworkManager[55691]: <info>  [1764937177.4957] device (tapf7a08175-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:19:37 compute-0 systemd-machined[153543]: Machine qemu-129-instance-00000068 terminated.
Dec 05 12:19:37 compute-0 systemd-machined[153543]: New machine qemu-132-instance-0000006a.
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.509 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d97fee-9df8-4073-871c-978863fac552]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.511 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:37 compute-0 systemd[1]: Started Virtual Machine qemu-132-instance-0000006a.
Dec 05 12:19:37 compute-0 ovn_controller[95610]: 2025-12-05T12:19:37Z|01120|binding|INFO|Setting lport f7a08175-a5c6-45b7-b194-819c5b881995 ovn-installed in OVS
Dec 05 12:19:37 compute-0 ovn_controller[95610]: 2025-12-05T12:19:37Z|01121|binding|INFO|Setting lport f7a08175-a5c6-45b7-b194-819c5b881995 up in Southbound
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.516 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.552 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[97a1af6f-fd33-4180-8e8c-87757eeb5f33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.557 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2783415-aee0-4e6f-8949-945dee034afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 NetworkManager[55691]: <info>  [1764937177.5585] manager: (tap52916d9d-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/428)
Dec 05 12:19:37 compute-0 systemd-udevd[242579]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.594 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[873917d7-39c9-43a3-a02e-d8e398f0174f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.598 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[35771e56-e896-4f63-b6da-34730cc44b87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 NetworkManager[55691]: <info>  [1764937177.6274] device (tap52916d9d-e0): carrier: link connected
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.630 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[71ece242-78b8-42a3-8716-cc5a835e9572]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.635 187212 INFO nova.virt.libvirt.driver [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance destroyed successfully.
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.636 187212 DEBUG nova.objects.instance [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'resources' on Instance uuid 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.650 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7b822629-72bc-4dac-bd41-220aa929ca64]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52916d9d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:c4:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 309], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455818, 'reachable_time': 38112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242620, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.654 187212 INFO nova.virt.libvirt.driver [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Deleting instance files /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1_del
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.655 187212 INFO nova.virt.libvirt.driver [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Deletion of /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1_del complete
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.666 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf60c6c8-af16-4cc1-a3d8-21f21aebedc8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:c462'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455818, 'tstamp': 455818}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242621, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.684 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8709c161-9786-4d1a-beae-1f99e142ac4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52916d9d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:c4:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 309], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455818, 'reachable_time': 38112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242622, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.715 187212 INFO nova.compute.manager [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Took 0.33 seconds to destroy the instance on the hypervisor.
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.715 187212 DEBUG oslo.service.loopingcall [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.715 187212 DEBUG nova.compute.manager [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.716 187212 DEBUG nova.network.neutron [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.716 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3df1e73c-cb62-45ee-a48e-970a6ca360c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.786 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab95947-9868-4d71-a171-97a892832230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.788 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52916d9d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.789 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.789 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52916d9d-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.791 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:37 compute-0 kernel: tap52916d9d-e0: entered promiscuous mode
Dec 05 12:19:37 compute-0 NetworkManager[55691]: <info>  [1764937177.7920] manager: (tap52916d9d-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/429)
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.794 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap52916d9d-e0, col_values=(('external_ids', {'iface-id': 'bfd2a34a-bdd5-4486-82a8-fc55b6e1020a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:37 compute-0 ovn_controller[95610]: 2025-12-05T12:19:37Z|01122|binding|INFO|Releasing lport bfd2a34a-bdd5-4486-82a8-fc55b6e1020a from this chassis (sb_readonly=0)
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.796 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.797 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.806 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[51b7d638-3eca-403c-bbe1-87f0b0d05fd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:37 compute-0 nova_compute[187208]: 2025-12-05 12:19:37.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.808 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-52916d9d-eb76-4677-8333-d02c9507adbc
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 52916d9d-eb76-4677-8333-d02c9507adbc
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:19:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.810 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'env', 'PROCESS_TAG=haproxy-52916d9d-eb76-4677-8333-d02c9507adbc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/52916d9d-eb76-4677-8333-d02c9507adbc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.132 187212 DEBUG nova.compute.manager [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.133 187212 DEBUG oslo_concurrency.lockutils [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.134 187212 DEBUG oslo_concurrency.lockutils [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.134 187212 DEBUG oslo_concurrency.lockutils [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.134 187212 DEBUG nova.compute.manager [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Processing event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.145 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937178.1450875, 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.146 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] VM Started (Lifecycle Event)
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.148 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.153 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.157 187212 INFO nova.virt.libvirt.driver [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance spawned successfully.
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.157 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.177 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.183 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.188 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.188 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.189 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.189 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.190 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.190 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.218 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.220 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937178.1478107, 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.220 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] VM Paused (Lifecycle Event)
Dec 05 12:19:38 compute-0 podman[242661]: 2025-12-05 12:19:38.231673213 +0000 UTC m=+0.053587434 container create a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.244 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.251 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937178.1524527, 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.252 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] VM Resumed (Lifecycle Event)
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.254 187212 INFO nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Took 7.10 seconds to spawn the instance on the hypervisor.
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.255 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.267 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:38 compute-0 systemd[1]: Started libpod-conmon-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0.scope.
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.272 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:38 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:19:38 compute-0 podman[242661]: 2025-12-05 12:19:38.202838588 +0000 UTC m=+0.024752829 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:19:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe9e174fa0d1960b337c7b5c186b03bdd5ebfcd106c3c3e9d8495dbd20d710fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.303 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:19:38 compute-0 podman[242661]: 2025-12-05 12:19:38.319411793 +0000 UTC m=+0.141326024 container init a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 12:19:38 compute-0 podman[242661]: 2025-12-05 12:19:38.325147897 +0000 UTC m=+0.147062108 container start a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.330 187212 INFO nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Took 7.68 seconds to build instance.
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.347 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:38 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [NOTICE]   (242682) : New worker (242684) forked
Dec 05 12:19:38 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [NOTICE]   (242682) : Loading success.
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.482 187212 DEBUG nova.network.neutron [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.539 187212 DEBUG nova.network.neutron [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.553 187212 INFO nova.compute.manager [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Took 0.84 seconds to deallocate network for instance.
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.607 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.610 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.697 187212 DEBUG nova.compute.provider_tree [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.712 187212 DEBUG nova.scheduler.client.report [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.737 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.779 187212 INFO nova.scheduler.client.report [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Deleted allocations for instance 06e1cdc7-fc0d-4de0-baed-0876536b7ee1
Dec 05 12:19:38 compute-0 nova_compute[187208]: 2025-12-05 12:19:38.842 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:39 compute-0 nova_compute[187208]: 2025-12-05 12:19:39.220 187212 DEBUG nova.network.neutron [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Updated VIF entry in instance network info cache for port f7a08175-a5c6-45b7-b194-819c5b881995. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:19:39 compute-0 nova_compute[187208]: 2025-12-05 12:19:39.223 187212 DEBUG nova.network.neutron [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Updating instance_info_cache with network_info: [{"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:19:39 compute-0 nova_compute[187208]: 2025-12-05 12:19:39.242 187212 DEBUG oslo_concurrency.lockutils [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:19:39 compute-0 nova_compute[187208]: 2025-12-05 12:19:39.518 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:40 compute-0 nova_compute[187208]: 2025-12-05 12:19:40.320 187212 DEBUG nova.compute.manager [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:40 compute-0 nova_compute[187208]: 2025-12-05 12:19:40.321 187212 DEBUG oslo_concurrency.lockutils [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:40 compute-0 nova_compute[187208]: 2025-12-05 12:19:40.322 187212 DEBUG oslo_concurrency.lockutils [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:40 compute-0 nova_compute[187208]: 2025-12-05 12:19:40.322 187212 DEBUG oslo_concurrency.lockutils [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:40 compute-0 nova_compute[187208]: 2025-12-05 12:19:40.322 187212 DEBUG nova.compute.manager [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] No waiting events found dispatching network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:19:40 compute-0 nova_compute[187208]: 2025-12-05 12:19:40.323 187212 WARNING nova.compute.manager [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received unexpected event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 for instance with vm_state active and task_state None.
Dec 05 12:19:41 compute-0 podman[242693]: 2025-12-05 12:19:41.217889613 +0000 UTC m=+0.067556473 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.517 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.520 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.521 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.522 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.523 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.525 187212 INFO nova.compute.manager [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Terminating instance
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.527 187212 DEBUG nova.compute.manager [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:19:41 compute-0 kernel: tapf7a08175-a5 (unregistering): left promiscuous mode
Dec 05 12:19:41 compute-0 NetworkManager[55691]: <info>  [1764937181.5453] device (tapf7a08175-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.552 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 ovn_controller[95610]: 2025-12-05T12:19:41Z|01123|binding|INFO|Releasing lport f7a08175-a5c6-45b7-b194-819c5b881995 from this chassis (sb_readonly=0)
Dec 05 12:19:41 compute-0 ovn_controller[95610]: 2025-12-05T12:19:41Z|01124|binding|INFO|Setting lport f7a08175-a5c6-45b7-b194-819c5b881995 down in Southbound
Dec 05 12:19:41 compute-0 ovn_controller[95610]: 2025-12-05T12:19:41Z|01125|binding|INFO|Removing iface tapf7a08175-a5 ovn-installed in OVS
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.555 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.563 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:75:08 10.100.0.11'], port_security=['fa:16:3e:b6:75:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52916d9d-eb76-4677-8333-d02c9507adbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21c17f9f-419d-4f4a-937b-418d08c504db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ca61dc-1564-41a5-9908-f7ecfadeff73, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f7a08175-a5c6-45b7-b194-819c5b881995) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.564 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f7a08175-a5c6-45b7-b194-819c5b881995 in datapath 52916d9d-eb76-4677-8333-d02c9507adbc unbound from our chassis
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.565 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 52916d9d-eb76-4677-8333-d02c9507adbc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.567 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.567 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[acd7a734-343e-413c-a923-ea16f1f23b1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.567 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc namespace which is not needed anymore
Dec 05 12:19:41 compute-0 systemd[1]: machine-qemu\x2d132\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Dec 05 12:19:41 compute-0 systemd[1]: machine-qemu\x2d132\x2dinstance\x2d0000006a.scope: Consumed 4.034s CPU time.
Dec 05 12:19:41 compute-0 systemd-machined[153543]: Machine qemu-132-instance-0000006a terminated.
Dec 05 12:19:41 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [NOTICE]   (242682) : haproxy version is 2.8.14-c23fe91
Dec 05 12:19:41 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [NOTICE]   (242682) : path to executable is /usr/sbin/haproxy
Dec 05 12:19:41 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [WARNING]  (242682) : Exiting Master process...
Dec 05 12:19:41 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [WARNING]  (242682) : Exiting Master process...
Dec 05 12:19:41 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [ALERT]    (242682) : Current worker (242684) exited with code 143 (Terminated)
Dec 05 12:19:41 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [WARNING]  (242682) : All workers exited. Exiting... (0)
Dec 05 12:19:41 compute-0 systemd[1]: libpod-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0.scope: Deactivated successfully.
Dec 05 12:19:41 compute-0 conmon[242674]: conmon a2fc1f32fe789290236e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0.scope/container/memory.events
Dec 05 12:19:41 compute-0 podman[242739]: 2025-12-05 12:19:41.699885253 +0000 UTC m=+0.045216745 container died a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 05 12:19:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0-userdata-shm.mount: Deactivated successfully.
Dec 05 12:19:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-fe9e174fa0d1960b337c7b5c186b03bdd5ebfcd106c3c3e9d8495dbd20d710fc-merged.mount: Deactivated successfully.
Dec 05 12:19:41 compute-0 podman[242739]: 2025-12-05 12:19:41.744513829 +0000 UTC m=+0.089845321 container cleanup a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:19:41 compute-0 systemd[1]: libpod-conmon-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0.scope: Deactivated successfully.
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.826 187212 INFO nova.virt.libvirt.driver [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance destroyed successfully.
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.828 187212 DEBUG nova.objects.instance [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'resources' on Instance uuid 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.841 187212 DEBUG nova.virt.libvirt.vif [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:19:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-793938802',display_name='tempest-VolumesActionsTest-instance-793938802',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-793938802',id=106,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:19:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-6clntd3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:19:38Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.842 187212 DEBUG nova.network.os_vif_util [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.843 187212 DEBUG nova.network.os_vif_util [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.843 187212 DEBUG os_vif [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.845 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.845 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a08175-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.846 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.848 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 podman[242774]: 2025-12-05 12:19:41.849872734 +0000 UTC m=+0.053231784 container remove a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.851 187212 INFO os_vif [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5')
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.852 187212 INFO nova.virt.libvirt.driver [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Deleting instance files /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e_del
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.853 187212 INFO nova.virt.libvirt.driver [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Deletion of /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e_del complete
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.854 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3f56af92-06df-4899-b635-9d6263b8b83f]: (4, ('Fri Dec  5 12:19:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc (a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0)\na2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0\nFri Dec  5 12:19:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc (a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0)\na2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.855 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[482b2f5f-7d5a-4978-9c66-20d7c08c921f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.856 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52916d9d-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.858 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 kernel: tap52916d9d-e0: left promiscuous mode
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.868 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.869 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.870 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aa4abc68-ac86-40ce-ba41-7c0319db6268]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.887 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0d5d3070-4199-4ad3-93ed-f48d6091a2eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.889 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec273c0-22f5-4dd6-b8bc-85c61a516435]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.907 187212 INFO nova.compute.manager [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.908 187212 DEBUG oslo.service.loopingcall [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.909 187212 DEBUG nova.compute.manager [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:19:41 compute-0 nova_compute[187208]: 2025-12-05 12:19:41.909 187212 DEBUG nova.network.neutron [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.908 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fccee749-f5ef-489f-8363-f553aa0ba029]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455810, 'reachable_time': 42002, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242798, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.912 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:19:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.913 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c57f4809-13ae-453c-994f-f9d4a6876437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:41 compute-0 systemd[1]: run-netns-ovnmeta\x2d52916d9d\x2deb76\x2d4677\x2d8333\x2dd02c9507adbc.mount: Deactivated successfully.
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.572 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-unplugged-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.572 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.573 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.573 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.574 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] No waiting events found dispatching network-vif-unplugged-f7a08175-a5c6-45b7-b194-819c5b881995 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.574 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-unplugged-f7a08175-a5c6-45b7-b194-819c5b881995 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.575 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.575 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.576 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.576 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.576 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] No waiting events found dispatching network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:19:42 compute-0 nova_compute[187208]: 2025-12-05 12:19:42.577 187212 WARNING nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received unexpected event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 for instance with vm_state active and task_state deleting.
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.384 187212 DEBUG nova.network.neutron [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.405 187212 INFO nova.compute.manager [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Took 1.50 seconds to deallocate network for instance.
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.468 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.469 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.551 187212 DEBUG nova.compute.provider_tree [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.568 187212 DEBUG nova.scheduler.client.report [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.598 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.610 187212 DEBUG nova.compute.manager [req-459ac535-60cf-4884-b316-39f17361ac48 req-554e2935-329c-4a68-a133-4a6c5cd558cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-deleted-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.632 187212 INFO nova.scheduler.client.report [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Deleted allocations for instance 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e
Dec 05 12:19:43 compute-0 nova_compute[187208]: 2025-12-05 12:19:43.688 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:44 compute-0 nova_compute[187208]: 2025-12-05 12:19:44.520 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:45 compute-0 podman[242800]: 2025-12-05 12:19:45.22149872 +0000 UTC m=+0.077462917 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:19:45 compute-0 podman[242799]: 2025-12-05 12:19:45.240195745 +0000 UTC m=+0.096291266 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350)
Dec 05 12:19:46 compute-0 nova_compute[187208]: 2025-12-05 12:19:46.848 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:47 compute-0 nova_compute[187208]: 2025-12-05 12:19:47.941 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:47 compute-0 nova_compute[187208]: 2025-12-05 12:19:47.941 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:47 compute-0 nova_compute[187208]: 2025-12-05 12:19:47.965 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.043 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.044 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.050 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.051 187212 INFO nova.compute.claims [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Claim successful on node compute-0.ctlplane.example.com
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.477 187212 DEBUG nova.compute.provider_tree [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.494 187212 DEBUG nova.scheduler.client.report [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.524 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.524 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.585 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.585 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.671 187212 INFO nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.773 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.868 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.870 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.870 187212 INFO nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Creating image(s)
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.871 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.871 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.872 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.888 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.954 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.955 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.956 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:48 compute-0 nova_compute[187208]: 2025-12-05 12:19:48.967 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.003 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937174.0022178, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.004 187212 INFO nova.compute.manager [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Stopped (Lifecycle Event)
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.026 187212 DEBUG nova.compute.manager [None req-ddd973ba-e75d-48b8-87d4-ea05960e6a31 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.028 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.028 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.063 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.065 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.065 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.132 187212 DEBUG nova.policy [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '223f7822261946cc9228b2207bd1096c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.136 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.136 187212 DEBUG nova.virt.disk.api [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Checking if we can resize image /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.137 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.227 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.229 187212 DEBUG nova.virt.disk.api [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Cannot resize image /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.230 187212 DEBUG nova.objects.instance [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'migration_context' on Instance uuid 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.251 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.252 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Ensure instance console log exists: /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.252 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.253 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.253 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:49 compute-0 nova_compute[187208]: 2025-12-05 12:19:49.547 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:50 compute-0 nova_compute[187208]: 2025-12-05 12:19:50.077 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:19:50 compute-0 nova_compute[187208]: 2025-12-05 12:19:50.078 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:19:50 compute-0 nova_compute[187208]: 2025-12-05 12:19:50.180 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Successfully created port: ddf5ec0d-377b-480c-8991-738446cfb2db _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.081 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.081 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.793 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Successfully updated port: ddf5ec0d-377b-480c-8991-738446cfb2db _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.811 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.812 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquired lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.812 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 12:19:51 compute-0 nova_compute[187208]: 2025-12-05 12:19:51.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:52 compute-0 nova_compute[187208]: 2025-12-05 12:19:52.028 187212 DEBUG nova.compute.manager [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-changed-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:52 compute-0 nova_compute[187208]: 2025-12-05 12:19:52.029 187212 DEBUG nova.compute.manager [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Refreshing instance network info cache due to event network-changed-ddf5ec0d-377b-480c-8991-738446cfb2db. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 12:19:52 compute-0 nova_compute[187208]: 2025-12-05 12:19:52.030 187212 DEBUG oslo_concurrency.lockutils [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 12:19:52 compute-0 nova_compute[187208]: 2025-12-05 12:19:52.160 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 12:19:52 compute-0 nova_compute[187208]: 2025-12-05 12:19:52.634 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937177.6323996, 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:52 compute-0 nova_compute[187208]: 2025-12-05 12:19:52.635 187212 INFO nova.compute.manager [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] VM Stopped (Lifecycle Event)
Dec 05 12:19:52 compute-0 nova_compute[187208]: 2025-12-05 12:19:52.817 187212 DEBUG nova.compute.manager [None req-957d7a77-fdff-4898-86e0-daef7da8c69c - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.197 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Updating instance_info_cache with network_info: [{"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.218 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Releasing lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.218 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance network_info: |[{"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.219 187212 DEBUG oslo_concurrency.lockutils [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.219 187212 DEBUG nova.network.neutron [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Refreshing network info cache for port ddf5ec0d-377b-480c-8991-738446cfb2db _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.222 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start _get_guest_xml network_info=[{"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.226 187212 WARNING nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.231 187212 DEBUG nova.virt.libvirt.host [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.231 187212 DEBUG nova.virt.libvirt.host [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.237 187212 DEBUG nova.virt.libvirt.host [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.238 187212 DEBUG nova.virt.libvirt.host [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.238 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.239 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.239 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.240 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.240 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.240 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.240 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.241 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.241 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.241 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.241 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.242 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.245 187212 DEBUG nova.virt.libvirt.vif [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-375351270',display_name='tempest-VolumesActionsTest-instance-375351270',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-375351270',id=107,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-g6z4cse6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-
project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:19:48Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=7eb08f99-b40c-4ba3-9b30-6cfb447ba68d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.246 187212 DEBUG nova.network.os_vif_util [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.247 187212 DEBUG nova.network.os_vif_util [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.248 187212 DEBUG nova.objects.instance [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'pci_devices' on Instance uuid 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.261 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] End _get_guest_xml xml=<domain type="kvm">
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <uuid>7eb08f99-b40c-4ba3-9b30-6cfb447ba68d</uuid>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <name>instance-0000006b</name>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <memory>131072</memory>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <vcpu>1</vcpu>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <metadata>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <nova:name>tempest-VolumesActionsTest-instance-375351270</nova:name>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <nova:creationTime>2025-12-05 12:19:53</nova:creationTime>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <nova:flavor name="m1.nano">
Dec 05 12:19:53 compute-0 nova_compute[187208]:         <nova:memory>128</nova:memory>
Dec 05 12:19:53 compute-0 nova_compute[187208]:         <nova:disk>1</nova:disk>
Dec 05 12:19:53 compute-0 nova_compute[187208]:         <nova:swap>0</nova:swap>
Dec 05 12:19:53 compute-0 nova_compute[187208]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 12:19:53 compute-0 nova_compute[187208]:         <nova:vcpus>1</nova:vcpus>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       </nova:flavor>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <nova:owner>
Dec 05 12:19:53 compute-0 nova_compute[187208]:         <nova:user uuid="223f7822261946cc9228b2207bd1096c">tempest-VolumesActionsTest-1057905007-project-member</nova:user>
Dec 05 12:19:53 compute-0 nova_compute[187208]:         <nova:project uuid="3463fde58c6c4bea98c82b2cb087a0dd">tempest-VolumesActionsTest-1057905007</nova:project>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       </nova:owner>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <nova:ports>
Dec 05 12:19:53 compute-0 nova_compute[187208]:         <nova:port uuid="ddf5ec0d-377b-480c-8991-738446cfb2db">
Dec 05 12:19:53 compute-0 nova_compute[187208]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:         </nova:port>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       </nova:ports>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     </nova:instance>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   </metadata>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <sysinfo type="smbios">
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <system>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <entry name="manufacturer">RDO</entry>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <entry name="product">OpenStack Compute</entry>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <entry name="serial">7eb08f99-b40c-4ba3-9b30-6cfb447ba68d</entry>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <entry name="uuid">7eb08f99-b40c-4ba3-9b30-6cfb447ba68d</entry>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <entry name="family">Virtual Machine</entry>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     </system>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   </sysinfo>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <os>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <boot dev="hd"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <smbios mode="sysinfo"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   </os>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <features>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <acpi/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <apic/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <vmcoreinfo/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   </features>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <clock offset="utc">
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <timer name="hpet" present="no"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   </clock>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <cpu mode="host-model" match="exact">
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   </cpu>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   <devices>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <disk type="file" device="disk">
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <target dev="vda" bus="virtio"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <disk type="file" device="cdrom">
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <source file="/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.config"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <target dev="sda" bus="sata"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     </disk>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <interface type="ethernet">
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <mac address="fa:16:3e:51:2f:35"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <mtu size="1442"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <target dev="tapddf5ec0d-37"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     </interface>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <serial type="pty">
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <log file="/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/console.log" append="off"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     </serial>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <video>
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <model type="virtio"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     </video>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <input type="tablet" bus="usb"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <rng model="virtio">
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <backend model="random">/dev/urandom</backend>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     </rng>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <controller type="usb" index="0"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     <memballoon model="virtio">
Dec 05 12:19:53 compute-0 nova_compute[187208]:       <stats period="10"/>
Dec 05 12:19:53 compute-0 nova_compute[187208]:     </memballoon>
Dec 05 12:19:53 compute-0 nova_compute[187208]:   </devices>
Dec 05 12:19:53 compute-0 nova_compute[187208]: </domain>
Dec 05 12:19:53 compute-0 nova_compute[187208]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.262 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Preparing to wait for external event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.263 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.263 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.263 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.264 187212 DEBUG nova.virt.libvirt.vif [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-375351270',display_name='tempest-VolumesActionsTest-instance-375351270',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-375351270',id=107,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-g6z4cse6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:19:48Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=7eb08f99-b40c-4ba3-9b30-6cfb447ba68d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.264 187212 DEBUG nova.network.os_vif_util [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.264 187212 DEBUG nova.network.os_vif_util [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.265 187212 DEBUG os_vif [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.265 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.266 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.266 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.268 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.268 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapddf5ec0d-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.269 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapddf5ec0d-37, col_values=(('external_ids', {'iface-id': 'ddf5ec0d-377b-480c-8991-738446cfb2db', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:51:2f:35', 'vm-uuid': '7eb08f99-b40c-4ba3-9b30-6cfb447ba68d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:53 compute-0 NetworkManager[55691]: <info>  [1764937193.2716] manager: (tapddf5ec0d-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/430)
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.273 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.276 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.277 187212 INFO os_vif [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37')
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.329 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.329 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.329 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No VIF found with MAC fa:16:3e:51:2f:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.330 187212 INFO nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Using config drive
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.853 187212 INFO nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Creating config drive at /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.config
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.858 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mc55qvf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:53 compute-0 nova_compute[187208]: 2025-12-05 12:19:53.987 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mc55qvf" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:54 compute-0 kernel: tapddf5ec0d-37: entered promiscuous mode
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.049 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:54 compute-0 NetworkManager[55691]: <info>  [1764937194.0508] manager: (tapddf5ec0d-37): new Tun device (/org/freedesktop/NetworkManager/Devices/431)
Dec 05 12:19:54 compute-0 ovn_controller[95610]: 2025-12-05T12:19:54Z|01126|binding|INFO|Claiming lport ddf5ec0d-377b-480c-8991-738446cfb2db for this chassis.
Dec 05 12:19:54 compute-0 ovn_controller[95610]: 2025-12-05T12:19:54Z|01127|binding|INFO|ddf5ec0d-377b-480c-8991-738446cfb2db: Claiming fa:16:3e:51:2f:35 10.100.0.6
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:54 compute-0 ovn_controller[95610]: 2025-12-05T12:19:54Z|01128|binding|INFO|Setting lport ddf5ec0d-377b-480c-8991-738446cfb2db ovn-installed in OVS
Dec 05 12:19:54 compute-0 ovn_controller[95610]: 2025-12-05T12:19:54Z|01129|binding|INFO|Setting lport ddf5ec0d-377b-480c-8991-738446cfb2db up in Southbound
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.063 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:2f:35 10.100.0.6'], port_security=['fa:16:3e:51:2f:35 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7eb08f99-b40c-4ba3-9b30-6cfb447ba68d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52916d9d-eb76-4677-8333-d02c9507adbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '21c17f9f-419d-4f4a-937b-418d08c504db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ca61dc-1564-41a5-9908-f7ecfadeff73, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ddf5ec0d-377b-480c-8991-738446cfb2db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.064 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.065 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ddf5ec0d-377b-480c-8991-738446cfb2db in datapath 52916d9d-eb76-4677-8333-d02c9507adbc bound to our chassis
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.067 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 52916d9d-eb76-4677-8333-d02c9507adbc
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.077 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d159017d-58f6-43d2-b8ff-1aa7a51b11e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.078 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap52916d9d-e1 in ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.080 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap52916d9d-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.080 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[39a579f4-b627-46e9-a29d-3c865bf27ef2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.080 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1c638d23-d1fa-4810-8fda-0e3e4e016d89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 systemd-udevd[242875]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 12:19:54 compute-0 systemd-machined[153543]: New machine qemu-133-instance-0000006b.
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.093 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a5fa9f38-2434-42d3-908f-ca358903018d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 NetworkManager[55691]: <info>  [1764937194.1003] device (tapddf5ec0d-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 12:19:54 compute-0 NetworkManager[55691]: <info>  [1764937194.1018] device (tapddf5ec0d-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 12:19:54 compute-0 systemd[1]: Started Virtual Machine qemu-133-instance-0000006b.
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.113 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[60ef8f18-3f2f-457d-9d84-1f1b9146ccff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.145 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[11fdc477-1af9-4817-840d-746553ce0619]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.150 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[729c44f6-32f0-4978-932c-75eed4daecd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 NetworkManager[55691]: <info>  [1764937194.1517] manager: (tap52916d9d-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/432)
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.178 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fffb8cf7-fc42-4930-9e63-73ded0ef148e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.180 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4494d9-f766-4e13-8e6a-c02434c854a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 NetworkManager[55691]: <info>  [1764937194.2042] device (tap52916d9d-e0): carrier: link connected
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.213 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[804c074c-191c-4011-bb7a-934462bccbd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.231 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c306d941-2055-46ab-abf0-2e5d54e83a59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52916d9d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:c4:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 312], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457476, 'reachable_time': 32043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242908, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.245 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84bb564a-4e2a-4bb9-8197-89529a5539a9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:c462'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457476, 'tstamp': 457476}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242909, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.259 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6236ce24-09cd-4ec5-893d-73b870147080]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52916d9d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:c4:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 312], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457476, 'reachable_time': 32043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242910, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef487fb-f91f-45e9-b8e9-9d14773094ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.339 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fc69905e-af17-4358-b3ad-d549d6d7cec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.341 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52916d9d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.341 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.341 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52916d9d-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:54 compute-0 kernel: tap52916d9d-e0: entered promiscuous mode
Dec 05 12:19:54 compute-0 NetworkManager[55691]: <info>  [1764937194.3452] manager: (tap52916d9d-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/433)
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.345 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap52916d9d-e0, col_values=(('external_ids', {'iface-id': 'bfd2a34a-bdd5-4486-82a8-fc55b6e1020a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:54 compute-0 ovn_controller[95610]: 2025-12-05T12:19:54Z|01130|binding|INFO|Releasing lport bfd2a34a-bdd5-4486-82a8-fc55b6e1020a from this chassis (sb_readonly=0)
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.346 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.362 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.363 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.363 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b670e069-f2a8-4c8e-a29a-7104058bd6a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.364 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: global
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     log         /dev/log local0 debug
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     log-tag     haproxy-metadata-proxy-52916d9d-eb76-4677-8333-d02c9507adbc
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     user        root
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     group       root
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     maxconn     1024
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     pidfile     /var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     daemon
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: defaults
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     log global
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     mode http
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     option httplog
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     option dontlognull
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     option http-server-close
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     option forwardfor
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     retries                 3
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     timeout http-request    30s
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     timeout connect         30s
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     timeout client          32s
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     timeout server          32s
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     timeout http-keep-alive 30s
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: listen listener
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     bind 169.254.169.254:80
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:     http-request add-header X-OVN-Network-ID 52916d9d-eb76-4677-8333-d02c9507adbc
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 12:19:54 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.365 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'env', 'PROCESS_TAG=haproxy-52916d9d-eb76-4677-8333-d02c9507adbc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/52916d9d-eb76-4677-8333-d02c9507adbc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.526 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937194.5253913, 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.527 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] VM Started (Lifecycle Event)
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.680 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.686 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937194.5256512, 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.687 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] VM Paused (Lifecycle Event)
Dec 05 12:19:54 compute-0 podman[242949]: 2025-12-05 12:19:54.71129209 +0000 UTC m=+0.054488620 container create a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.722 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.726 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:54 compute-0 systemd[1]: Started libpod-conmon-a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d.scope.
Dec 05 12:19:54 compute-0 nova_compute[187208]: 2025-12-05 12:19:54.754 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:19:54 compute-0 podman[242949]: 2025-12-05 12:19:54.684510113 +0000 UTC m=+0.027706673 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 12:19:54 compute-0 systemd[1]: Started libcrun container.
Dec 05 12:19:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d28657b9bcc58b5db932e3079c4da3935dab0ca3c6598d46814aaf9161b94c14/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 12:19:54 compute-0 podman[242949]: 2025-12-05 12:19:54.796718993 +0000 UTC m=+0.139915553 container init a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:19:54 compute-0 podman[242949]: 2025-12-05 12:19:54.805114414 +0000 UTC m=+0.148310954 container start a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 12:19:54 compute-0 podman[242965]: 2025-12-05 12:19:54.814381309 +0000 UTC m=+0.062066607 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:19:54 compute-0 podman[242962]: 2025-12-05 12:19:54.824958541 +0000 UTC m=+0.074628676 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 05 12:19:54 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [NOTICE]   (243013) : New worker (243031) forked
Dec 05 12:19:54 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [NOTICE]   (243013) : Loading success.
Dec 05 12:19:54 compute-0 podman[242966]: 2025-12-05 12:19:54.858740117 +0000 UTC m=+0.097863160 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.310 187212 DEBUG nova.compute.manager [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.310 187212 DEBUG oslo_concurrency.lockutils [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.311 187212 DEBUG oslo_concurrency.lockutils [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.311 187212 DEBUG oslo_concurrency.lockutils [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.311 187212 DEBUG nova.compute.manager [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Processing event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.312 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.317 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937195.3173542, 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.318 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] VM Resumed (Lifecycle Event)
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.320 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.324 187212 INFO nova.virt.libvirt.driver [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance spawned successfully.
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.324 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.349 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.356 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.360 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.360 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.361 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.361 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.362 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.363 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.383 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.412 187212 INFO nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Took 6.54 seconds to spawn the instance on the hypervisor.
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.413 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.471 187212 INFO nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Took 7.45 seconds to build instance.
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.489 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.903 187212 DEBUG nova.network.neutron [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Updated VIF entry in instance network info cache for port ddf5ec0d-377b-480c-8991-738446cfb2db. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.903 187212 DEBUG nova.network.neutron [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Updating instance_info_cache with network_info: [{"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:19:55 compute-0 nova_compute[187208]: 2025-12-05 12:19:55.926 187212 DEBUG oslo_concurrency.lockutils [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 12:19:56 compute-0 nova_compute[187208]: 2025-12-05 12:19:56.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:19:56 compute-0 nova_compute[187208]: 2025-12-05 12:19:56.086 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:56.087 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:19:56 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:56.089 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:19:56 compute-0 nova_compute[187208]: 2025-12-05 12:19:56.824 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937181.8230302, 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:19:56 compute-0 nova_compute[187208]: 2025-12-05 12:19:56.825 187212 INFO nova.compute.manager [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] VM Stopped (Lifecycle Event)
Dec 05 12:19:56 compute-0 nova_compute[187208]: 2025-12-05 12:19:56.846 187212 DEBUG nova.compute.manager [None req-b29b46de-d7c7-4a7c-8486-0227331da15d - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:19:57 compute-0 nova_compute[187208]: 2025-12-05 12:19:57.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:19:57 compute-0 nova_compute[187208]: 2025-12-05 12:19:57.463 187212 DEBUG nova.compute.manager [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:57 compute-0 nova_compute[187208]: 2025-12-05 12:19:57.464 187212 DEBUG oslo_concurrency.lockutils [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:57 compute-0 nova_compute[187208]: 2025-12-05 12:19:57.465 187212 DEBUG oslo_concurrency.lockutils [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:57 compute-0 nova_compute[187208]: 2025-12-05 12:19:57.465 187212 DEBUG oslo_concurrency.lockutils [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:57 compute-0 nova_compute[187208]: 2025-12-05 12:19:57.466 187212 DEBUG nova.compute.manager [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] No waiting events found dispatching network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:19:57 compute-0 nova_compute[187208]: 2025-12-05 12:19:57.466 187212 WARNING nova.compute.manager [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received unexpected event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db for instance with vm_state active and task_state None.
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.088 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.162 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.222 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.223 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.282 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.286 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.438 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.439 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5409MB free_disk=73.03979873657227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.439 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.440 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.515 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.516 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.516 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.570 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.584 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.615 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.615 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.799 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.802 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.803 187212 INFO nova.compute.manager [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Terminating instance
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.804 187212 DEBUG nova.compute.manager [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 12:19:58 compute-0 kernel: tapddf5ec0d-37 (unregistering): left promiscuous mode
Dec 05 12:19:58 compute-0 NetworkManager[55691]: <info>  [1764937198.8296] device (tapddf5ec0d-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 12:19:58 compute-0 ovn_controller[95610]: 2025-12-05T12:19:58Z|01131|binding|INFO|Releasing lport ddf5ec0d-377b-480c-8991-738446cfb2db from this chassis (sb_readonly=0)
Dec 05 12:19:58 compute-0 ovn_controller[95610]: 2025-12-05T12:19:58Z|01132|binding|INFO|Setting lport ddf5ec0d-377b-480c-8991-738446cfb2db down in Southbound
Dec 05 12:19:58 compute-0 ovn_controller[95610]: 2025-12-05T12:19:58Z|01133|binding|INFO|Removing iface tapddf5ec0d-37 ovn-installed in OVS
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.840 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.848 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:2f:35 10.100.0.6'], port_security=['fa:16:3e:51:2f:35 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7eb08f99-b40c-4ba3-9b30-6cfb447ba68d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52916d9d-eb76-4677-8333-d02c9507adbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21c17f9f-419d-4f4a-937b-418d08c504db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ca61dc-1564-41a5-9908-f7ecfadeff73, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ddf5ec0d-377b-480c-8991-738446cfb2db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:19:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.850 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ddf5ec0d-377b-480c-8991-738446cfb2db in datapath 52916d9d-eb76-4677-8333-d02c9507adbc unbound from our chassis
Dec 05 12:19:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.852 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 52916d9d-eb76-4677-8333-d02c9507adbc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:19:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.854 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf133c81-ba71-44ec-82d6-1fcde3250615]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.855 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc namespace which is not needed anymore
Dec 05 12:19:58 compute-0 nova_compute[187208]: 2025-12-05 12:19:58.856 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:58 compute-0 systemd[1]: machine-qemu\x2d133\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Dec 05 12:19:58 compute-0 systemd[1]: machine-qemu\x2d133\x2dinstance\x2d0000006b.scope: Consumed 3.954s CPU time.
Dec 05 12:19:58 compute-0 systemd-machined[153543]: Machine qemu-133-instance-0000006b terminated.
Dec 05 12:19:58 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [NOTICE]   (243013) : haproxy version is 2.8.14-c23fe91
Dec 05 12:19:58 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [NOTICE]   (243013) : path to executable is /usr/sbin/haproxy
Dec 05 12:19:58 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [WARNING]  (243013) : Exiting Master process...
Dec 05 12:19:58 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [ALERT]    (243013) : Current worker (243031) exited with code 143 (Terminated)
Dec 05 12:19:58 compute-0 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [WARNING]  (243013) : All workers exited. Exiting... (0)
Dec 05 12:19:58 compute-0 systemd[1]: libpod-a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d.scope: Deactivated successfully.
Dec 05 12:19:59 compute-0 podman[243076]: 2025-12-05 12:19:59.006153018 +0000 UTC m=+0.053330267 container died a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d-userdata-shm.mount: Deactivated successfully.
Dec 05 12:19:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-d28657b9bcc58b5db932e3079c4da3935dab0ca3c6598d46814aaf9161b94c14-merged.mount: Deactivated successfully.
Dec 05 12:19:59 compute-0 podman[243076]: 2025-12-05 12:19:59.054742518 +0000 UTC m=+0.101919777 container cleanup a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 05 12:19:59 compute-0 systemd[1]: libpod-conmon-a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d.scope: Deactivated successfully.
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.090 187212 INFO nova.virt.libvirt.driver [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance destroyed successfully.
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.092 187212 DEBUG nova.objects.instance [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'resources' on Instance uuid 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.106 187212 DEBUG nova.virt.libvirt.vif [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-375351270',display_name='tempest-VolumesActionsTest-instance-375351270',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-375351270',id=107,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:19:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-g6z4cse6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:19:55Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=7eb08f99-b40c-4ba3-9b30-6cfb447ba68d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.107 187212 DEBUG nova.network.os_vif_util [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.108 187212 DEBUG nova.network.os_vif_util [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.109 187212 DEBUG os_vif [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.112 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.113 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapddf5ec0d-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.116 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.118 187212 INFO os_vif [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37')
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.118 187212 INFO nova.virt.libvirt.driver [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Deleting instance files /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d_del
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.119 187212 INFO nova.virt.libvirt.driver [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Deletion of /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d_del complete
Dec 05 12:19:59 compute-0 podman[243117]: 2025-12-05 12:19:59.126575353 +0000 UTC m=+0.043523666 container remove a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 05 12:19:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.131 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9954b2a6-7800-43a5-b865-841406270ca0]: (4, ('Fri Dec  5 12:19:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc (a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d)\na9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d\nFri Dec  5 12:19:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc (a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d)\na9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.133 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[20b9d631-0816-4570-826f-8f489feffdd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.134 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52916d9d-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.136 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:59 compute-0 kernel: tap52916d9d-e0: left promiscuous mode
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.149 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.153 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[793a252d-7e57-47f7-a47e-7674d0f7530c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a76e09e0-e9b5-4ba0-b4b9-122d3685b3e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.172 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce83d5d-20a5-4c32-a671-7d18beaedbb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.182 187212 INFO nova.compute.manager [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.182 187212 DEBUG oslo.service.loopingcall [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.182 187212 DEBUG nova.compute.manager [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.183 187212 DEBUG nova.network.neutron [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 12:19:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.189 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d294dde2-3636-4c66-acf0-e64ce6282ba8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457470, 'reachable_time': 34426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243134, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.192 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 12:19:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.192 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7ea968d1-3177-4b6f-817a-b6ee98221a3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:19:59 compute-0 systemd[1]: run-netns-ovnmeta\x2d52916d9d\x2deb76\x2d4677\x2d8333\x2dd02c9507adbc.mount: Deactivated successfully.
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.552 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.805 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-unplugged-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.806 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.806 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.807 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.807 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] No waiting events found dispatching network-vif-unplugged-ddf5ec0d-377b-480c-8991-738446cfb2db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.807 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-unplugged-ddf5ec0d-377b-480c-8991-738446cfb2db for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.807 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.808 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.808 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.808 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.808 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] No waiting events found dispatching network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 12:19:59 compute-0 nova_compute[187208]: 2025-12-05 12:19:59.809 187212 WARNING nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received unexpected event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db for instance with vm_state active and task_state deleting.
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.495 187212 DEBUG nova.network.neutron [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.512 187212 INFO nova.compute.manager [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Took 1.33 seconds to deallocate network for instance.
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.562 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.562 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.628 187212 DEBUG nova.compute.provider_tree [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.653 187212 DEBUG nova.scheduler.client.report [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.674 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.712 187212 INFO nova.scheduler.client.report [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Deleted allocations for instance 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.751 187212 DEBUG nova.compute.manager [req-186f0cf4-77ef-4cfa-9de9-83f3f1d31fa5 req-936c14f6-b75c-4464-93cd-1bf8dcc13470 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-deleted-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 12:20:00 compute-0 nova_compute[187208]: 2025-12-05 12:20:00.970 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:20:01 compute-0 nova_compute[187208]: 2025-12-05 12:20:01.611 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:02 compute-0 nova_compute[187208]: 2025-12-05 12:20:02.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:20:03.023 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:20:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:20:03.024 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:20:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:20:03.024 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:20:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:20:04.093 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:20:04 compute-0 nova_compute[187208]: 2025-12-05 12:20:04.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:04 compute-0 nova_compute[187208]: 2025-12-05 12:20:04.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:06 compute-0 podman[243135]: 2025-12-05 12:20:06.207084505 +0000 UTC m=+0.055394566 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:20:09 compute-0 nova_compute[187208]: 2025-12-05 12:20:09.120 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:09 compute-0 nova_compute[187208]: 2025-12-05 12:20:09.555 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:12 compute-0 podman[243159]: 2025-12-05 12:20:12.229959671 +0000 UTC m=+0.083990164 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 12:20:14 compute-0 nova_compute[187208]: 2025-12-05 12:20:14.088 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937199.0874865, 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 12:20:14 compute-0 nova_compute[187208]: 2025-12-05 12:20:14.088 187212 INFO nova.compute.manager [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] VM Stopped (Lifecycle Event)
Dec 05 12:20:14 compute-0 nova_compute[187208]: 2025-12-05 12:20:14.122 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:14 compute-0 nova_compute[187208]: 2025-12-05 12:20:14.150 187212 DEBUG nova.compute.manager [None req-a360d02f-62f0-4717-92f4-4248aa2efd85 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 12:20:14 compute-0 nova_compute[187208]: 2025-12-05 12:20:14.557 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:16 compute-0 podman[243180]: 2025-12-05 12:20:16.202787507 +0000 UTC m=+0.059389490 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git)
Dec 05 12:20:16 compute-0 podman[243181]: 2025-12-05 12:20:16.220697839 +0000 UTC m=+0.072990539 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 05 12:20:19 compute-0 nova_compute[187208]: 2025-12-05 12:20:19.124 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:19 compute-0 nova_compute[187208]: 2025-12-05 12:20:19.558 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:24 compute-0 nova_compute[187208]: 2025-12-05 12:20:24.127 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:24 compute-0 nova_compute[187208]: 2025-12-05 12:20:24.560 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:25 compute-0 podman[243219]: 2025-12-05 12:20:25.200974739 +0000 UTC m=+0.053219053 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:20:25 compute-0 podman[243220]: 2025-12-05 12:20:25.206994301 +0000 UTC m=+0.054495500 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:20:25 compute-0 podman[243221]: 2025-12-05 12:20:25.233759787 +0000 UTC m=+0.076928872 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:20:29 compute-0 nova_compute[187208]: 2025-12-05 12:20:29.131 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:29 compute-0 nova_compute[187208]: 2025-12-05 12:20:29.561 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:34 compute-0 nova_compute[187208]: 2025-12-05 12:20:34.135 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:34 compute-0 nova_compute[187208]: 2025-12-05 12:20:34.561 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:20:37 compute-0 podman[243282]: 2025-12-05 12:20:37.197914274 +0000 UTC m=+0.050129385 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:20:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:20:37.611 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:20:37 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:20:37.611 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:20:37 compute-0 nova_compute[187208]: 2025-12-05 12:20:37.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:39 compute-0 nova_compute[187208]: 2025-12-05 12:20:39.139 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:39 compute-0 nova_compute[187208]: 2025-12-05 12:20:39.564 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:41 compute-0 nova_compute[187208]: 2025-12-05 12:20:41.399 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:41 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:20:41.613 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:20:43 compute-0 podman[243306]: 2025-12-05 12:20:43.197399659 +0000 UTC m=+0.051704040 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 05 12:20:44 compute-0 nova_compute[187208]: 2025-12-05 12:20:44.141 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:44 compute-0 nova_compute[187208]: 2025-12-05 12:20:44.566 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:47 compute-0 podman[243328]: 2025-12-05 12:20:47.205837124 +0000 UTC m=+0.054730197 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 12:20:47 compute-0 podman[243327]: 2025-12-05 12:20:47.208767908 +0000 UTC m=+0.060204973 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public)
Dec 05 12:20:49 compute-0 nova_compute[187208]: 2025-12-05 12:20:49.144 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:49 compute-0 nova_compute[187208]: 2025-12-05 12:20:49.568 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:51 compute-0 nova_compute[187208]: 2025-12-05 12:20:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:51 compute-0 nova_compute[187208]: 2025-12-05 12:20:51.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:20:51 compute-0 nova_compute[187208]: 2025-12-05 12:20:51.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:20:51 compute-0 nova_compute[187208]: 2025-12-05 12:20:51.077 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:20:51 compute-0 nova_compute[187208]: 2025-12-05 12:20:51.077 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:51 compute-0 nova_compute[187208]: 2025-12-05 12:20:51.078 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:20:53 compute-0 nova_compute[187208]: 2025-12-05 12:20:53.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:54 compute-0 nova_compute[187208]: 2025-12-05 12:20:54.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:54 compute-0 nova_compute[187208]: 2025-12-05 12:20:54.148 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:54 compute-0 nova_compute[187208]: 2025-12-05 12:20:54.570 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:55 compute-0 nova_compute[187208]: 2025-12-05 12:20:55.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:56 compute-0 podman[243363]: 2025-12-05 12:20:56.19782976 +0000 UTC m=+0.056385954 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:20:56 compute-0 podman[243364]: 2025-12-05 12:20:56.205473149 +0000 UTC m=+0.057076824 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:20:56 compute-0 podman[243365]: 2025-12-05 12:20:56.241214011 +0000 UTC m=+0.092237279 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 12:20:57 compute-0 nova_compute[187208]: 2025-12-05 12:20:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.099 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.100 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.300 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.301 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5642MB free_disk=73.0406265258789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.301 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.302 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.401 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.402 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.456 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.471 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.505 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:20:58 compute-0 nova_compute[187208]: 2025-12-05 12:20:58.506 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:20:59 compute-0 nova_compute[187208]: 2025-12-05 12:20:59.151 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:20:59 compute-0 nova_compute[187208]: 2025-12-05 12:20:59.573 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:02 compute-0 nova_compute[187208]: 2025-12-05 12:21:02.501 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:21:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:21:03.024 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:21:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:21:03.025 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:21:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:21:03.025 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:21:04 compute-0 nova_compute[187208]: 2025-12-05 12:21:04.154 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:04 compute-0 nova_compute[187208]: 2025-12-05 12:21:04.575 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:08 compute-0 podman[243429]: 2025-12-05 12:21:08.192776778 +0000 UTC m=+0.049020884 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:21:09 compute-0 nova_compute[187208]: 2025-12-05 12:21:09.158 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:09 compute-0 nova_compute[187208]: 2025-12-05 12:21:09.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:12 compute-0 ovn_controller[95610]: 2025-12-05T12:21:12Z|01134|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 05 12:21:14 compute-0 nova_compute[187208]: 2025-12-05 12:21:14.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:14 compute-0 podman[243454]: 2025-12-05 12:21:14.221105859 +0000 UTC m=+0.067801101 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Dec 05 12:21:14 compute-0 nova_compute[187208]: 2025-12-05 12:21:14.579 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:18 compute-0 podman[243476]: 2025-12-05 12:21:18.197141806 +0000 UTC m=+0.047589572 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 12:21:18 compute-0 podman[243475]: 2025-12-05 12:21:18.204694192 +0000 UTC m=+0.057813395 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 12:21:19 compute-0 nova_compute[187208]: 2025-12-05 12:21:19.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:19 compute-0 nova_compute[187208]: 2025-12-05 12:21:19.581 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:24 compute-0 nova_compute[187208]: 2025-12-05 12:21:24.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:24 compute-0 nova_compute[187208]: 2025-12-05 12:21:24.583 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:27 compute-0 podman[243519]: 2025-12-05 12:21:27.208639132 +0000 UTC m=+0.051207256 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 12:21:27 compute-0 podman[243518]: 2025-12-05 12:21:27.208364824 +0000 UTC m=+0.059269686 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:21:27 compute-0 podman[243525]: 2025-12-05 12:21:27.265810668 +0000 UTC m=+0.095152603 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:21:29 compute-0 nova_compute[187208]: 2025-12-05 12:21:29.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:29 compute-0 nova_compute[187208]: 2025-12-05 12:21:29.584 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:34 compute-0 nova_compute[187208]: 2025-12-05 12:21:34.175 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:34 compute-0 nova_compute[187208]: 2025-12-05 12:21:34.586 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:39 compute-0 nova_compute[187208]: 2025-12-05 12:21:39.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:39 compute-0 podman[243588]: 2025-12-05 12:21:39.213941787 +0000 UTC m=+0.057222302 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:21:39 compute-0 nova_compute[187208]: 2025-12-05 12:21:39.587 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:44 compute-0 nova_compute[187208]: 2025-12-05 12:21:44.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:44 compute-0 nova_compute[187208]: 2025-12-05 12:21:44.588 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:45 compute-0 podman[243614]: 2025-12-05 12:21:45.195008037 +0000 UTC m=+0.053047863 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm)
Dec 05 12:21:49 compute-0 nova_compute[187208]: 2025-12-05 12:21:49.185 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:49 compute-0 podman[243634]: 2025-12-05 12:21:49.207576076 +0000 UTC m=+0.057256453 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 05 12:21:49 compute-0 podman[243635]: 2025-12-05 12:21:49.234719715 +0000 UTC m=+0.079401719 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 05 12:21:49 compute-0 nova_compute[187208]: 2025-12-05 12:21:49.590 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:52 compute-0 nova_compute[187208]: 2025-12-05 12:21:52.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:21:52 compute-0 nova_compute[187208]: 2025-12-05 12:21:52.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:21:52 compute-0 nova_compute[187208]: 2025-12-05 12:21:52.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:21:52 compute-0 nova_compute[187208]: 2025-12-05 12:21:52.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:21:53 compute-0 nova_compute[187208]: 2025-12-05 12:21:53.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:21:53 compute-0 nova_compute[187208]: 2025-12-05 12:21:53.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:21:54 compute-0 nova_compute[187208]: 2025-12-05 12:21:54.187 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:54 compute-0 nova_compute[187208]: 2025-12-05 12:21:54.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:55 compute-0 nova_compute[187208]: 2025-12-05 12:21:55.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:21:55 compute-0 nova_compute[187208]: 2025-12-05 12:21:55.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:21:55 compute-0 nova_compute[187208]: 2025-12-05 12:21:55.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:21:58 compute-0 nova_compute[187208]: 2025-12-05 12:21:58.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:21:58 compute-0 podman[243674]: 2025-12-05 12:21:58.201291709 +0000 UTC m=+0.058456998 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:21:58 compute-0 podman[243675]: 2025-12-05 12:21:58.220645084 +0000 UTC m=+0.074918030 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:21:58 compute-0 nova_compute[187208]: 2025-12-05 12:21:58.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:21:58.232 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:21:58 compute-0 podman[243676]: 2025-12-05 12:21:58.234216524 +0000 UTC m=+0.083751864 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:21:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:21:58.234 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.089 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.188 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.230 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.231 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5675MB free_disk=73.04093551635742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.231 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.231 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:21:59 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:21:59.236 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.311 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.312 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.337 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.362 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.363 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.363 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:21:59 compute-0 nova_compute[187208]: 2025-12-05 12:21:59.596 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:00 compute-0 nova_compute[187208]: 2025-12-05 12:22:00.364 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:22:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:02.908 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:02.909 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated
Dec 05 12:22:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:02.910 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:22:02 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:02.912 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4cec93-7abc-47fa-bf8e-f3a3f045eb29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:22:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:03.025 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:22:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:22:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:22:03 compute-0 nova_compute[187208]: 2025-12-05 12:22:03.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:22:04 compute-0 nova_compute[187208]: 2025-12-05 12:22:04.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:22:04 compute-0 nova_compute[187208]: 2025-12-05 12:22:04.192 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:04 compute-0 nova_compute[187208]: 2025-12-05 12:22:04.598 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:07.684 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:07.686 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated
Dec 05 12:22:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:07.688 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:22:07 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:07.689 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[93606769-1c4b-4d13-9f09-5dd0632681da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:22:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:08.997 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:08 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:08.999 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated
Dec 05 12:22:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:09.001 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:22:09 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:09.002 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5c4b4418-e7a8-40e9-a977-752ca4b57ec7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:22:09 compute-0 nova_compute[187208]: 2025-12-05 12:22:09.195 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:09 compute-0 nova_compute[187208]: 2025-12-05 12:22:09.599 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:10 compute-0 podman[243743]: 2025-12-05 12:22:10.200224824 +0000 UTC m=+0.051726815 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:22:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:12.816 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:12.817 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated
Dec 05 12:22:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:12.818 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:22:12 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:12.819 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[761caaf4-fd6a-46f8-87ba-c3e0feda4afd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:22:14 compute-0 nova_compute[187208]: 2025-12-05 12:22:14.198 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:14.331 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:14.332 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated
Dec 05 12:22:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:14.333 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:22:14 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:14.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8f409b-b6f1-42ad-965c-2afd5765d1d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:22:14 compute-0 nova_compute[187208]: 2025-12-05 12:22:14.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:16 compute-0 podman[243767]: 2025-12-05 12:22:16.222589017 +0000 UTC m=+0.071013228 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 05 12:22:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:18.872 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:18.873 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated
Dec 05 12:22:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:18.874 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:22:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:18.875 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7ac410-11e3-4643-b09c-fb918b5edaf9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:22:19 compute-0 nova_compute[187208]: 2025-12-05 12:22:19.202 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:19 compute-0 nova_compute[187208]: 2025-12-05 12:22:19.604 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:19.984 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:19.985 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated
Dec 05 12:22:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:19.986 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:22:19 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:19.987 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2e7b6ba2-e6c0-4290-b59f-a9d08a96a071]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:22:20 compute-0 podman[243789]: 2025-12-05 12:22:20.204325332 +0000 UTC m=+0.054113304 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:22:20 compute-0 podman[243788]: 2025-12-05 12:22:20.233819288 +0000 UTC m=+0.087278025 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 05 12:22:24 compute-0 nova_compute[187208]: 2025-12-05 12:22:24.205 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:24 compute-0 nova_compute[187208]: 2025-12-05 12:22:24.605 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:27.344 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 
'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:27.345 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated
Dec 05 12:22:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:27.346 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:22:27 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:27.347 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4d256603-6c7a-443c-a18b-89fe6eaba5f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:22:29 compute-0 podman[243828]: 2025-12-05 12:22:29.207091654 +0000 UTC m=+0.053461775 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:22:29 compute-0 nova_compute[187208]: 2025-12-05 12:22:29.208 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:29 compute-0 podman[243827]: 2025-12-05 12:22:29.235269722 +0000 UTC m=+0.086756010 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec 05 12:22:29 compute-0 podman[243829]: 2025-12-05 12:22:29.247967387 +0000 UTC m=+0.088759968 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller)
Dec 05 12:22:29 compute-0 nova_compute[187208]: 2025-12-05 12:22:29.608 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:34 compute-0 nova_compute[187208]: 2025-12-05 12:22:34.212 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:34 compute-0 nova_compute[187208]: 2025-12-05 12:22:34.610 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:22:39 compute-0 nova_compute[187208]: 2025-12-05 12:22:39.215 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:39 compute-0 nova_compute[187208]: 2025-12-05 12:22:39.611 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:41 compute-0 podman[243892]: 2025-12-05 12:22:41.190134974 +0000 UTC m=+0.048786590 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:22:44 compute-0 nova_compute[187208]: 2025-12-05 12:22:44.218 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:44 compute-0 nova_compute[187208]: 2025-12-05 12:22:44.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:47 compute-0 podman[243917]: 2025-12-05 12:22:47.209887713 +0000 UTC m=+0.062234396 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 12:22:49 compute-0 nova_compute[187208]: 2025-12-05 12:22:49.222 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:49 compute-0 nova_compute[187208]: 2025-12-05 12:22:49.614 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:51 compute-0 podman[243938]: 2025-12-05 12:22:51.197885838 +0000 UTC m=+0.051191220 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:22:51 compute-0 podman[243937]: 2025-12-05 12:22:51.20528683 +0000 UTC m=+0.061337630 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Dec 05 12:22:53 compute-0 nova_compute[187208]: 2025-12-05 12:22:53.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:22:53 compute-0 nova_compute[187208]: 2025-12-05 12:22:53.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:22:53 compute-0 nova_compute[187208]: 2025-12-05 12:22:53.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:22:53 compute-0 nova_compute[187208]: 2025-12-05 12:22:53.346 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:22:54 compute-0 nova_compute[187208]: 2025-12-05 12:22:54.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:22:54 compute-0 nova_compute[187208]: 2025-12-05 12:22:54.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:22:54 compute-0 nova_compute[187208]: 2025-12-05 12:22:54.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:54 compute-0 nova_compute[187208]: 2025-12-05 12:22:54.616 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:55 compute-0 nova_compute[187208]: 2025-12-05 12:22:55.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:22:55 compute-0 nova_compute[187208]: 2025-12-05 12:22:55.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:22:57 compute-0 nova_compute[187208]: 2025-12-05 12:22:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:22:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:57.069 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8:0:1:f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:57.071 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated
Dec 05 12:22:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:57.072 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 12:22:57 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:57.073 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c04cac78-9ed0-4948-861d-e2824475cabd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 12:22:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:58.308 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:22:58 compute-0 nova_compute[187208]: 2025-12-05 12:22:58.309 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:22:58.309 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:22:59 compute-0 nova_compute[187208]: 2025-12-05 12:22:59.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:22:59 compute-0 nova_compute[187208]: 2025-12-05 12:22:59.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:22:59 compute-0 nova_compute[187208]: 2025-12-05 12:22:59.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:00 compute-0 nova_compute[187208]: 2025-12-05 12:23:00.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:00 compute-0 podman[243976]: 2025-12-05 12:23:00.197872233 +0000 UTC m=+0.051226470 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:23:00 compute-0 podman[243975]: 2025-12-05 12:23:00.197847412 +0000 UTC m=+0.054843544 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 05 12:23:00 compute-0 podman[243977]: 2025-12-05 12:23:00.229111609 +0000 UTC m=+0.078811472 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller)
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.097 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.287 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.288 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5688MB free_disk=73.04093551635742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.288 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.288 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.351 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.352 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.389 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.404 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.405 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:23:01 compute-0 nova_compute[187208]: 2025-12-05 12:23:01.406 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:23:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:23:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:23:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:23:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:23:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:23:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:23:04 compute-0 nova_compute[187208]: 2025-12-05 12:23:04.236 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:04 compute-0 nova_compute[187208]: 2025-12-05 12:23:04.401 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:04 compute-0 nova_compute[187208]: 2025-12-05 12:23:04.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:06 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:23:06.311 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:23:09 compute-0 nova_compute[187208]: 2025-12-05 12:23:09.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:09 compute-0 nova_compute[187208]: 2025-12-05 12:23:09.624 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:12 compute-0 podman[244044]: 2025-12-05 12:23:12.199944629 +0000 UTC m=+0.053093615 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:23:14 compute-0 nova_compute[187208]: 2025-12-05 12:23:14.242 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:14 compute-0 nova_compute[187208]: 2025-12-05 12:23:14.625 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:18 compute-0 podman[244069]: 2025-12-05 12:23:18.207927023 +0000 UTC m=+0.063598186 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:23:19 compute-0 nova_compute[187208]: 2025-12-05 12:23:19.246 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:19 compute-0 nova_compute[187208]: 2025-12-05 12:23:19.626 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:22 compute-0 podman[244092]: 2025-12-05 12:23:22.196263728 +0000 UTC m=+0.049708647 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:23:22 compute-0 podman[244091]: 2025-12-05 12:23:22.196282509 +0000 UTC m=+0.052999322 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec 05 12:23:24 compute-0 nova_compute[187208]: 2025-12-05 12:23:24.250 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:24 compute-0 nova_compute[187208]: 2025-12-05 12:23:24.628 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:29 compute-0 nova_compute[187208]: 2025-12-05 12:23:29.418 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:29 compute-0 nova_compute[187208]: 2025-12-05 12:23:29.629 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:31 compute-0 podman[244131]: 2025-12-05 12:23:31.216918154 +0000 UTC m=+0.060997510 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:23:31 compute-0 podman[244132]: 2025-12-05 12:23:31.233476909 +0000 UTC m=+0.075241199 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125)
Dec 05 12:23:31 compute-0 podman[244130]: 2025-12-05 12:23:31.234327814 +0000 UTC m=+0.084650450 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:23:32 compute-0 sshd-session[244202]: Received disconnect from 193.46.255.103 port 12640:11:  [preauth]
Dec 05 12:23:32 compute-0 sshd-session[244202]: Disconnected from authenticating user root 193.46.255.103 port 12640 [preauth]
Dec 05 12:23:34 compute-0 nova_compute[187208]: 2025-12-05 12:23:34.422 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:34 compute-0 nova_compute[187208]: 2025-12-05 12:23:34.632 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:39 compute-0 nova_compute[187208]: 2025-12-05 12:23:39.425 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:39 compute-0 nova_compute[187208]: 2025-12-05 12:23:39.634 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:41 compute-0 nova_compute[187208]: 2025-12-05 12:23:41.099 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:43 compute-0 podman[244204]: 2025-12-05 12:23:43.192921821 +0000 UTC m=+0.051384005 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:23:44 compute-0 nova_compute[187208]: 2025-12-05 12:23:44.429 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:44 compute-0 nova_compute[187208]: 2025-12-05 12:23:44.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:49 compute-0 podman[244228]: 2025-12-05 12:23:49.21123089 +0000 UTC m=+0.060133646 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:23:49 compute-0 nova_compute[187208]: 2025-12-05 12:23:49.432 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:49 compute-0 nova_compute[187208]: 2025-12-05 12:23:49.637 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:53 compute-0 podman[244249]: 2025-12-05 12:23:53.202908031 +0000 UTC m=+0.047948906 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 12:23:53 compute-0 podman[244248]: 2025-12-05 12:23:53.211660842 +0000 UTC m=+0.058684664 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64)
Dec 05 12:23:54 compute-0 nova_compute[187208]: 2025-12-05 12:23:54.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:54 compute-0 nova_compute[187208]: 2025-12-05 12:23:54.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:23:54 compute-0 nova_compute[187208]: 2025-12-05 12:23:54.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:23:54 compute-0 nova_compute[187208]: 2025-12-05 12:23:54.090 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:23:54 compute-0 nova_compute[187208]: 2025-12-05 12:23:54.090 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:54 compute-0 nova_compute[187208]: 2025-12-05 12:23:54.090 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:23:54 compute-0 nova_compute[187208]: 2025-12-05 12:23:54.436 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:54 compute-0 nova_compute[187208]: 2025-12-05 12:23:54.639 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:56 compute-0 nova_compute[187208]: 2025-12-05 12:23:56.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:56 compute-0 nova_compute[187208]: 2025-12-05 12:23:56.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:57 compute-0 nova_compute[187208]: 2025-12-05 12:23:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:57 compute-0 nova_compute[187208]: 2025-12-05 12:23:57.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 12:23:57 compute-0 nova_compute[187208]: 2025-12-05 12:23:57.085 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:23:58.453 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:23:58 compute-0 nova_compute[187208]: 2025-12-05 12:23:58.454 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:58 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:23:58.454 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:23:59 compute-0 nova_compute[187208]: 2025-12-05 12:23:59.102 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:23:59 compute-0 nova_compute[187208]: 2025-12-05 12:23:59.438 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:23:59 compute-0 nova_compute[187208]: 2025-12-05 12:23:59.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:00 compute-0 nova_compute[187208]: 2025-12-05 12:24:00.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:01 compute-0 nova_compute[187208]: 2025-12-05 12:24:01.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:01 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:24:01.456 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:24:01 compute-0 anacron[216307]: Job `cron.daily' started
Dec 05 12:24:01 compute-0 anacron[216307]: Job `cron.daily' terminated
Dec 05 12:24:02 compute-0 podman[244288]: 2025-12-05 12:24:02.216874517 +0000 UTC m=+0.059828848 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:24:02 compute-0 podman[244287]: 2025-12-05 12:24:02.226502553 +0000 UTC m=+0.065928412 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:24:02 compute-0 podman[244289]: 2025-12-05 12:24:02.257794591 +0000 UTC m=+0.091320751 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 05 12:24:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:24:03.028 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:24:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:24:03.028 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:24:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:24:03.029 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.089 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.253 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.254 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5695MB free_disk=73.03999328613281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.254 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.254 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.482 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.482 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.559 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.571 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.572 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:24:03 compute-0 nova_compute[187208]: 2025-12-05 12:24:03.573 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:24:04 compute-0 nova_compute[187208]: 2025-12-05 12:24:04.442 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:04 compute-0 nova_compute[187208]: 2025-12-05 12:24:04.643 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:05 compute-0 nova_compute[187208]: 2025-12-05 12:24:05.567 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:05 compute-0 nova_compute[187208]: 2025-12-05 12:24:05.568 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:07 compute-0 nova_compute[187208]: 2025-12-05 12:24:07.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:07 compute-0 nova_compute[187208]: 2025-12-05 12:24:07.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 12:24:07 compute-0 nova_compute[187208]: 2025-12-05 12:24:07.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 12:24:09 compute-0 nova_compute[187208]: 2025-12-05 12:24:09.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:09 compute-0 nova_compute[187208]: 2025-12-05 12:24:09.646 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:14 compute-0 podman[244358]: 2025-12-05 12:24:14.209783038 +0000 UTC m=+0.052107065 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:24:14 compute-0 nova_compute[187208]: 2025-12-05 12:24:14.449 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:14 compute-0 nova_compute[187208]: 2025-12-05 12:24:14.648 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:19 compute-0 nova_compute[187208]: 2025-12-05 12:24:19.096 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:19 compute-0 nova_compute[187208]: 2025-12-05 12:24:19.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:19 compute-0 nova_compute[187208]: 2025-12-05 12:24:19.649 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:20 compute-0 podman[244382]: 2025-12-05 12:24:20.221867138 +0000 UTC m=+0.075780875 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:24:24 compute-0 podman[244402]: 2025-12-05 12:24:24.201922054 +0000 UTC m=+0.055950416 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Dec 05 12:24:24 compute-0 podman[244403]: 2025-12-05 12:24:24.202782869 +0000 UTC m=+0.052101015 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 12:24:24 compute-0 nova_compute[187208]: 2025-12-05 12:24:24.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:24 compute-0 nova_compute[187208]: 2025-12-05 12:24:24.651 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:29 compute-0 nova_compute[187208]: 2025-12-05 12:24:29.543 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:29 compute-0 nova_compute[187208]: 2025-12-05 12:24:29.653 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:33 compute-0 podman[244441]: 2025-12-05 12:24:33.198930552 +0000 UTC m=+0.052880979 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 12:24:33 compute-0 podman[244442]: 2025-12-05 12:24:33.198930452 +0000 UTC m=+0.049690307 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 12:24:33 compute-0 podman[244443]: 2025-12-05 12:24:33.226850943 +0000 UTC m=+0.075419665 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 12:24:34 compute-0 nova_compute[187208]: 2025-12-05 12:24:34.545 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:34 compute-0 nova_compute[187208]: 2025-12-05 12:24:34.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:24:39 compute-0 nova_compute[187208]: 2025-12-05 12:24:39.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:39 compute-0 nova_compute[187208]: 2025-12-05 12:24:39.656 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:44 compute-0 nova_compute[187208]: 2025-12-05 12:24:44.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:44 compute-0 podman[244511]: 2025-12-05 12:24:44.633178757 +0000 UTC m=+0.055749140 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:24:44 compute-0 nova_compute[187208]: 2025-12-05 12:24:44.658 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:49 compute-0 nova_compute[187208]: 2025-12-05 12:24:49.556 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:49 compute-0 nova_compute[187208]: 2025-12-05 12:24:49.660 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:51 compute-0 podman[244535]: 2025-12-05 12:24:51.19511435 +0000 UTC m=+0.053228498 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 05 12:24:54 compute-0 nova_compute[187208]: 2025-12-05 12:24:54.085 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:54 compute-0 nova_compute[187208]: 2025-12-05 12:24:54.085 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:24:54 compute-0 nova_compute[187208]: 2025-12-05 12:24:54.085 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:24:54 compute-0 nova_compute[187208]: 2025-12-05 12:24:54.136 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:24:54 compute-0 nova_compute[187208]: 2025-12-05 12:24:54.558 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:54 compute-0 nova_compute[187208]: 2025-12-05 12:24:54.661 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:55 compute-0 nova_compute[187208]: 2025-12-05 12:24:55.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:55 compute-0 nova_compute[187208]: 2025-12-05 12:24:55.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:24:55 compute-0 podman[244557]: 2025-12-05 12:24:55.199829014 +0000 UTC m=+0.046471514 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 05 12:24:55 compute-0 podman[244556]: 2025-12-05 12:24:55.230912506 +0000 UTC m=+0.083659151 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64)
Dec 05 12:24:57 compute-0 nova_compute[187208]: 2025-12-05 12:24:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:58 compute-0 nova_compute[187208]: 2025-12-05 12:24:58.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:59 compute-0 nova_compute[187208]: 2025-12-05 12:24:59.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:24:59 compute-0 nova_compute[187208]: 2025-12-05 12:24:59.561 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:24:59 compute-0 nova_compute[187208]: 2025-12-05 12:24:59.663 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:01 compute-0 nova_compute[187208]: 2025-12-05 12:25:01.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:25:02 compute-0 nova_compute[187208]: 2025-12-05 12:25:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:25:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:25:03.029 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:25:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:25:03.030 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:25:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:25:03.031 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:25:04 compute-0 podman[244597]: 2025-12-05 12:25:04.200245189 +0000 UTC m=+0.046565877 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.205 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.206 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.206 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.206 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:25:04 compute-0 podman[244596]: 2025-12-05 12:25:04.238246619 +0000 UTC m=+0.089493748 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 05 12:25:04 compute-0 podman[244598]: 2025-12-05 12:25:04.268083535 +0000 UTC m=+0.112027805 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.358 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.359 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5684MB free_disk=73.03991317749023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.360 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.360 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.469 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.470 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.562 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.639 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.656 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.656 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.663 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.668 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.687 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.708 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.749 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.751 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.751 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:25:04 compute-0 nova_compute[187208]: 2025-12-05 12:25:04.751 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:25:04.751 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:25:04 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:25:04.752 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:25:05 compute-0 nova_compute[187208]: 2025-12-05 12:25:05.747 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:25:09 compute-0 nova_compute[187208]: 2025-12-05 12:25:09.614 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:09 compute-0 nova_compute[187208]: 2025-12-05 12:25:09.665 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:10 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:25:10.754 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:25:14 compute-0 nova_compute[187208]: 2025-12-05 12:25:14.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:14 compute-0 nova_compute[187208]: 2025-12-05 12:25:14.667 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:15 compute-0 podman[244665]: 2025-12-05 12:25:15.194022978 +0000 UTC m=+0.052718493 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:25:19 compute-0 nova_compute[187208]: 2025-12-05 12:25:19.621 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:19 compute-0 nova_compute[187208]: 2025-12-05 12:25:19.669 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:21 compute-0 sshd-session[244691]: Connection reset by authenticating user root 45.140.17.124 port 59160 [preauth]
Dec 05 12:25:22 compute-0 sshd-session[244693]: Invalid user service from 45.140.17.124 port 59168
Dec 05 12:25:22 compute-0 podman[244695]: 2025-12-05 12:25:22.194822611 +0000 UTC m=+0.048496792 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:25:22 compute-0 sshd-session[244693]: Connection reset by invalid user service 45.140.17.124 port 59168 [preauth]
Dec 05 12:25:23 compute-0 sshd-session[244715]: Connection reset by authenticating user root 45.140.17.124 port 37296 [preauth]
Dec 05 12:25:24 compute-0 nova_compute[187208]: 2025-12-05 12:25:24.632 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:24 compute-0 nova_compute[187208]: 2025-12-05 12:25:24.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:25 compute-0 sshd-session[244717]: Invalid user ubuntu from 43.225.159.111 port 59990
Dec 05 12:25:25 compute-0 sshd-session[244717]: Received disconnect from 43.225.159.111 port 59990:11:  [preauth]
Dec 05 12:25:25 compute-0 sshd-session[244717]: Disconnected from invalid user ubuntu 43.225.159.111 port 59990 [preauth]
Dec 05 12:25:26 compute-0 podman[244721]: 2025-12-05 12:25:26.214283648 +0000 UTC m=+0.060659342 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Dec 05 12:25:26 compute-0 podman[244722]: 2025-12-05 12:25:26.22725775 +0000 UTC m=+0.075246900 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 12:25:26 compute-0 sshd-session[244719]: Connection reset by authenticating user root 45.140.17.124 port 37322 [preauth]
Dec 05 12:25:28 compute-0 sshd-session[244759]: Connection reset by authenticating user root 45.140.17.124 port 37328 [preauth]
Dec 05 12:25:29 compute-0 nova_compute[187208]: 2025-12-05 12:25:29.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:29 compute-0 nova_compute[187208]: 2025-12-05 12:25:29.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:34 compute-0 nova_compute[187208]: 2025-12-05 12:25:34.638 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:34 compute-0 nova_compute[187208]: 2025-12-05 12:25:34.672 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:35 compute-0 podman[244761]: 2025-12-05 12:25:35.198160839 +0000 UTC m=+0.054007320 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 05 12:25:35 compute-0 podman[244762]: 2025-12-05 12:25:35.204857082 +0000 UTC m=+0.055383420 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:25:35 compute-0 podman[244763]: 2025-12-05 12:25:35.227721808 +0000 UTC m=+0.076182457 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:25:39 compute-0 nova_compute[187208]: 2025-12-05 12:25:39.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:39 compute-0 nova_compute[187208]: 2025-12-05 12:25:39.673 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:44 compute-0 nova_compute[187208]: 2025-12-05 12:25:44.645 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:44 compute-0 nova_compute[187208]: 2025-12-05 12:25:44.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:46 compute-0 podman[244827]: 2025-12-05 12:25:46.200985889 +0000 UTC m=+0.048510373 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:25:49 compute-0 nova_compute[187208]: 2025-12-05 12:25:49.649 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:49 compute-0 nova_compute[187208]: 2025-12-05 12:25:49.677 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:53 compute-0 podman[244851]: 2025-12-05 12:25:53.19716427 +0000 UTC m=+0.046537066 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:25:54 compute-0 nova_compute[187208]: 2025-12-05 12:25:54.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:25:54 compute-0 nova_compute[187208]: 2025-12-05 12:25:54.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:25:54 compute-0 nova_compute[187208]: 2025-12-05 12:25:54.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:25:54 compute-0 nova_compute[187208]: 2025-12-05 12:25:54.076 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:25:54 compute-0 nova_compute[187208]: 2025-12-05 12:25:54.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:54 compute-0 nova_compute[187208]: 2025-12-05 12:25:54.679 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:55 compute-0 nova_compute[187208]: 2025-12-05 12:25:55.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:25:55 compute-0 nova_compute[187208]: 2025-12-05 12:25:55.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:25:57 compute-0 nova_compute[187208]: 2025-12-05 12:25:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:25:57 compute-0 podman[244871]: 2025-12-05 12:25:57.213648904 +0000 UTC m=+0.064142080 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Dec 05 12:25:57 compute-0 podman[244872]: 2025-12-05 12:25:57.231183949 +0000 UTC m=+0.074143788 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:25:59 compute-0 nova_compute[187208]: 2025-12-05 12:25:59.657 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:25:59 compute-0 nova_compute[187208]: 2025-12-05 12:25:59.680 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:00 compute-0 nova_compute[187208]: 2025-12-05 12:26:00.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:01 compute-0 nova_compute[187208]: 2025-12-05 12:26:01.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:02 compute-0 nova_compute[187208]: 2025-12-05 12:26:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:26:03.031 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:26:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:26:03.032 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:26:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:26:03.032 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:26:03 compute-0 nova_compute[187208]: 2025-12-05 12:26:03.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:04 compute-0 nova_compute[187208]: 2025-12-05 12:26:04.660 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:04 compute-0 nova_compute[187208]: 2025-12-05 12:26:04.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.097 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.258 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.259 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5707MB free_disk=73.04000854492188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.259 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.260 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.343 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.344 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.379 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.398 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.400 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:26:05 compute-0 nova_compute[187208]: 2025-12-05 12:26:05.400 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:26:06 compute-0 podman[244912]: 2025-12-05 12:26:06.24187057 +0000 UTC m=+0.056585792 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 05 12:26:06 compute-0 podman[244913]: 2025-12-05 12:26:06.26306466 +0000 UTC m=+0.075761564 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:26:06 compute-0 podman[244914]: 2025-12-05 12:26:06.29600709 +0000 UTC m=+0.104682128 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 12:26:09 compute-0 nova_compute[187208]: 2025-12-05 12:26:09.663 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:09 compute-0 nova_compute[187208]: 2025-12-05 12:26:09.683 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:10 compute-0 nova_compute[187208]: 2025-12-05 12:26:10.396 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:14 compute-0 nova_compute[187208]: 2025-12-05 12:26:14.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:14 compute-0 nova_compute[187208]: 2025-12-05 12:26:14.684 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:17 compute-0 podman[244981]: 2025-12-05 12:26:17.218966469 +0000 UTC m=+0.070235155 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 12:26:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:26:18.086 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:26:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:26:18.086 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:26:18 compute-0 nova_compute[187208]: 2025-12-05 12:26:18.088 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:19 compute-0 nova_compute[187208]: 2025-12-05 12:26:19.678 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:19 compute-0 nova_compute[187208]: 2025-12-05 12:26:19.685 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:24 compute-0 podman[245006]: 2025-12-05 12:26:24.204977893 +0000 UTC m=+0.056342385 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:26:24 compute-0 nova_compute[187208]: 2025-12-05 12:26:24.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:24 compute-0 nova_compute[187208]: 2025-12-05 12:26:24.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:26:28.089 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:26:28 compute-0 podman[245026]: 2025-12-05 12:26:28.228515553 +0000 UTC m=+0.081562922 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec 05 12:26:28 compute-0 podman[245027]: 2025-12-05 12:26:28.228526223 +0000 UTC m=+0.078196695 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 05 12:26:29 compute-0 nova_compute[187208]: 2025-12-05 12:26:29.684 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:29 compute-0 nova_compute[187208]: 2025-12-05 12:26:29.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:34 compute-0 nova_compute[187208]: 2025-12-05 12:26:34.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:34 compute-0 nova_compute[187208]: 2025-12-05 12:26:34.688 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:26:37 compute-0 podman[245067]: 2025-12-05 12:26:37.193944567 +0000 UTC m=+0.047821399 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:26:37 compute-0 podman[245066]: 2025-12-05 12:26:37.194306378 +0000 UTC m=+0.050541858 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Dec 05 12:26:37 compute-0 podman[245068]: 2025-12-05 12:26:37.233991301 +0000 UTC m=+0.083848067 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:26:39 compute-0 nova_compute[187208]: 2025-12-05 12:26:39.690 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:26:39 compute-0 nova_compute[187208]: 2025-12-05 12:26:39.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:26:39 compute-0 nova_compute[187208]: 2025-12-05 12:26:39.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:26:39 compute-0 nova_compute[187208]: 2025-12-05 12:26:39.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:26:39 compute-0 nova_compute[187208]: 2025-12-05 12:26:39.824 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:39 compute-0 nova_compute[187208]: 2025-12-05 12:26:39.824 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:26:44 compute-0 nova_compute[187208]: 2025-12-05 12:26:44.825 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:26:48 compute-0 podman[245134]: 2025-12-05 12:26:48.195925263 +0000 UTC m=+0.051032561 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:26:49 compute-0 nova_compute[187208]: 2025-12-05 12:26:49.827 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:26:54 compute-0 nova_compute[187208]: 2025-12-05 12:26:54.828 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:26:54 compute-0 nova_compute[187208]: 2025-12-05 12:26:54.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:54 compute-0 nova_compute[187208]: 2025-12-05 12:26:54.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:26:54 compute-0 nova_compute[187208]: 2025-12-05 12:26:54.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:26:54 compute-0 nova_compute[187208]: 2025-12-05 12:26:54.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:26:54 compute-0 nova_compute[187208]: 2025-12-05 12:26:54.831 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:55 compute-0 nova_compute[187208]: 2025-12-05 12:26:55.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:55 compute-0 nova_compute[187208]: 2025-12-05 12:26:55.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:26:55 compute-0 nova_compute[187208]: 2025-12-05 12:26:55.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:26:55 compute-0 nova_compute[187208]: 2025-12-05 12:26:55.077 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:26:55 compute-0 podman[245158]: 2025-12-05 12:26:55.206810893 +0000 UTC m=+0.060459644 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:26:56 compute-0 nova_compute[187208]: 2025-12-05 12:26:56.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:56 compute-0 nova_compute[187208]: 2025-12-05 12:26:56.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:26:57 compute-0 nova_compute[187208]: 2025-12-05 12:26:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:26:59 compute-0 podman[245180]: 2025-12-05 12:26:59.217602366 +0000 UTC m=+0.058193728 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 05 12:26:59 compute-0 podman[245179]: 2025-12-05 12:26:59.238412356 +0000 UTC m=+0.075970621 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 05 12:26:59 compute-0 nova_compute[187208]: 2025-12-05 12:26:59.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:26:59 compute-0 nova_compute[187208]: 2025-12-05 12:26:59.832 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:00 compute-0 nova_compute[187208]: 2025-12-05 12:27:00.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:27:01 compute-0 nova_compute[187208]: 2025-12-05 12:27:01.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:27:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:27:03.032 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:27:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:27:03.033 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:27:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:27:03.033 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:27:03 compute-0 nova_compute[187208]: 2025-12-05 12:27:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:27:04 compute-0 nova_compute[187208]: 2025-12-05 12:27:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:27:04 compute-0 nova_compute[187208]: 2025-12-05 12:27:04.834 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:27:04 compute-0 nova_compute[187208]: 2025-12-05 12:27:04.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:27:04 compute-0 nova_compute[187208]: 2025-12-05 12:27:04.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:27:04 compute-0 nova_compute[187208]: 2025-12-05 12:27:04.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:04 compute-0 nova_compute[187208]: 2025-12-05 12:27:04.877 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:04 compute-0 nova_compute[187208]: 2025-12-05 12:27:04.878 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:06 compute-0 nova_compute[187208]: 2025-12-05 12:27:06.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.094 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.094 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.094 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.231 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.232 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5708MB free_disk=73.04000854492188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.233 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.233 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.285 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.285 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.305 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.319 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.321 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:27:07 compute-0 nova_compute[187208]: 2025-12-05 12:27:07.321 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:27:08 compute-0 podman[245219]: 2025-12-05 12:27:08.222963101 +0000 UTC m=+0.058613240 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:27:08 compute-0 podman[245218]: 2025-12-05 12:27:08.238287653 +0000 UTC m=+0.084884637 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 12:27:08 compute-0 podman[245223]: 2025-12-05 12:27:08.251199965 +0000 UTC m=+0.086558526 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 12:27:09 compute-0 nova_compute[187208]: 2025-12-05 12:27:09.878 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:27:09 compute-0 nova_compute[187208]: 2025-12-05 12:27:09.879 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:09 compute-0 nova_compute[187208]: 2025-12-05 12:27:09.879 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:27:09 compute-0 nova_compute[187208]: 2025-12-05 12:27:09.879 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:09 compute-0 nova_compute[187208]: 2025-12-05 12:27:09.880 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:09 compute-0 nova_compute[187208]: 2025-12-05 12:27:09.881 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:14 compute-0 nova_compute[187208]: 2025-12-05 12:27:14.883 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:27:14 compute-0 nova_compute[187208]: 2025-12-05 12:27:14.885 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:27:14 compute-0 nova_compute[187208]: 2025-12-05 12:27:14.885 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:27:14 compute-0 nova_compute[187208]: 2025-12-05 12:27:14.886 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:14 compute-0 nova_compute[187208]: 2025-12-05 12:27:14.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:14 compute-0 nova_compute[187208]: 2025-12-05 12:27:14.908 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:19 compute-0 podman[245288]: 2025-12-05 12:27:19.197534606 +0000 UTC m=+0.045781331 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:27:19 compute-0 nova_compute[187208]: 2025-12-05 12:27:19.909 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:24 compute-0 nova_compute[187208]: 2025-12-05 12:27:24.910 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:27:24 compute-0 nova_compute[187208]: 2025-12-05 12:27:24.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:27:24 compute-0 nova_compute[187208]: 2025-12-05 12:27:24.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:27:24 compute-0 nova_compute[187208]: 2025-12-05 12:27:24.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:24 compute-0 nova_compute[187208]: 2025-12-05 12:27:24.967 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:24 compute-0 nova_compute[187208]: 2025-12-05 12:27:24.968 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:27:25.075 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:27:25 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:27:25.076 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:27:25 compute-0 nova_compute[187208]: 2025-12-05 12:27:25.077 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:26 compute-0 podman[245312]: 2025-12-05 12:27:26.187991226 +0000 UTC m=+0.046542582 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:27:29 compute-0 nova_compute[187208]: 2025-12-05 12:27:29.968 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:30 compute-0 podman[245334]: 2025-12-05 12:27:30.208086208 +0000 UTC m=+0.055403258 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 05 12:27:30 compute-0 podman[245333]: 2025-12-05 12:27:30.208286423 +0000 UTC m=+0.058109795 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, io.openshift.expose-services=)
Dec 05 12:27:32 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:27:32.078 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:27:34 compute-0 nova_compute[187208]: 2025-12-05 12:27:34.969 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:39 compute-0 podman[245374]: 2025-12-05 12:27:39.201089797 +0000 UTC m=+0.048972022 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:27:39 compute-0 podman[245373]: 2025-12-05 12:27:39.205238096 +0000 UTC m=+0.057762466 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 12:27:39 compute-0 podman[245375]: 2025-12-05 12:27:39.23660404 +0000 UTC m=+0.082038515 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 12:27:39 compute-0 nova_compute[187208]: 2025-12-05 12:27:39.970 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:44 compute-0 nova_compute[187208]: 2025-12-05 12:27:44.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:49 compute-0 nova_compute[187208]: 2025-12-05 12:27:49.974 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:27:49 compute-0 nova_compute[187208]: 2025-12-05 12:27:49.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:27:49 compute-0 nova_compute[187208]: 2025-12-05 12:27:49.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:27:49 compute-0 nova_compute[187208]: 2025-12-05 12:27:49.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:49 compute-0 nova_compute[187208]: 2025-12-05 12:27:49.993 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:49 compute-0 nova_compute[187208]: 2025-12-05 12:27:49.994 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:27:50 compute-0 podman[245436]: 2025-12-05 12:27:50.195847153 +0000 UTC m=+0.049659052 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:27:54 compute-0 nova_compute[187208]: 2025-12-05 12:27:54.995 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:54 compute-0 nova_compute[187208]: 2025-12-05 12:27:54.998 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:27:57 compute-0 podman[245463]: 2025-12-05 12:27:57.241429893 +0000 UTC m=+0.087849953 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Dec 05 12:27:57 compute-0 nova_compute[187208]: 2025-12-05 12:27:57.322 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:27:57 compute-0 nova_compute[187208]: 2025-12-05 12:27:57.323 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:27:57 compute-0 nova_compute[187208]: 2025-12-05 12:27:57.323 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:27:57 compute-0 nova_compute[187208]: 2025-12-05 12:27:57.495 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:27:58 compute-0 nova_compute[187208]: 2025-12-05 12:27:58.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:27:58 compute-0 nova_compute[187208]: 2025-12-05 12:27:58.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:27:59 compute-0 nova_compute[187208]: 2025-12-05 12:27:59.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:27:59 compute-0 nova_compute[187208]: 2025-12-05 12:27:59.998 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:28:01 compute-0 podman[245485]: 2025-12-05 12:28:01.197993893 +0000 UTC m=+0.047211671 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 05 12:28:01 compute-0 podman[245484]: 2025-12-05 12:28:01.208358322 +0000 UTC m=+0.060866195 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Dec 05 12:28:02 compute-0 nova_compute[187208]: 2025-12-05 12:28:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:28:02 compute-0 nova_compute[187208]: 2025-12-05 12:28:02.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:28:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:28:03.033 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:28:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:28:03.034 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:28:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:28:03.034 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:28:04 compute-0 nova_compute[187208]: 2025-12-05 12:28:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:28:05 compute-0 nova_compute[187208]: 2025-12-05 12:28:05.000 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:28:05 compute-0 nova_compute[187208]: 2025-12-05 12:28:05.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:05 compute-0 nova_compute[187208]: 2025-12-05 12:28:05.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:28:05 compute-0 nova_compute[187208]: 2025-12-05 12:28:05.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:28:05 compute-0 nova_compute[187208]: 2025-12-05 12:28:05.003 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:28:05 compute-0 nova_compute[187208]: 2025-12-05 12:28:05.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:05 compute-0 nova_compute[187208]: 2025-12-05 12:28:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:28:07 compute-0 nova_compute[187208]: 2025-12-05 12:28:07.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.089 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.245 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.246 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5717MB free_disk=73.04000854492188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.246 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.246 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.310 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.310 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.331 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.345 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.346 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:28:09 compute-0 nova_compute[187208]: 2025-12-05 12:28:09.347 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:28:10 compute-0 nova_compute[187208]: 2025-12-05 12:28:10.003 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:10 compute-0 podman[245525]: 2025-12-05 12:28:10.19926066 +0000 UTC m=+0.051099104 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:28:10 compute-0 podman[245524]: 2025-12-05 12:28:10.20617284 +0000 UTC m=+0.059883078 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 12:28:10 compute-0 podman[245526]: 2025-12-05 12:28:10.264959704 +0000 UTC m=+0.113311947 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 12:28:11 compute-0 nova_compute[187208]: 2025-12-05 12:28:11.343 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:28:15 compute-0 nova_compute[187208]: 2025-12-05 12:28:15.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:20 compute-0 nova_compute[187208]: 2025-12-05 12:28:20.006 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:21 compute-0 podman[245593]: 2025-12-05 12:28:21.191176534 +0000 UTC m=+0.049671132 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:28:25 compute-0 nova_compute[187208]: 2025-12-05 12:28:25.008 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:28 compute-0 podman[245619]: 2025-12-05 12:28:28.214692577 +0000 UTC m=+0.067289820 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:28:30 compute-0 nova_compute[187208]: 2025-12-05 12:28:30.010 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:32 compute-0 podman[245639]: 2025-12-05 12:28:32.203226619 +0000 UTC m=+0.057466347 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 05 12:28:32 compute-0 podman[245640]: 2025-12-05 12:28:32.204885437 +0000 UTC m=+0.055122199 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 12:28:35 compute-0 nova_compute[187208]: 2025-12-05 12:28:35.011 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:28:40 compute-0 nova_compute[187208]: 2025-12-05 12:28:40.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:41 compute-0 podman[245680]: 2025-12-05 12:28:41.213090305 +0000 UTC m=+0.057681763 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:28:41 compute-0 podman[245679]: 2025-12-05 12:28:41.233549095 +0000 UTC m=+0.086625177 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible)
Dec 05 12:28:41 compute-0 podman[245684]: 2025-12-05 12:28:41.255064135 +0000 UTC m=+0.094995059 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 12:28:45 compute-0 nova_compute[187208]: 2025-12-05 12:28:45.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:28:45 compute-0 nova_compute[187208]: 2025-12-05 12:28:45.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:28:45 compute-0 nova_compute[187208]: 2025-12-05 12:28:45.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:28:45 compute-0 nova_compute[187208]: 2025-12-05 12:28:45.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:28:45 compute-0 nova_compute[187208]: 2025-12-05 12:28:45.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:45 compute-0 nova_compute[187208]: 2025-12-05 12:28:45.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:28:50 compute-0 nova_compute[187208]: 2025-12-05 12:28:50.055 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:28:50 compute-0 nova_compute[187208]: 2025-12-05 12:28:50.056 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:50 compute-0 nova_compute[187208]: 2025-12-05 12:28:50.056 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:28:50 compute-0 nova_compute[187208]: 2025-12-05 12:28:50.056 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:28:50 compute-0 nova_compute[187208]: 2025-12-05 12:28:50.057 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:28:52 compute-0 podman[245749]: 2025-12-05 12:28:52.195538928 +0000 UTC m=+0.051113014 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:28:55 compute-0 nova_compute[187208]: 2025-12-05 12:28:55.059 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:28:55 compute-0 sshd-session[245774]: Accepted publickey for zuul from 192.168.122.10 port 42888 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 12:28:55 compute-0 systemd-logind[792]: New session 26 of user zuul.
Dec 05 12:28:55 compute-0 systemd[1]: Started Session 26 of User zuul.
Dec 05 12:28:55 compute-0 sshd-session[245774]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 12:28:55 compute-0 sudo[245778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 05 12:28:55 compute-0 sudo[245778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 12:28:58 compute-0 nova_compute[187208]: 2025-12-05 12:28:58.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:28:58 compute-0 nova_compute[187208]: 2025-12-05 12:28:58.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:28:58 compute-0 podman[245915]: 2025-12-05 12:28:58.34939467 +0000 UTC m=+0.066233230 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 05 12:28:59 compute-0 nova_compute[187208]: 2025-12-05 12:28:59.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:28:59 compute-0 nova_compute[187208]: 2025-12-05 12:28:59.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:28:59 compute-0 nova_compute[187208]: 2025-12-05 12:28:59.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:28:59 compute-0 nova_compute[187208]: 2025-12-05 12:28:59.074 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:29:00 compute-0 nova_compute[187208]: 2025-12-05 12:29:00.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:00 compute-0 nova_compute[187208]: 2025-12-05 12:29:00.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 12:29:00 compute-0 nova_compute[187208]: 2025-12-05 12:29:00.061 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:00 compute-0 ovs-vsctl[245967]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 05 12:29:01 compute-0 nova_compute[187208]: 2025-12-05 12:29:01.072 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:01 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 245802 (sos)
Dec 05 12:29:01 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 05 12:29:01 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 05 12:29:01 compute-0 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 05 12:29:01 compute-0 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 05 12:29:01 compute-0 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 05 12:29:02 compute-0 nova_compute[187208]: 2025-12-05 12:29:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:02 compute-0 podman[246184]: 2025-12-05 12:29:02.324865155 +0000 UTC m=+0.068601448 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 05 12:29:02 compute-0 podman[246177]: 2025-12-05 12:29:02.333250657 +0000 UTC m=+0.076648070 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Dec 05 12:29:02 compute-0 crontab[246415]: (root) LIST (root)
Dec 05 12:29:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:29:03.034 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:29:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:29:03.035 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:29:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:29:03.036 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:29:03 compute-0 nova_compute[187208]: 2025-12-05 12:29:03.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:05 compute-0 nova_compute[187208]: 2025-12-05 12:29:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:05 compute-0 nova_compute[187208]: 2025-12-05 12:29:05.062 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:05 compute-0 systemd[1]: Starting Hostname Service...
Dec 05 12:29:05 compute-0 systemd[1]: Started Hostname Service.
Dec 05 12:29:06 compute-0 nova_compute[187208]: 2025-12-05 12:29:06.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:08 compute-0 nova_compute[187208]: 2025-12-05 12:29:08.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:08 compute-0 nova_compute[187208]: 2025-12-05 12:29:08.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:08 compute-0 nova_compute[187208]: 2025-12-05 12:29:08.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 12:29:08 compute-0 nova_compute[187208]: 2025-12-05 12:29:08.129 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 12:29:09 compute-0 nova_compute[187208]: 2025-12-05 12:29:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.063 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.066 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.081 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.114 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.115 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.115 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.115 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.266 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.267 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5327MB free_disk=72.69318771362305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.268 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.268 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.508 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.508 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.581 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:29:10 compute-0 nova_compute[187208]: 2025-12-05 12:29:10.655 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:29:12 compute-0 nova_compute[187208]: 2025-12-05 12:29:12.157 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:29:12 compute-0 nova_compute[187208]: 2025-12-05 12:29:12.157 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:29:12 compute-0 podman[247392]: 2025-12-05 12:29:12.218049526 +0000 UTC m=+0.067874117 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:29:12 compute-0 podman[247391]: 2025-12-05 12:29:12.223881094 +0000 UTC m=+0.073974812 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:29:12 compute-0 podman[247394]: 2025-12-05 12:29:12.24350775 +0000 UTC m=+0.090301833 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Dec 05 12:29:13 compute-0 ovs-appctl[247820]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 05 12:29:13 compute-0 ovs-appctl[247824]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 05 12:29:13 compute-0 ovs-appctl[247828]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 05 12:29:15 compute-0 nova_compute[187208]: 2025-12-05 12:29:15.065 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:15 compute-0 nova_compute[187208]: 2025-12-05 12:29:15.068 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:20 compute-0 nova_compute[187208]: 2025-12-05 12:29:20.069 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:20 compute-0 nova_compute[187208]: 2025-12-05 12:29:20.071 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:20 compute-0 nova_compute[187208]: 2025-12-05 12:29:20.071 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:29:20 compute-0 nova_compute[187208]: 2025-12-05 12:29:20.071 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:29:20 compute-0 nova_compute[187208]: 2025-12-05 12:29:20.108 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:20 compute-0 nova_compute[187208]: 2025-12-05 12:29:20.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:29:20 compute-0 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 05 12:29:22 compute-0 sshd-session[249023]: Received disconnect from 43.225.159.82 port 35730:11:  [preauth]
Dec 05 12:29:22 compute-0 sshd-session[249023]: Disconnected from authenticating user root 43.225.159.82 port 35730 [preauth]
Dec 05 12:29:22 compute-0 podman[249153]: 2025-12-05 12:29:22.301393958 +0000 UTC m=+0.060445193 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 12:29:22 compute-0 systemd[1]: Starting Time & Date Service...
Dec 05 12:29:22 compute-0 systemd[1]: Started Time & Date Service.
Dec 05 12:29:25 compute-0 nova_compute[187208]: 2025-12-05 12:29:25.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:28 compute-0 podman[249217]: 2025-12-05 12:29:28.497931548 +0000 UTC m=+0.066438795 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 12:29:30 compute-0 nova_compute[187208]: 2025-12-05 12:29:30.111 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:30 compute-0 nova_compute[187208]: 2025-12-05 12:29:30.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:30 compute-0 nova_compute[187208]: 2025-12-05 12:29:30.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:29:30 compute-0 nova_compute[187208]: 2025-12-05 12:29:30.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:29:30 compute-0 nova_compute[187208]: 2025-12-05 12:29:30.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:29:30 compute-0 nova_compute[187208]: 2025-12-05 12:29:30.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:32 compute-0 podman[249239]: 2025-12-05 12:29:32.538688565 +0000 UTC m=+0.059539947 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:29:32 compute-0 podman[249238]: 2025-12-05 12:29:32.545505271 +0000 UTC m=+0.068336920 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git)
Dec 05 12:29:35 compute-0 nova_compute[187208]: 2025-12-05 12:29:35.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:35 compute-0 nova_compute[187208]: 2025-12-05 12:29:35.116 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:40 compute-0 nova_compute[187208]: 2025-12-05 12:29:40.116 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:40 compute-0 nova_compute[187208]: 2025-12-05 12:29:40.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:40 compute-0 nova_compute[187208]: 2025-12-05 12:29:40.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:29:40 compute-0 nova_compute[187208]: 2025-12-05 12:29:40.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:29:40 compute-0 nova_compute[187208]: 2025-12-05 12:29:40.119 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:29:40 compute-0 nova_compute[187208]: 2025-12-05 12:29:40.120 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:42 compute-0 podman[249277]: 2025-12-05 12:29:42.613514131 +0000 UTC m=+0.062425200 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:29:42 compute-0 podman[249278]: 2025-12-05 12:29:42.622227762 +0000 UTC m=+0.066187159 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:29:42 compute-0 podman[249279]: 2025-12-05 12:29:42.684346202 +0000 UTC m=+0.120566746 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 12:29:45 compute-0 nova_compute[187208]: 2025-12-05 12:29:45.121 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:45 compute-0 nova_compute[187208]: 2025-12-05 12:29:45.124 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:49 compute-0 sudo[245778]: pam_unix(sudo:session): session closed for user root
Dec 05 12:29:49 compute-0 sshd-session[245777]: Received disconnect from 192.168.122.10 port 42888:11: disconnected by user
Dec 05 12:29:49 compute-0 sshd-session[245777]: Disconnected from user zuul 192.168.122.10 port 42888
Dec 05 12:29:49 compute-0 sshd-session[245774]: pam_unix(sshd:session): session closed for user zuul
Dec 05 12:29:49 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Dec 05 12:29:49 compute-0 systemd[1]: session-26.scope: Consumed 1min 26.231s CPU time, 808.3M memory peak, read 324.0M from disk, written 31.3M to disk.
Dec 05 12:29:49 compute-0 systemd-logind[792]: Session 26 logged out. Waiting for processes to exit.
Dec 05 12:29:49 compute-0 systemd-logind[792]: Removed session 26.
Dec 05 12:29:49 compute-0 sshd-session[249343]: Accepted publickey for zuul from 192.168.122.10 port 38238 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 12:29:49 compute-0 systemd-logind[792]: New session 27 of user zuul.
Dec 05 12:29:49 compute-0 systemd[1]: Started Session 27 of User zuul.
Dec 05 12:29:49 compute-0 sshd-session[249343]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 12:29:50 compute-0 nova_compute[187208]: 2025-12-05 12:29:50.126 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:50 compute-0 nova_compute[187208]: 2025-12-05 12:29:50.129 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:50 compute-0 sudo[249347]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-12-05-xeomrjx.tar.xz
Dec 05 12:29:50 compute-0 sudo[249347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 12:29:50 compute-0 sudo[249347]: pam_unix(sudo:session): session closed for user root
Dec 05 12:29:50 compute-0 sshd-session[249346]: Received disconnect from 192.168.122.10 port 38238:11: disconnected by user
Dec 05 12:29:50 compute-0 sshd-session[249346]: Disconnected from user zuul 192.168.122.10 port 38238
Dec 05 12:29:50 compute-0 sshd-session[249343]: pam_unix(sshd:session): session closed for user zuul
Dec 05 12:29:50 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Dec 05 12:29:50 compute-0 systemd-logind[792]: Session 27 logged out. Waiting for processes to exit.
Dec 05 12:29:50 compute-0 systemd-logind[792]: Removed session 27.
Dec 05 12:29:50 compute-0 sshd-session[249372]: Accepted publickey for zuul from 192.168.122.10 port 38244 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 12:29:50 compute-0 systemd-logind[792]: New session 28 of user zuul.
Dec 05 12:29:50 compute-0 systemd[1]: Started Session 28 of User zuul.
Dec 05 12:29:50 compute-0 sshd-session[249372]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 12:29:50 compute-0 sudo[249376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Dec 05 12:29:50 compute-0 sudo[249376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 12:29:50 compute-0 sudo[249376]: pam_unix(sudo:session): session closed for user root
Dec 05 12:29:50 compute-0 sshd-session[249375]: Received disconnect from 192.168.122.10 port 38244:11: disconnected by user
Dec 05 12:29:50 compute-0 sshd-session[249375]: Disconnected from user zuul 192.168.122.10 port 38244
Dec 05 12:29:50 compute-0 sshd-session[249372]: pam_unix(sshd:session): session closed for user zuul
Dec 05 12:29:50 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Dec 05 12:29:50 compute-0 systemd-logind[792]: Session 28 logged out. Waiting for processes to exit.
Dec 05 12:29:50 compute-0 systemd-logind[792]: Removed session 28.
Dec 05 12:29:52 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 12:29:52 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 05 12:29:52 compute-0 podman[249401]: 2025-12-05 12:29:52.795046359 +0000 UTC m=+0.066411855 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:29:55 compute-0 nova_compute[187208]: 2025-12-05 12:29:55.129 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:29:55 compute-0 nova_compute[187208]: 2025-12-05 12:29:55.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:29:55 compute-0 nova_compute[187208]: 2025-12-05 12:29:55.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:29:55 compute-0 nova_compute[187208]: 2025-12-05 12:29:55.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:29:55 compute-0 nova_compute[187208]: 2025-12-05 12:29:55.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:29:59 compute-0 podman[249432]: 2025-12-05 12:29:59.206184946 +0000 UTC m=+0.059471185 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 05 12:30:00 compute-0 nova_compute[187208]: 2025-12-05 12:30:00.132 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:00 compute-0 nova_compute[187208]: 2025-12-05 12:30:00.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:00 compute-0 nova_compute[187208]: 2025-12-05 12:30:00.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:30:00 compute-0 nova_compute[187208]: 2025-12-05 12:30:00.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:00 compute-0 nova_compute[187208]: 2025-12-05 12:30:00.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:00 compute-0 nova_compute[187208]: 2025-12-05 12:30:00.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:01 compute-0 nova_compute[187208]: 2025-12-05 12:30:01.136 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:01 compute-0 nova_compute[187208]: 2025-12-05 12:30:01.137 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:30:01 compute-0 nova_compute[187208]: 2025-12-05 12:30:01.137 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:30:01 compute-0 nova_compute[187208]: 2025-12-05 12:30:01.159 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:30:01 compute-0 nova_compute[187208]: 2025-12-05 12:30:01.159 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:01 compute-0 nova_compute[187208]: 2025-12-05 12:30:01.159 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:30:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:30:03.036 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:30:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:30:03.037 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:30:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:30:03.037 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:30:03 compute-0 nova_compute[187208]: 2025-12-05 12:30:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:03 compute-0 nova_compute[187208]: 2025-12-05 12:30:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:03 compute-0 podman[249455]: 2025-12-05 12:30:03.205455497 +0000 UTC m=+0.055460189 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 05 12:30:03 compute-0 podman[249454]: 2025-12-05 12:30:03.205353154 +0000 UTC m=+0.057811677 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, vcs-type=git, architecture=x86_64, config_id=edpm)
Dec 05 12:30:04 compute-0 nova_compute[187208]: 2025-12-05 12:30:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:05 compute-0 nova_compute[187208]: 2025-12-05 12:30:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:05 compute-0 nova_compute[187208]: 2025-12-05 12:30:05.162 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:05 compute-0 nova_compute[187208]: 2025-12-05 12:30:05.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:05 compute-0 nova_compute[187208]: 2025-12-05 12:30:05.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:30:05 compute-0 nova_compute[187208]: 2025-12-05 12:30:05.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:05 compute-0 nova_compute[187208]: 2025-12-05 12:30:05.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:08 compute-0 nova_compute[187208]: 2025-12-05 12:30:08.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:08 compute-0 nova_compute[187208]: 2025-12-05 12:30:08.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:10 compute-0 nova_compute[187208]: 2025-12-05 12:30:10.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:11 compute-0 nova_compute[187208]: 2025-12-05 12:30:11.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.090 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.090 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.235 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.236 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5664MB free_disk=73.04059600830078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.236 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.236 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.479 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.480 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.503 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.523 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.523 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.547 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.569 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.592 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.608 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.631 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:30:12 compute-0 nova_compute[187208]: 2025-12-05 12:30:12.632 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:30:13 compute-0 podman[249496]: 2025-12-05 12:30:13.198655945 +0000 UTC m=+0.047911820 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:30:13 compute-0 podman[249495]: 2025-12-05 12:30:13.198808439 +0000 UTC m=+0.050851244 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 12:30:13 compute-0 podman[249497]: 2025-12-05 12:30:13.24225142 +0000 UTC m=+0.087258524 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:30:15 compute-0 nova_compute[187208]: 2025-12-05 12:30:15.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:15 compute-0 nova_compute[187208]: 2025-12-05 12:30:15.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:15 compute-0 nova_compute[187208]: 2025-12-05 12:30:15.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:30:15 compute-0 nova_compute[187208]: 2025-12-05 12:30:15.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:15 compute-0 nova_compute[187208]: 2025-12-05 12:30:15.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:15 compute-0 nova_compute[187208]: 2025-12-05 12:30:15.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:20 compute-0 nova_compute[187208]: 2025-12-05 12:30:20.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:20 compute-0 nova_compute[187208]: 2025-12-05 12:30:20.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:23 compute-0 podman[249562]: 2025-12-05 12:30:23.192833348 +0000 UTC m=+0.051653767 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 12:30:25 compute-0 nova_compute[187208]: 2025-12-05 12:30:25.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:30 compute-0 nova_compute[187208]: 2025-12-05 12:30:30.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:30 compute-0 nova_compute[187208]: 2025-12-05 12:30:30.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:30 compute-0 nova_compute[187208]: 2025-12-05 12:30:30.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:30:30 compute-0 nova_compute[187208]: 2025-12-05 12:30:30.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:30 compute-0 nova_compute[187208]: 2025-12-05 12:30:30.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:30 compute-0 nova_compute[187208]: 2025-12-05 12:30:30.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:30 compute-0 podman[249586]: 2025-12-05 12:30:30.192967967 +0000 UTC m=+0.047176880 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:30:34 compute-0 podman[249607]: 2025-12-05 12:30:34.206006547 +0000 UTC m=+0.052958344 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Dec 05 12:30:34 compute-0 podman[249606]: 2025-12-05 12:30:34.207698706 +0000 UTC m=+0.058235925 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, architecture=x86_64)
Dec 05 12:30:35 compute-0 nova_compute[187208]: 2025-12-05 12:30:35.174 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:30:40 compute-0 nova_compute[187208]: 2025-12-05 12:30:40.175 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:40 compute-0 nova_compute[187208]: 2025-12-05 12:30:40.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:40 compute-0 nova_compute[187208]: 2025-12-05 12:30:40.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:30:40 compute-0 nova_compute[187208]: 2025-12-05 12:30:40.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:40 compute-0 nova_compute[187208]: 2025-12-05 12:30:40.225 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:40 compute-0 nova_compute[187208]: 2025-12-05 12:30:40.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:44 compute-0 podman[249641]: 2025-12-05 12:30:44.196814686 +0000 UTC m=+0.052001017 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 12:30:44 compute-0 podman[249642]: 2025-12-05 12:30:44.197000451 +0000 UTC m=+0.048468616 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:30:44 compute-0 podman[249643]: 2025-12-05 12:30:44.232192676 +0000 UTC m=+0.077329840 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 12:30:45 compute-0 nova_compute[187208]: 2025-12-05 12:30:45.224 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:49 compute-0 sshd-session[249710]: Connection reset by authenticating user root 45.135.232.92 port 49520 [preauth]
Dec 05 12:30:50 compute-0 nova_compute[187208]: 2025-12-05 12:30:50.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:51 compute-0 sshd-session[249712]: Connection reset by authenticating user root 45.135.232.92 port 49524 [preauth]
Dec 05 12:30:53 compute-0 sshd-session[249714]: Invalid user support from 45.135.232.92 port 49550
Dec 05 12:30:53 compute-0 podman[249717]: 2025-12-05 12:30:53.906184433 +0000 UTC m=+0.078096823 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:30:54 compute-0 sshd-session[249714]: Connection reset by invalid user support 45.135.232.92 port 49550 [preauth]
Dec 05 12:30:55 compute-0 nova_compute[187208]: 2025-12-05 12:30:55.231 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:55 compute-0 nova_compute[187208]: 2025-12-05 12:30:55.272 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:55 compute-0 nova_compute[187208]: 2025-12-05 12:30:55.273 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:30:55 compute-0 nova_compute[187208]: 2025-12-05 12:30:55.273 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:55 compute-0 nova_compute[187208]: 2025-12-05 12:30:55.274 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:55 compute-0 nova_compute[187208]: 2025-12-05 12:30:55.274 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:30:55 compute-0 nova_compute[187208]: 2025-12-05 12:30:55.275 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:30:55 compute-0 nova_compute[187208]: 2025-12-05 12:30:55.276 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:30:56 compute-0 sshd-session[249742]: Invalid user admin from 45.135.232.92 port 41854
Dec 05 12:30:57 compute-0 sshd-session[249742]: Connection reset by invalid user admin 45.135.232.92 port 41854 [preauth]
Dec 05 12:30:59 compute-0 sshd-session[249744]: Connection reset by authenticating user root 45.135.232.92 port 41866 [preauth]
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.277 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.280 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.280 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.280 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.310 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.311 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.632 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.633 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.633 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:31:00 compute-0 nova_compute[187208]: 2025-12-05 12:31:00.649 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:31:01 compute-0 nova_compute[187208]: 2025-12-05 12:31:01.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:31:01 compute-0 nova_compute[187208]: 2025-12-05 12:31:01.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:31:01 compute-0 podman[249746]: 2025-12-05 12:31:01.209975357 +0000 UTC m=+0.066160462 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:31:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:31:03.038 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:31:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:31:03.039 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:31:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:31:03.039 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:31:04 compute-0 nova_compute[187208]: 2025-12-05 12:31:04.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:31:05 compute-0 nova_compute[187208]: 2025-12-05 12:31:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:31:05 compute-0 podman[249767]: 2025-12-05 12:31:05.196914261 +0000 UTC m=+0.043751271 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 12:31:05 compute-0 podman[249766]: 2025-12-05 12:31:05.200813653 +0000 UTC m=+0.048526468 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm)
Dec 05 12:31:05 compute-0 nova_compute[187208]: 2025-12-05 12:31:05.312 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:31:05 compute-0 nova_compute[187208]: 2025-12-05 12:31:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:05 compute-0 nova_compute[187208]: 2025-12-05 12:31:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:31:05 compute-0 nova_compute[187208]: 2025-12-05 12:31:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:31:05 compute-0 nova_compute[187208]: 2025-12-05 12:31:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:31:05 compute-0 nova_compute[187208]: 2025-12-05 12:31:05.314 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:06 compute-0 nova_compute[187208]: 2025-12-05 12:31:06.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:31:07 compute-0 nova_compute[187208]: 2025-12-05 12:31:07.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:31:09 compute-0 nova_compute[187208]: 2025-12-05 12:31:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:31:09 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 12:31:09 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 12:31:10 compute-0 nova_compute[187208]: 2025-12-05 12:31:10.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:31:10 compute-0 nova_compute[187208]: 2025-12-05 12:31:10.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.167 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.167 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.168 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.168 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.325 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.326 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5671MB free_disk=73.04075241088867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.326 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.327 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.463 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.463 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.529 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.553 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.555 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:31:14 compute-0 nova_compute[187208]: 2025-12-05 12:31:14.555 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:31:15 compute-0 podman[249805]: 2025-12-05 12:31:15.206176156 +0000 UTC m=+0.051785130 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:31:15 compute-0 podman[249804]: 2025-12-05 12:31:15.208184444 +0000 UTC m=+0.057515835 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 05 12:31:15 compute-0 podman[249806]: 2025-12-05 12:31:15.253790897 +0000 UTC m=+0.098935438 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 12:31:15 compute-0 nova_compute[187208]: 2025-12-05 12:31:15.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:31:15 compute-0 nova_compute[187208]: 2025-12-05 12:31:15.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:15 compute-0 nova_compute[187208]: 2025-12-05 12:31:15.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:31:15 compute-0 nova_compute[187208]: 2025-12-05 12:31:15.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:31:15 compute-0 nova_compute[187208]: 2025-12-05 12:31:15.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:31:15 compute-0 nova_compute[187208]: 2025-12-05 12:31:15.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:20 compute-0 nova_compute[187208]: 2025-12-05 12:31:20.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:24 compute-0 podman[249872]: 2025-12-05 12:31:24.19583079 +0000 UTC m=+0.046988884 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:31:25 compute-0 nova_compute[187208]: 2025-12-05 12:31:25.321 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:31:30 compute-0 nova_compute[187208]: 2025-12-05 12:31:30.325 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:31:30 compute-0 nova_compute[187208]: 2025-12-05 12:31:30.327 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:31:30 compute-0 nova_compute[187208]: 2025-12-05 12:31:30.327 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:31:30 compute-0 nova_compute[187208]: 2025-12-05 12:31:30.327 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:31:30 compute-0 nova_compute[187208]: 2025-12-05 12:31:30.355 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:30 compute-0 nova_compute[187208]: 2025-12-05 12:31:30.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:31:32 compute-0 podman[249896]: 2025-12-05 12:31:32.198792321 +0000 UTC m=+0.053421137 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 05 12:31:35 compute-0 nova_compute[187208]: 2025-12-05 12:31:35.357 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:36 compute-0 podman[249915]: 2025-12-05 12:31:36.220917902 +0000 UTC m=+0.057838024 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:31:36 compute-0 podman[249914]: 2025-12-05 12:31:36.220904961 +0000 UTC m=+0.063636509 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter)
Dec 05 12:31:40 compute-0 nova_compute[187208]: 2025-12-05 12:31:40.358 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:45 compute-0 nova_compute[187208]: 2025-12-05 12:31:45.360 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:46 compute-0 podman[249955]: 2025-12-05 12:31:46.211600676 +0000 UTC m=+0.052798890 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:31:46 compute-0 podman[249954]: 2025-12-05 12:31:46.218937135 +0000 UTC m=+0.065338477 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 05 12:31:46 compute-0 podman[249956]: 2025-12-05 12:31:46.252076702 +0000 UTC m=+0.086991426 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:31:50 compute-0 nova_compute[187208]: 2025-12-05 12:31:50.361 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:55 compute-0 podman[250020]: 2025-12-05 12:31:55.202654548 +0000 UTC m=+0.058075800 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:31:55 compute-0 nova_compute[187208]: 2025-12-05 12:31:55.363 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:31:55 compute-0 nova_compute[187208]: 2025-12-05 12:31:55.365 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:31:55 compute-0 nova_compute[187208]: 2025-12-05 12:31:55.365 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:31:55 compute-0 nova_compute[187208]: 2025-12-05 12:31:55.365 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:31:55 compute-0 nova_compute[187208]: 2025-12-05 12:31:55.400 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:31:55 compute-0 nova_compute[187208]: 2025-12-05 12:31:55.401 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:32:00 compute-0 nova_compute[187208]: 2025-12-05 12:32:00.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:32:00 compute-0 nova_compute[187208]: 2025-12-05 12:32:00.556 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:00 compute-0 nova_compute[187208]: 2025-12-05 12:32:00.557 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:32:00 compute-0 nova_compute[187208]: 2025-12-05 12:32:00.557 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:32:00 compute-0 nova_compute[187208]: 2025-12-05 12:32:00.573 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:32:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:32:03.040 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:32:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:32:03.040 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:32:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:32:03.040 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:32:03 compute-0 nova_compute[187208]: 2025-12-05 12:32:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:03 compute-0 nova_compute[187208]: 2025-12-05 12:32:03.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:32:03 compute-0 podman[250044]: 2025-12-05 12:32:03.199441801 +0000 UTC m=+0.051212395 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:32:05 compute-0 nova_compute[187208]: 2025-12-05 12:32:05.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:05 compute-0 nova_compute[187208]: 2025-12-05 12:32:05.404 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:32:05 compute-0 nova_compute[187208]: 2025-12-05 12:32:05.405 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:32:06 compute-0 nova_compute[187208]: 2025-12-05 12:32:06.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:07 compute-0 nova_compute[187208]: 2025-12-05 12:32:07.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:07 compute-0 podman[250062]: 2025-12-05 12:32:07.198915845 +0000 UTC m=+0.056610629 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Dec 05 12:32:07 compute-0 podman[250063]: 2025-12-05 12:32:07.211106953 +0000 UTC m=+0.059741648 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 12:32:09 compute-0 nova_compute[187208]: 2025-12-05 12:32:09.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:09 compute-0 nova_compute[187208]: 2025-12-05 12:32:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:10 compute-0 nova_compute[187208]: 2025-12-05 12:32:10.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:32:12 compute-0 nova_compute[187208]: 2025-12-05 12:32:12.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:12 compute-0 nova_compute[187208]: 2025-12-05 12:32:12.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:15 compute-0 nova_compute[187208]: 2025-12-05 12:32:15.409 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.226 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.226 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.226 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.227 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.386 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.387 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5671MB free_disk=73.04075241088867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.388 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.388 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.448 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.448 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.471 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.483 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.485 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:32:16 compute-0 nova_compute[187208]: 2025-12-05 12:32:16.485 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:32:17 compute-0 podman[250102]: 2025-12-05 12:32:17.199534064 +0000 UTC m=+0.050675468 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:32:17 compute-0 podman[250101]: 2025-12-05 12:32:17.204327881 +0000 UTC m=+0.059915543 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Dec 05 12:32:17 compute-0 podman[250103]: 2025-12-05 12:32:17.242166843 +0000 UTC m=+0.088678805 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:32:20 compute-0 nova_compute[187208]: 2025-12-05 12:32:20.410 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:32:20 compute-0 nova_compute[187208]: 2025-12-05 12:32:20.412 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:32:20 compute-0 nova_compute[187208]: 2025-12-05 12:32:20.412 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:32:20 compute-0 nova_compute[187208]: 2025-12-05 12:32:20.412 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:32:20 compute-0 nova_compute[187208]: 2025-12-05 12:32:20.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:32:20 compute-0 nova_compute[187208]: 2025-12-05 12:32:20.461 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:32:25 compute-0 nova_compute[187208]: 2025-12-05 12:32:25.462 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:32:26 compute-0 podman[250171]: 2025-12-05 12:32:26.196097615 +0000 UTC m=+0.050664758 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:32:30 compute-0 nova_compute[187208]: 2025-12-05 12:32:30.464 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:32:34 compute-0 podman[250195]: 2025-12-05 12:32:34.205388977 +0000 UTC m=+0.058245605 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:32:35 compute-0 nova_compute[187208]: 2025-12-05 12:32:35.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:32:38 compute-0 podman[250218]: 2025-12-05 12:32:38.203822531 +0000 UTC m=+0.050243586 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:32:38 compute-0 podman[250217]: 2025-12-05 12:32:38.211802379 +0000 UTC m=+0.062106145 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible)
Dec 05 12:32:40 compute-0 nova_compute[187208]: 2025-12-05 12:32:40.468 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:32:40 compute-0 nova_compute[187208]: 2025-12-05 12:32:40.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:32:40 compute-0 nova_compute[187208]: 2025-12-05 12:32:40.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:32:40 compute-0 nova_compute[187208]: 2025-12-05 12:32:40.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:32:40 compute-0 nova_compute[187208]: 2025-12-05 12:32:40.471 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:32:40 compute-0 nova_compute[187208]: 2025-12-05 12:32:40.472 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:32:45 compute-0 nova_compute[187208]: 2025-12-05 12:32:45.471 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:32:48 compute-0 podman[250255]: 2025-12-05 12:32:48.208167055 +0000 UTC m=+0.058726159 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:32:48 compute-0 podman[250256]: 2025-12-05 12:32:48.231107291 +0000 UTC m=+0.078932817 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:32:48 compute-0 podman[250257]: 2025-12-05 12:32:48.239769338 +0000 UTC m=+0.083796025 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 12:32:50 compute-0 nova_compute[187208]: 2025-12-05 12:32:50.473 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:32:55 compute-0 nova_compute[187208]: 2025-12-05 12:32:55.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:32:57 compute-0 podman[250323]: 2025-12-05 12:32:57.197569431 +0000 UTC m=+0.052729607 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:33:00 compute-0 nova_compute[187208]: 2025-12-05 12:33:00.476 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:00 compute-0 nova_compute[187208]: 2025-12-05 12:33:00.478 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:00 compute-0 nova_compute[187208]: 2025-12-05 12:33:00.478 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:33:00 compute-0 nova_compute[187208]: 2025-12-05 12:33:00.479 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:00 compute-0 nova_compute[187208]: 2025-12-05 12:33:00.523 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:00 compute-0 nova_compute[187208]: 2025-12-05 12:33:00.524 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:02 compute-0 nova_compute[187208]: 2025-12-05 12:33:02.486 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:02 compute-0 nova_compute[187208]: 2025-12-05 12:33:02.487 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:33:02 compute-0 nova_compute[187208]: 2025-12-05 12:33:02.487 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:33:02 compute-0 nova_compute[187208]: 2025-12-05 12:33:02.527 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:33:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:33:03.041 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:33:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:33:03.041 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:33:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:33:03.042 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:33:04 compute-0 nova_compute[187208]: 2025-12-05 12:33:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:04 compute-0 nova_compute[187208]: 2025-12-05 12:33:04.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:33:05 compute-0 podman[250349]: 2025-12-05 12:33:05.195133979 +0000 UTC m=+0.050441932 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 12:33:05 compute-0 nova_compute[187208]: 2025-12-05 12:33:05.524 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:07 compute-0 nova_compute[187208]: 2025-12-05 12:33:07.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:08 compute-0 nova_compute[187208]: 2025-12-05 12:33:08.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:08 compute-0 nova_compute[187208]: 2025-12-05 12:33:08.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:09 compute-0 nova_compute[187208]: 2025-12-05 12:33:09.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:09 compute-0 podman[250370]: 2025-12-05 12:33:09.205670039 +0000 UTC m=+0.050211446 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 12:33:09 compute-0 podman[250369]: 2025-12-05 12:33:09.209746235 +0000 UTC m=+0.060073887 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, vcs-type=git)
Dec 05 12:33:10 compute-0 nova_compute[187208]: 2025-12-05 12:33:10.526 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:11 compute-0 nova_compute[187208]: 2025-12-05 12:33:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:11 compute-0 nova_compute[187208]: 2025-12-05 12:33:11.832 187212 DEBUG oslo_concurrency.processutils [None req-dbb20e6f-cbb5-4adf-8656-90ad61357de6 6b85417e2d5f492ab96282fdfe0b4f64 3df4e4eed3454c178c5281d12024579e - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 12:33:11 compute-0 nova_compute[187208]: 2025-12-05 12:33:11.868 187212 DEBUG oslo_concurrency.processutils [None req-dbb20e6f-cbb5-4adf-8656-90ad61357de6 6b85417e2d5f492ab96282fdfe0b4f64 3df4e4eed3454c178c5281d12024579e - - default default] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 12:33:12 compute-0 nova_compute[187208]: 2025-12-05 12:33:12.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:15 compute-0 nova_compute[187208]: 2025-12-05 12:33:15.528 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:15 compute-0 nova_compute[187208]: 2025-12-05 12:33:15.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:15 compute-0 nova_compute[187208]: 2025-12-05 12:33:15.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:33:15 compute-0 nova_compute[187208]: 2025-12-05 12:33:15.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:15 compute-0 nova_compute[187208]: 2025-12-05 12:33:15.569 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:15 compute-0 nova_compute[187208]: 2025-12-05 12:33:15.570 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.099 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.099 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.100 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.288 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.290 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5668MB free_disk=73.04075241088867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.291 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.291 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.378 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.379 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.416 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.439 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.440 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:33:17 compute-0 nova_compute[187208]: 2025-12-05 12:33:17.441 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:33:18 compute-0 nova_compute[187208]: 2025-12-05 12:33:18.184 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:33:18.184 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 12:33:18 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:33:18.185 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 12:33:19 compute-0 podman[250411]: 2025-12-05 12:33:19.207051568 +0000 UTC m=+0.056245668 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:33:19 compute-0 podman[250410]: 2025-12-05 12:33:19.207034737 +0000 UTC m=+0.062882017 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Dec 05 12:33:19 compute-0 podman[250412]: 2025-12-05 12:33:19.240141653 +0000 UTC m=+0.082910170 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 12:33:20 compute-0 nova_compute[187208]: 2025-12-05 12:33:20.571 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:20 compute-0 nova_compute[187208]: 2025-12-05 12:33:20.572 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:25 compute-0 nova_compute[187208]: 2025-12-05 12:33:25.574 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:25 compute-0 nova_compute[187208]: 2025-12-05 12:33:25.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:25 compute-0 nova_compute[187208]: 2025-12-05 12:33:25.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:33:25 compute-0 nova_compute[187208]: 2025-12-05 12:33:25.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:25 compute-0 nova_compute[187208]: 2025-12-05 12:33:25.611 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:25 compute-0 nova_compute[187208]: 2025-12-05 12:33:25.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:28 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:33:28.187 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 12:33:28 compute-0 podman[250478]: 2025-12-05 12:33:28.19918396 +0000 UTC m=+0.056402532 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.060 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.061 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.062 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.062 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.062 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.063 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.079 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.088 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.088 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.088 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.088 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.089 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.089 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.089 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.089 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Removable base files: /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15 /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.090 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.090 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.090 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.092 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.092 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.615 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.615 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.615 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.648 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:30 compute-0 nova_compute[187208]: 2025-12-05 12:33:30.649 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:35 compute-0 nova_compute[187208]: 2025-12-05 12:33:35.651 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:35 compute-0 nova_compute[187208]: 2025-12-05 12:33:35.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:35 compute-0 nova_compute[187208]: 2025-12-05 12:33:35.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5047 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:33:35 compute-0 nova_compute[187208]: 2025-12-05 12:33:35.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:35 compute-0 nova_compute[187208]: 2025-12-05 12:33:35.698 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:35 compute-0 nova_compute[187208]: 2025-12-05 12:33:35.700 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:36 compute-0 podman[250502]: 2025-12-05 12:33:36.2280317 +0000 UTC m=+0.079145712 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:33:40 compute-0 podman[250523]: 2025-12-05 12:33:40.211890618 +0000 UTC m=+0.056320641 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 12:33:40 compute-0 podman[250522]: 2025-12-05 12:33:40.240113384 +0000 UTC m=+0.092114673 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Dec 05 12:33:40 compute-0 nova_compute[187208]: 2025-12-05 12:33:40.702 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:40 compute-0 nova_compute[187208]: 2025-12-05 12:33:40.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:40 compute-0 nova_compute[187208]: 2025-12-05 12:33:40.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:33:40 compute-0 nova_compute[187208]: 2025-12-05 12:33:40.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:40 compute-0 nova_compute[187208]: 2025-12-05 12:33:40.739 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:40 compute-0 nova_compute[187208]: 2025-12-05 12:33:40.740 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:33:45 compute-0 nova_compute[187208]: 2025-12-05 12:33:45.740 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:50 compute-0 podman[250559]: 2025-12-05 12:33:50.205954498 +0000 UTC m=+0.062240219 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Dec 05 12:33:50 compute-0 podman[250560]: 2025-12-05 12:33:50.232127836 +0000 UTC m=+0.083905349 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:33:50 compute-0 podman[250561]: 2025-12-05 12:33:50.241278397 +0000 UTC m=+0.087560463 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 12:33:50 compute-0 nova_compute[187208]: 2025-12-05 12:33:50.743 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:33:55 compute-0 nova_compute[187208]: 2025-12-05 12:33:55.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:33:59 compute-0 podman[250624]: 2025-12-05 12:33:59.193090699 +0000 UTC m=+0.049847395 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 12:34:00 compute-0 nova_compute[187208]: 2025-12-05 12:34:00.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:34:00 compute-0 nova_compute[187208]: 2025-12-05 12:34:00.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:00 compute-0 nova_compute[187208]: 2025-12-05 12:34:00.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:34:00 compute-0 nova_compute[187208]: 2025-12-05 12:34:00.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:34:00 compute-0 nova_compute[187208]: 2025-12-05 12:34:00.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:34:00 compute-0 nova_compute[187208]: 2025-12-05 12:34:00.748 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:02 compute-0 nova_compute[187208]: 2025-12-05 12:34:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:02 compute-0 nova_compute[187208]: 2025-12-05 12:34:02.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 12:34:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:34:03.042 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:34:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:34:03.042 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:34:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:34:03.043 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:34:04 compute-0 nova_compute[187208]: 2025-12-05 12:34:04.178 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:04 compute-0 nova_compute[187208]: 2025-12-05 12:34:04.179 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:34:04 compute-0 nova_compute[187208]: 2025-12-05 12:34:04.179 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:34:04 compute-0 nova_compute[187208]: 2025-12-05 12:34:04.205 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:34:05 compute-0 nova_compute[187208]: 2025-12-05 12:34:05.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:05 compute-0 nova_compute[187208]: 2025-12-05 12:34:05.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:34:05 compute-0 nova_compute[187208]: 2025-12-05 12:34:05.748 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:07 compute-0 nova_compute[187208]: 2025-12-05 12:34:07.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:07 compute-0 podman[250649]: 2025-12-05 12:34:07.193924179 +0000 UTC m=+0.051473172 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:34:08 compute-0 nova_compute[187208]: 2025-12-05 12:34:08.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:08 compute-0 nova_compute[187208]: 2025-12-05 12:34:08.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 12:34:08 compute-0 nova_compute[187208]: 2025-12-05 12:34:08.076 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 12:34:09 compute-0 nova_compute[187208]: 2025-12-05 12:34:09.075 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:10 compute-0 nova_compute[187208]: 2025-12-05 12:34:10.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:10 compute-0 nova_compute[187208]: 2025-12-05 12:34:10.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:10 compute-0 nova_compute[187208]: 2025-12-05 12:34:10.748 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:10 compute-0 nova_compute[187208]: 2025-12-05 12:34:10.750 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:11 compute-0 nova_compute[187208]: 2025-12-05 12:34:11.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:11 compute-0 podman[250670]: 2025-12-05 12:34:11.235262078 +0000 UTC m=+0.070402883 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 05 12:34:11 compute-0 podman[250669]: 2025-12-05 12:34:11.235104774 +0000 UTC m=+0.073645286 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Dec 05 12:34:14 compute-0 nova_compute[187208]: 2025-12-05 12:34:14.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:15 compute-0 nova_compute[187208]: 2025-12-05 12:34:15.750 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:16 compute-0 nova_compute[187208]: 2025-12-05 12:34:16.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.094 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.094 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.232 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.233 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.0407485961914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.233 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.233 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.292 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.292 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.371 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.386 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.388 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.388 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:34:19 compute-0 nova_compute[187208]: 2025-12-05 12:34:19.389 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:20 compute-0 nova_compute[187208]: 2025-12-05 12:34:20.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:21 compute-0 podman[250708]: 2025-12-05 12:34:21.201780292 +0000 UTC m=+0.049457134 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:34:21 compute-0 podman[250709]: 2025-12-05 12:34:21.228244068 +0000 UTC m=+0.075031125 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 05 12:34:21 compute-0 podman[250707]: 2025-12-05 12:34:21.228043882 +0000 UTC m=+0.082765626 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 12:34:25 compute-0 nova_compute[187208]: 2025-12-05 12:34:25.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:34:30 compute-0 podman[250774]: 2025-12-05 12:34:30.190494317 +0000 UTC m=+0.048326181 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:34:30 compute-0 nova_compute[187208]: 2025-12-05 12:34:30.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:34:30 compute-0 nova_compute[187208]: 2025-12-05 12:34:30.757 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:30 compute-0 nova_compute[187208]: 2025-12-05 12:34:30.757 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:34:30 compute-0 nova_compute[187208]: 2025-12-05 12:34:30.757 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:34:30 compute-0 nova_compute[187208]: 2025-12-05 12:34:30.758 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:34:30 compute-0 nova_compute[187208]: 2025-12-05 12:34:30.758 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:34:35 compute-0 nova_compute[187208]: 2025-12-05 12:34:35.760 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:34:35 compute-0 nova_compute[187208]: 2025-12-05 12:34:35.761 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:34:35 compute-0 nova_compute[187208]: 2025-12-05 12:34:35.762 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:34:35 compute-0 nova_compute[187208]: 2025-12-05 12:34:35.762 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:34:35 compute-0 nova_compute[187208]: 2025-12-05 12:34:35.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:35 compute-0 nova_compute[187208]: 2025-12-05 12:34:35.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:34:38 compute-0 podman[250798]: 2025-12-05 12:34:38.201153939 +0000 UTC m=+0.052558633 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:34:40 compute-0 nova_compute[187208]: 2025-12-05 12:34:40.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:34:40 compute-0 nova_compute[187208]: 2025-12-05 12:34:40.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:40 compute-0 nova_compute[187208]: 2025-12-05 12:34:40.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:34:40 compute-0 nova_compute[187208]: 2025-12-05 12:34:40.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:34:40 compute-0 nova_compute[187208]: 2025-12-05 12:34:40.800 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:34:40 compute-0 nova_compute[187208]: 2025-12-05 12:34:40.802 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:34:42 compute-0 podman[250818]: 2025-12-05 12:34:42.264411184 +0000 UTC m=+0.087708297 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git)
Dec 05 12:34:42 compute-0 podman[250819]: 2025-12-05 12:34:42.279191257 +0000 UTC m=+0.099475554 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 05 12:34:45 compute-0 nova_compute[187208]: 2025-12-05 12:34:45.801 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:47 compute-0 nova_compute[187208]: 2025-12-05 12:34:47.422 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:34:50 compute-0 nova_compute[187208]: 2025-12-05 12:34:50.802 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:34:52 compute-0 podman[250860]: 2025-12-05 12:34:52.206552292 +0000 UTC m=+0.053043017 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:34:52 compute-0 podman[250859]: 2025-12-05 12:34:52.216079674 +0000 UTC m=+0.066968675 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0)
Dec 05 12:34:52 compute-0 podman[250861]: 2025-12-05 12:34:52.235579941 +0000 UTC m=+0.079543954 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 05 12:34:55 compute-0 nova_compute[187208]: 2025-12-05 12:34:55.804 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:00 compute-0 nova_compute[187208]: 2025-12-05 12:35:00.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:01 compute-0 podman[250925]: 2025-12-05 12:35:01.193787837 +0000 UTC m=+0.048337972 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 12:35:01 compute-0 sshd-session[250923]: Invalid user support from 2.57.121.112 port 63894
Dec 05 12:35:02 compute-0 sshd-session[250923]: Received disconnect from 2.57.121.112 port 63894:11: Bye [preauth]
Dec 05 12:35:02 compute-0 sshd-session[250923]: Disconnected from invalid user support 2.57.121.112 port 63894 [preauth]
Dec 05 12:35:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:35:03.043 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:35:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:35:03.044 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:35:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:35:03.044 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:35:05 compute-0 nova_compute[187208]: 2025-12-05 12:35:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:35:05 compute-0 nova_compute[187208]: 2025-12-05 12:35:05.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:35:05 compute-0 nova_compute[187208]: 2025-12-05 12:35:05.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:35:05 compute-0 nova_compute[187208]: 2025-12-05 12:35:05.074 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:35:05 compute-0 nova_compute[187208]: 2025-12-05 12:35:05.075 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:35:05 compute-0 nova_compute[187208]: 2025-12-05 12:35:05.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:35:05 compute-0 nova_compute[187208]: 2025-12-05 12:35:05.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:09 compute-0 nova_compute[187208]: 2025-12-05 12:35:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:35:09 compute-0 nova_compute[187208]: 2025-12-05 12:35:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:35:09 compute-0 podman[250949]: 2025-12-05 12:35:09.224942383 +0000 UTC m=+0.078996558 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 05 12:35:10 compute-0 nova_compute[187208]: 2025-12-05 12:35:10.810 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:11 compute-0 nova_compute[187208]: 2025-12-05 12:35:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:35:12 compute-0 nova_compute[187208]: 2025-12-05 12:35:12.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:35:12 compute-0 nova_compute[187208]: 2025-12-05 12:35:12.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:35:13 compute-0 podman[250970]: 2025-12-05 12:35:13.246392124 +0000 UTC m=+0.092004030 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, 
Inc., release=1755695350, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 12:35:13 compute-0 podman[250971]: 2025-12-05 12:35:13.247606169 +0000 UTC m=+0.088262393 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 05 12:35:15 compute-0 nova_compute[187208]: 2025-12-05 12:35:15.811 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:16 compute-0 nova_compute[187208]: 2025-12-05 12:35:16.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.098 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.274 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.276 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5664MB free_disk=73.0407485961914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.276 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.276 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.427 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.427 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.442 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.550 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.550 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.565 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.583 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.605 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.621 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.622 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:35:19 compute-0 nova_compute[187208]: 2025-12-05 12:35:19.622 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:35:20 compute-0 nova_compute[187208]: 2025-12-05 12:35:20.813 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:23 compute-0 podman[251012]: 2025-12-05 12:35:23.203035676 +0000 UTC m=+0.056046442 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 12:35:23 compute-0 podman[251013]: 2025-12-05 12:35:23.223895782 +0000 UTC m=+0.072703618 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 12:35:23 compute-0 podman[251014]: 2025-12-05 12:35:23.256969257 +0000 UTC m=+0.101828650 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller)
Dec 05 12:35:25 compute-0 nova_compute[187208]: 2025-12-05 12:35:25.814 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:30 compute-0 nova_compute[187208]: 2025-12-05 12:35:30.816 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:32 compute-0 podman[251077]: 2025-12-05 12:35:32.199152015 +0000 UTC m=+0.051400440 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 12:35:35 compute-0 nova_compute[187208]: 2025-12-05 12:35:35.818 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:35 compute-0 nova_compute[187208]: 2025-12-05 12:35:35.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:40 compute-0 podman[251101]: 2025-12-05 12:35:40.202485426 +0000 UTC m=+0.053301184 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:35:40 compute-0 nova_compute[187208]: 2025-12-05 12:35:40.820 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:35:40 compute-0 nova_compute[187208]: 2025-12-05 12:35:40.821 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:40 compute-0 nova_compute[187208]: 2025-12-05 12:35:40.821 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:35:40 compute-0 nova_compute[187208]: 2025-12-05 12:35:40.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:35:40 compute-0 nova_compute[187208]: 2025-12-05 12:35:40.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:35:40 compute-0 nova_compute[187208]: 2025-12-05 12:35:40.824 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:35:44 compute-0 podman[251121]: 2025-12-05 12:35:44.204286756 +0000 UTC m=+0.056494095 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc.)
Dec 05 12:35:44 compute-0 podman[251122]: 2025-12-05 12:35:44.210169994 +0000 UTC m=+0.052079639 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 05 12:35:45 compute-0 nova_compute[187208]: 2025-12-05 12:35:45.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:35:50 compute-0 nova_compute[187208]: 2025-12-05 12:35:50.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:54 compute-0 podman[251160]: 2025-12-05 12:35:54.207750865 +0000 UTC m=+0.058063760 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 12:35:54 compute-0 podman[251161]: 2025-12-05 12:35:54.227359606 +0000 UTC m=+0.075099787 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:35:54 compute-0 podman[251162]: 2025-12-05 12:35:54.23064353 +0000 UTC m=+0.072579035 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 05 12:35:55 compute-0 nova_compute[187208]: 2025-12-05 12:35:55.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:35:55 compute-0 nova_compute[187208]: 2025-12-05 12:35:55.828 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:00 compute-0 nova_compute[187208]: 2025-12-05 12:36:00.828 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:36:03.046 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:36:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:36:03.046 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:36:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:36:03.046 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:36:03 compute-0 podman[251226]: 2025-12-05 12:36:03.235057985 +0000 UTC m=+0.085370540 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.623 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.624 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.624 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.637 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.638 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.638 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.831 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.879 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:05 compute-0 nova_compute[187208]: 2025-12-05 12:36:05.880 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:09 compute-0 nova_compute[187208]: 2025-12-05 12:36:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:10 compute-0 nova_compute[187208]: 2025-12-05 12:36:10.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:10 compute-0 nova_compute[187208]: 2025-12-05 12:36:10.882 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:11 compute-0 nova_compute[187208]: 2025-12-05 12:36:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:11 compute-0 podman[251251]: 2025-12-05 12:36:11.20939284 +0000 UTC m=+0.059037618 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 12:36:12 compute-0 nova_compute[187208]: 2025-12-05 12:36:12.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:12 compute-0 nova_compute[187208]: 2025-12-05 12:36:12.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:15 compute-0 podman[251271]: 2025-12-05 12:36:15.215452333 +0000 UTC m=+0.065622936 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 12:36:15 compute-0 podman[251272]: 2025-12-05 12:36:15.237177054 +0000 UTC m=+0.083021013 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 12:36:15 compute-0 nova_compute[187208]: 2025-12-05 12:36:15.884 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:15 compute-0 nova_compute[187208]: 2025-12-05 12:36:15.886 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:16 compute-0 nova_compute[187208]: 2025-12-05 12:36:16.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.097 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.244 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.245 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5665MB free_disk=73.04133605957031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.246 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.246 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.349 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.350 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.372 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.384 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.386 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:36:19 compute-0 nova_compute[187208]: 2025-12-05 12:36:19.386 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:36:20 compute-0 nova_compute[187208]: 2025-12-05 12:36:20.380 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:36:20 compute-0 nova_compute[187208]: 2025-12-05 12:36:20.887 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:20 compute-0 nova_compute[187208]: 2025-12-05 12:36:20.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:20 compute-0 nova_compute[187208]: 2025-12-05 12:36:20.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:36:20 compute-0 nova_compute[187208]: 2025-12-05 12:36:20.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:20 compute-0 nova_compute[187208]: 2025-12-05 12:36:20.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:20 compute-0 nova_compute[187208]: 2025-12-05 12:36:20.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:25 compute-0 podman[251311]: 2025-12-05 12:36:25.212048066 +0000 UTC m=+0.062951520 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 12:36:25 compute-0 podman[251312]: 2025-12-05 12:36:25.235136346 +0000 UTC m=+0.083410605 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 12:36:25 compute-0 podman[251313]: 2025-12-05 12:36:25.247841179 +0000 UTC m=+0.091087824 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller)
Dec 05 12:36:25 compute-0 nova_compute[187208]: 2025-12-05 12:36:25.892 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:30 compute-0 nova_compute[187208]: 2025-12-05 12:36:30.894 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:34 compute-0 podman[251380]: 2025-12-05 12:36:34.203970563 +0000 UTC m=+0.057233756 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 12:36:35 compute-0 nova_compute[187208]: 2025-12-05 12:36:35.898 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:35 compute-0 nova_compute[187208]: 2025-12-05 12:36:35.900 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:35 compute-0 nova_compute[187208]: 2025-12-05 12:36:35.901 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:36:35 compute-0 nova_compute[187208]: 2025-12-05 12:36:35.901 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:35 compute-0 nova_compute[187208]: 2025-12-05 12:36:35.904 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:35 compute-0 nova_compute[187208]: 2025-12-05 12:36:35.904 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:40 compute-0 nova_compute[187208]: 2025-12-05 12:36:40.906 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:40 compute-0 nova_compute[187208]: 2025-12-05 12:36:40.906 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:40 compute-0 nova_compute[187208]: 2025-12-05 12:36:40.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:36:40 compute-0 nova_compute[187208]: 2025-12-05 12:36:40.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:40 compute-0 nova_compute[187208]: 2025-12-05 12:36:40.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:40 compute-0 nova_compute[187208]: 2025-12-05 12:36:40.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:42 compute-0 podman[251404]: 2025-12-05 12:36:42.210305542 +0000 UTC m=+0.064164234 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 05 12:36:45 compute-0 nova_compute[187208]: 2025-12-05 12:36:45.911 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:45 compute-0 nova_compute[187208]: 2025-12-05 12:36:45.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:45 compute-0 nova_compute[187208]: 2025-12-05 12:36:45.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:36:45 compute-0 nova_compute[187208]: 2025-12-05 12:36:45.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:45 compute-0 nova_compute[187208]: 2025-12-05 12:36:45.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:46 compute-0 podman[251425]: 2025-12-05 12:36:46.011434919 +0000 UTC m=+0.064121374 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 12:36:46 compute-0 podman[251426]: 2025-12-05 12:36:46.03388024 +0000 UTC m=+0.084664640 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 12:36:50 compute-0 nova_compute[187208]: 2025-12-05 12:36:50.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:36:50 compute-0 nova_compute[187208]: 2025-12-05 12:36:50.916 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:50 compute-0 nova_compute[187208]: 2025-12-05 12:36:50.916 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:36:50 compute-0 nova_compute[187208]: 2025-12-05 12:36:50.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:50 compute-0 nova_compute[187208]: 2025-12-05 12:36:50.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:36:50 compute-0 nova_compute[187208]: 2025-12-05 12:36:50.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:55 compute-0 nova_compute[187208]: 2025-12-05 12:36:55.919 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:36:56 compute-0 podman[251464]: 2025-12-05 12:36:56.204366181 +0000 UTC m=+0.055458966 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 12:36:56 compute-0 podman[251465]: 2025-12-05 12:36:56.22847942 +0000 UTC m=+0.078189895 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 12:36:56 compute-0 podman[251463]: 2025-12-05 12:36:56.238930199 +0000 UTC m=+0.092138384 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 05 12:37:00 compute-0 nova_compute[187208]: 2025-12-05 12:37:00.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:37:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:37:03.048 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:37:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:37:03.048 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:37:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:37:03.049 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:37:05 compute-0 nova_compute[187208]: 2025-12-05 12:37:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:37:05 compute-0 nova_compute[187208]: 2025-12-05 12:37:05.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:37:05 compute-0 nova_compute[187208]: 2025-12-05 12:37:05.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:37:05 compute-0 nova_compute[187208]: 2025-12-05 12:37:05.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:37:05 compute-0 nova_compute[187208]: 2025-12-05 12:37:05.076 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:37:05 compute-0 nova_compute[187208]: 2025-12-05 12:37:05.076 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:37:05 compute-0 podman[251529]: 2025-12-05 12:37:05.206325216 +0000 UTC m=+0.056815374 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:37:05 compute-0 nova_compute[187208]: 2025-12-05 12:37:05.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:37:09 compute-0 nova_compute[187208]: 2025-12-05 12:37:09.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:37:10 compute-0 nova_compute[187208]: 2025-12-05 12:37:10.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:37:10 compute-0 nova_compute[187208]: 2025-12-05 12:37:10.924 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:37:12 compute-0 nova_compute[187208]: 2025-12-05 12:37:12.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:37:13 compute-0 nova_compute[187208]: 2025-12-05 12:37:13.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:37:13 compute-0 nova_compute[187208]: 2025-12-05 12:37:13.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:37:13 compute-0 podman[251554]: 2025-12-05 12:37:13.221347501 +0000 UTC m=+0.061698514 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Dec 05 12:37:15 compute-0 nova_compute[187208]: 2025-12-05 12:37:15.926 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:37:16 compute-0 nova_compute[187208]: 2025-12-05 12:37:16.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:37:16 compute-0 podman[251575]: 2025-12-05 12:37:16.211064624 +0000 UTC m=+0.056138375 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 12:37:16 compute-0 podman[251574]: 2025-12-05 12:37:16.237768867 +0000 UTC m=+0.088197751 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Dec 05 12:37:20 compute-0 nova_compute[187208]: 2025-12-05 12:37:20.928 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:37:20 compute-0 nova_compute[187208]: 2025-12-05 12:37:20.931 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:37:20 compute-0 nova_compute[187208]: 2025-12-05 12:37:20.931 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:37:20 compute-0 nova_compute[187208]: 2025-12-05 12:37:20.931 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.137 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.138 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.138 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.138 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.304 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.306 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.0394058227539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.306 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.306 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.615 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.616 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.639 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.758 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.760 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:37:21 compute-0 nova_compute[187208]: 2025-12-05 12:37:21.760 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:37:26 compute-0 nova_compute[187208]: 2025-12-05 12:37:26.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:37:27 compute-0 podman[251617]: 2025-12-05 12:37:27.225089097 +0000 UTC m=+0.062428135 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:37:27 compute-0 podman[251616]: 2025-12-05 12:37:27.225456247 +0000 UTC m=+0.067251163 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 05 12:37:27 compute-0 podman[251618]: 2025-12-05 12:37:27.284983968 +0000 UTC m=+0.117368905 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 05 12:37:31 compute-0 nova_compute[187208]: 2025-12-05 12:37:31.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:37:36 compute-0 nova_compute[187208]: 2025-12-05 12:37:36.121 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:37:36 compute-0 nova_compute[187208]: 2025-12-05 12:37:36.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:37:36 compute-0 nova_compute[187208]: 2025-12-05 12:37:36.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:37:36 compute-0 nova_compute[187208]: 2025-12-05 12:37:36.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:37:36 compute-0 nova_compute[187208]: 2025-12-05 12:37:36.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:37:36 compute-0 nova_compute[187208]: 2025-12-05 12:37:36.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:37:36 compute-0 podman[251684]: 2025-12-05 12:37:36.243437979 +0000 UTC m=+0.051322737 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 12:37:41 compute-0 nova_compute[187208]: 2025-12-05 12:37:41.162 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:37:41 compute-0 nova_compute[187208]: 2025-12-05 12:37:41.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:37:41 compute-0 nova_compute[187208]: 2025-12-05 12:37:41.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:37:41 compute-0 nova_compute[187208]: 2025-12-05 12:37:41.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:37:41 compute-0 nova_compute[187208]: 2025-12-05 12:37:41.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:37:41 compute-0 nova_compute[187208]: 2025-12-05 12:37:41.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:37:44 compute-0 podman[251709]: 2025-12-05 12:37:44.204142384 +0000 UTC m=+0.059443700 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 05 12:37:46 compute-0 nova_compute[187208]: 2025-12-05 12:37:46.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:37:47 compute-0 podman[251730]: 2025-12-05 12:37:47.209938974 +0000 UTC m=+0.052202852 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 12:37:47 compute-0 podman[251729]: 2025-12-05 12:37:47.212766575 +0000 UTC m=+0.058620226 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Dec 05 12:37:51 compute-0 nova_compute[187208]: 2025-12-05 12:37:51.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:37:56 compute-0 nova_compute[187208]: 2025-12-05 12:37:56.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:37:58 compute-0 podman[251771]: 2025-12-05 12:37:58.214815715 +0000 UTC m=+0.063052532 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 05 12:37:58 compute-0 podman[251772]: 2025-12-05 12:37:58.217564634 +0000 UTC m=+0.060164710 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 12:37:58 compute-0 podman[251773]: 2025-12-05 12:37:58.248804386 +0000 UTC m=+0.089170288 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 05 12:38:01 compute-0 nova_compute[187208]: 2025-12-05 12:38:01.170 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:38:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:38:03.049 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:38:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:38:03.050 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:38:03 compute-0 ovn_metadata_agent[104466]: 2025-12-05 12:38:03.050 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:38:06 compute-0 nova_compute[187208]: 2025-12-05 12:38:06.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:38:06 compute-0 nova_compute[187208]: 2025-12-05 12:38:06.761 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:06 compute-0 nova_compute[187208]: 2025-12-05 12:38:06.762 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 12:38:07 compute-0 nova_compute[187208]: 2025-12-05 12:38:07.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:07 compute-0 nova_compute[187208]: 2025-12-05 12:38:07.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 12:38:07 compute-0 nova_compute[187208]: 2025-12-05 12:38:07.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 12:38:07 compute-0 nova_compute[187208]: 2025-12-05 12:38:07.078 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 12:38:07 compute-0 podman[251843]: 2025-12-05 12:38:07.19337465 +0000 UTC m=+0.050549005 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 12:38:11 compute-0 nova_compute[187208]: 2025-12-05 12:38:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:11 compute-0 nova_compute[187208]: 2025-12-05 12:38:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:11 compute-0 nova_compute[187208]: 2025-12-05 12:38:11.175 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:38:12 compute-0 sshd-session[251868]: Accepted publickey for zuul from 192.168.122.10 port 59826 ssh2: ECDSA SHA256:PhH2jQvhQ5fxTjpvZoSW3Qt62TVY0ynk1vRQGqkJC4I
Dec 05 12:38:12 compute-0 systemd-logind[792]: New session 29 of user zuul.
Dec 05 12:38:12 compute-0 systemd[1]: Started Session 29 of User zuul.
Dec 05 12:38:12 compute-0 sshd-session[251868]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 12:38:12 compute-0 sudo[251872]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 05 12:38:12 compute-0 sudo[251872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 12:38:14 compute-0 nova_compute[187208]: 2025-12-05 12:38:14.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:14 compute-0 nova_compute[187208]: 2025-12-05 12:38:14.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:15 compute-0 nova_compute[187208]: 2025-12-05 12:38:15.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:15 compute-0 podman[252012]: 2025-12-05 12:38:15.178812931 +0000 UTC m=+0.069580879 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm)
Dec 05 12:38:16 compute-0 nova_compute[187208]: 2025-12-05 12:38:16.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:38:16 compute-0 nova_compute[187208]: 2025-12-05 12:38:16.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:38:16 compute-0 nova_compute[187208]: 2025-12-05 12:38:16.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:38:16 compute-0 nova_compute[187208]: 2025-12-05 12:38:16.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:38:16 compute-0 nova_compute[187208]: 2025-12-05 12:38:16.230 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:38:16 compute-0 nova_compute[187208]: 2025-12-05 12:38:16.231 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:38:17 compute-0 nova_compute[187208]: 2025-12-05 12:38:17.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:17 compute-0 ovs-vsctl[252064]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 05 12:38:18 compute-0 podman[252112]: 2025-12-05 12:38:18.225494421 +0000 UTC m=+0.067966583 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Dec 05 12:38:18 compute-0 podman[252113]: 2025-12-05 12:38:18.238834592 +0000 UTC m=+0.081382736 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 12:38:18 compute-0 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 05 12:38:18 compute-0 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 05 12:38:18 compute-0 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 05 12:38:19 compute-0 crontab[252517]: (root) LIST (root)
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.231 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.500 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.501 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.501 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.502 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 12:38:21 compute-0 systemd[1]: Starting Hostname Service...
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.656 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.657 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5373MB free_disk=72.95220565795898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.658 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.658 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 12:38:21 compute-0 systemd[1]: Started Hostname Service.
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.753 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.755 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.786 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.806 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.828 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 12:38:21 compute-0 nova_compute[187208]: 2025-12-05 12:38:21.829 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 12:38:25 compute-0 nova_compute[187208]: 2025-12-05 12:38:25.824 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 12:38:26 compute-0 nova_compute[187208]: 2025-12-05 12:38:26.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 12:38:26 compute-0 nova_compute[187208]: 2025-12-05 12:38:26.233 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 12:38:26 compute-0 nova_compute[187208]: 2025-12-05 12:38:26.233 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 12:38:26 compute-0 nova_compute[187208]: 2025-12-05 12:38:26.233 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:38:26 compute-0 nova_compute[187208]: 2025-12-05 12:38:26.233 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 12:38:26 compute-0 nova_compute[187208]: 2025-12-05 12:38:26.234 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
